Real-Time Object Detection and Analytics using RedisEdge and Docker
1. Real-Time Object Detection & Analytics using Docker & the RedisEdge Stack
2. $whoami
@ajeetsraina
www.collabnix.com
- Blogger @ Collabnix
- Docker Captain
- DevRel at Redis Labs
- Docker Community Leader
- ARM Innovator
- Previously at Dell, VMware & CGI
3. Agenda
- The Rise of the AI Database
- A Typical AI Data Pipeline
- Challenges with AI Serving Platforms
- Why Redis & RedisAI?
- Why Docker?
- Overview of the RedisEdge Stack
- Demo
Target Audience: MLOps & DevOps
5. What is an AI Database?
- A database built with the sole purpose of speeding up Machine Learning (ML) model training as well as model serving.
- AI databases simultaneously ingest, explore, analyze, and visualize fast-moving, complex data within milliseconds.
- Helps you wrangle the volume, velocity, and complex data-governance and management challenges associated with training ML and Deep Learning models, saving time and optimizing resources.
- Lowers costs and integrates ML models so that businesses can make more efficient, data-driven decisions.
6. The AI Pipeline is Data-Intensive
[Diagram: Data Sources (IoT, Business Processes) → Data Ingestion (High Performance, Transient, Centralized) → Data Preparation (Data Pre-Processing) → Model Training (Deep Learning Framework, Trained Model) → Model Serving (Deploy Trained Model, Inference)]
8. AI Pipeline Characteristics
[Diagram: Data Sources → Data Ingestion → Data Preparation → Model Training → Model Serving, annotated with Data Variety, Data Velocity, Data Volume, Data Quality, Data Access Latency, Data Caching, Response Time, and Throughput]
9. Challenges with Existing AI Serving Platforms
- Slow end-to-end inferencing
- Lack of high availability, leading to downtime
- Lack of platform-agnostic solutions
- Complexity in deploying multiple models
11. Introducing RedisAI
- A Redis module
- Provides tensors as a data type
- Turns Redis into a full-fledged deep learning model execution engine
- Runs on CPUs and GPUs
Data Structures:
- Tensor
- Model
- Script
DL/ML Backends:
- TensorFlow
- PyTorch
- ONNX
- TensorRT
$ docker run -p 6379:6379 -it --rm redisai/redisai
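To make the tensor data type concrete, here is a minimal sketch of exercising RedisAI from redis-cli against the container started above. The key names (`t1`, `m`, `a`, `b`, `c`) are arbitrary examples, and the exact model-command syntax varies by RedisAI version (newer releases rename these to AI.MODELSTORE/AI.MODELEXECUTE), so treat this as illustrative rather than a verbatim session:

```shell
# Start RedisAI (as on the slide)
docker run -d -p 6379:6379 --rm redisai/redisai

# Store a 1x2 float tensor and read it back
redis-cli AI.TENSORSET t1 FLOAT 1 2 VALUES 2 3
redis-cli AI.TENSORGET t1 VALUES

# A stored model is run the same way: AI.MODELSET loads a
# TensorFlow/PyTorch/ONNX graph blob, and AI.MODELRUN maps input
# tensor keys to output tensor keys, e.g.:
#   redis-cli AI.MODELRUN m INPUTS a b OUTPUTS c
```

Because tensors live in Redis alongside the model, the inference runs where the data already is, which is the point of the next slide.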
12. Benefits of RedisAI
- AI inferencing where your data lives
- Deploy new models with no downtime or performance penalties
- Serve AI over a robust, scalable, and production-proven platform
- Superior performance
- Built-in support for all major AI backends
- Runs everywhere
13. RedisEdge
- Bundles Open Source Redis + Redis Streams + Redis Modules
- A multi-model database built for the IoT edge
- Can ingest millions of writes per second with <1ms latency
- Small footprint (<5MB) that easily resides in constrained compute environments
- Runs on a variety of edge devices and sensors, ranging from ARM32- to x64-based hardware
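Like RedisAI above, the bundled stack can be pulled as a single container. The image name below (`redislabs/redisedge`) is the one Redis Labs published at the time of this talk; verify the current name and tag on Docker Hub before relying on it:

```shell
# Run the RedisEdge bundle (Redis + Streams + RedisAI,
# RedisTimeSeries and RedisGears modules) on the default port
docker run -d -p 6379:6379 --rm redislabs/redisedge:latest

# Confirm the modules are loaded
redis-cli MODULE LIST
```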
16. Around 94% of AI adopters are using or plan to use containers within a year.
Source: 451 Research
17. Running RedisEdge using Docker Compose
Built on Open Source
init - a service that initializes Redis with the RedisAI model, RedisTimeSeries downsampling rules, and the RedisGears gear.
capture - a service that captures video-stream frames from a webcam or an image/video file and stores them in a Redis Stream.
server - a web server that serves a rendered image composed of the raw frame and the model's detections.
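The three services above can be sketched as a Compose file. The service names follow the slide, but the image name, build contexts, and ports here are assumptions for illustration, not the demo project's actual file:

```yaml
version: "3"
services:
  redisedge:                 # Redis + RedisAI/RedisTimeSeries/RedisGears bundle
    image: redislabs/redisedge:latest
    ports:
      - "6379:6379"
  init:                      # loads the model, downsampling rules, and the gear
    build: ./init            # hypothetical build context
    depends_on:
      - redisedge
  capture:                   # pushes webcam/video frames into a Redis Stream
    build: ./capture         # hypothetical build context
    depends_on:
      - init
  server:                    # renders raw frames overlaid with detections
    build: ./server          # hypothetical build context
    ports:
      - "5000:5000"
    depends_on:
      - init
```

`depends_on` only orders startup here; the init service must still finish loading the model before capture and server can do useful work.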
18. Running RedisEdge using Docker Compose
Built on Open Source
Grafana + Redis Application
With GPU:
$ python init.py --device GPU