DAMS - Driver Alertness Monitoring System
Real-time drowsiness detection using computer vision and AI
Key Metrics
Warning Time: 5-10 s
False Positive Rate: <5%
Frame Rate: 30+ FPS
Problem Statement
Commercial vehicle accidents caused by driver drowsiness cost billions annually and endanger lives, yet traditional monitoring systems rely on expensive dedicated hardware and suffer from high false positive rates.
Overview
A real-time driver monitoring system that detects drowsiness and distraction using computer vision and machine learning to help prevent accidents. The system tracks eye state, mouth movement, and head pose, fusing the signals through weighted queue algorithms to minimize false positives.
My Role & Contributions
Computer Vision Engineer - designed the multi-modal detection system, implemented the weighted queue algorithms, integrated the MediaPipe and TensorFlow Lite models, and built the AWS cloud integration for fleet management.
Tech Stack
Python, MediaPipe, TensorFlow Lite (INT8-quantized), boto3 (AWS S3), SQLite, Raspberry Pi 4
Challenges & Solutions
Challenge
Eliminating false alarms across varying lighting (day/night) and occlusions (sunglasses, masks) while holding the false positive rate under 5% at 30 FPS
Solution
Engineered weighted moving average queues: eyes (15 frames, 0.4 weight), mouth (25 frames, 0.35 weight), head pose (50 frames, 0.25 weight) with adaptive thresholding
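The fusion step above can be sketched as follows. Class and function names are illustrative (not from the project's code), and the per-frame modality scores (0.0 alert, 1.0 drowsy) are an assumed convention; the queue lengths and weights are the ones described above:

```python
from collections import deque


class ModalityQueue:
    """Fixed-length queue of per-frame scores for one modality (eyes, mouth, head)."""

    def __init__(self, maxlen: int, weight: float):
        self.frames = deque(maxlen=maxlen)  # old frames fall off automatically
        self.weight = weight                # this modality's share of the fused score

    def push(self, score: float) -> None:
        self.frames.append(score)

    def mean(self) -> float:
        return sum(self.frames) / len(self.frames) if self.frames else 0.0


# Queue lengths and fusion weights from the tuning described above.
eyes = ModalityQueue(15, 0.40)
mouth = ModalityQueue(25, 0.35)
head = ModalityQueue(50, 0.25)


def fused_score() -> float:
    """Weighted sum of the per-modality moving averages."""
    return sum(q.mean() * q.weight for q in (eyes, mouth, head))
```

Because the windows are long relative to single-frame noise (up to 50 frames for head pose), a brief blink or glance down barely moves the fused score, which is what keeps the false positive rate low.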
Challenge
Deploying production-grade system on Raspberry Pi 4 with <100ms inference using TFLite without sacrificing accuracy
Solution
Quantized MediaPipe and custom TFLite models to INT8, leveraging NEON SIMD on ARM; implemented frame buffer pooling to reduce GC overhead
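A minimal sketch of the frame buffer pooling idea: pre-allocate a fixed set of frame-sized buffers at startup so the steady-state capture loop allocates nothing and the garbage collector stays quiet. Names and the byte-buffer representation are assumptions for illustration; the real system would size buffers to the camera format (e.g. 640x480x3 for BGR):

```python
import queue


class FramePool:
    """Fixed pool of reusable frame buffers; acquire blocks when all are in flight."""

    def __init__(self, count: int, frame_bytes: int):
        self._free = queue.Queue()
        for _ in range(count):
            self._free.put(bytearray(frame_bytes))  # allocate once, up front

    def acquire(self) -> bytearray:
        return self._free.get()  # blocks until a buffer is released

    def release(self, buf: bytearray) -> None:
        self._free.put(buf)  # return the buffer for reuse, no new allocation
```

Using `queue.Queue` also makes the pool safe to share between the capture thread and the inference thread.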
Challenge
Building fault-tolerant incident recording with 5s pre-buffer, threaded S3 upload, and graceful degradation on network failures
Solution
Built circular buffer with mmap for zero-copy I/O, async multipart upload with boto3 TransferManager, and local SQLite queue for offline persistence
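The local SQLite queue for offline persistence could look roughly like this; the schema, table, and method names are assumptions, not the project's actual code. Incident clips are enqueued on disk first, so a pending upload survives network loss and process restarts, and the uploader drains the queue in FIFO order:

```python
import sqlite3
import time


class UploadQueue:
    """Durable FIFO of incident clips awaiting upload (e.g. to S3)."""

    def __init__(self, path: str = "uploads.db"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS pending ("
            " id INTEGER PRIMARY KEY, clip_path TEXT, created REAL)"
        )
        self.db.commit()

    def enqueue(self, clip_path: str) -> None:
        self.db.execute(
            "INSERT INTO pending (clip_path, created) VALUES (?, ?)",
            (clip_path, time.time()),
        )
        self.db.commit()  # durable before we consider the incident recorded

    def next_pending(self):
        """Oldest queued clip as (id, path), or None when drained."""
        return self.db.execute(
            "SELECT id, clip_path FROM pending ORDER BY id LIMIT 1"
        ).fetchone()

    def mark_done(self, row_id: int) -> None:
        """Delete a row only after the upload has fully succeeded."""
        self.db.execute("DELETE FROM pending WHERE id = ?", (row_id,))
        self.db.commit()
```

The key design choice is deleting a row only after a confirmed upload: a crash mid-transfer leaves the clip queued, so the worst case is a duplicate upload rather than a lost incident.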