Deep Chair 🪑

Deep learning-based facial-gesture analysis system for motorized wheelchair control. Final BSc Computer Science Project, UPSIN (2018).

Team

  • Adrián Constante
  • Brandon Vizcarra
  • Erick Sámano

Project Overview

Deep Chair uses computer vision and deep learning to analyze facial gestures in real time, translating them into motor and direction commands for a motorized wheelchair and providing feedback on movement.

Features

  • 📹 Real-time Analysis: Live facial-gesture detection from a webcam feed (a capture/preprocessing sketch follows this list)
  • 🧠 Deep Learning: CNN-based facial-gesture classification
  • 🎯 Accuracy: 94% classification accuracy
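
The capture and preprocessing step can be sketched roughly as follows with OpenCV. The webcam index, the Haar-cascade face detector, the `preprocess_frame` helper, and the 224×224 crop size are illustrative assumptions, not the project's exact code:

```python
# Hypothetical sketch of the webcam capture and preprocessing stage.
import cv2

# OpenCV ships a frontal-face Haar cascade; used here to crop the face region.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def preprocess_frame(frame, size=224):
    """Detect the largest face in a BGR frame and return a cropped, resized patch (or None)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    # Keep the largest detection (assumed to be the wheelchair user).
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    return cv2.resize(frame[y:y + h, x:x + w], (size, size))

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # assumed default webcam index
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        patch = preprocess_frame(frame)
        if patch is not None:
            cv2.imshow("face", patch)  # classification would run on `patch`
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```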

Architecture

Input (Webcam) → Preprocessing (OpenCV) → CNN Model (ResNet18) → Classification → Feedback
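
As a rough sketch, the classification stage of this pipeline could map to PyTorch as shown below. The gesture class names, the replaced ResNet18 final layer, and the ImageNet-style normalization are assumptions, since the README does not spell out the model head:

```python
# Illustrative sketch of the CNN classification stage; class names and head size are assumptions.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Assumed gesture-to-command mapping; the actual label set is not documented here.
CLASSES = ["forward", "backward", "left", "right", "stop"]

# ResNet18 backbone with its final fully connected layer replaced for our class count.
model = models.resnet18(weights=None)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
model.eval()

# Standard ImageNet-style preprocessing; the project's exact normalization may differ.
to_tensor = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(face_bgr):
    """Return the predicted gesture label for a 224x224 BGR face crop."""
    rgb = face_bgr[:, :, ::-1].copy()          # BGR (OpenCV) -> RGB
    x = to_tensor(rgb).unsqueeze(0)            # add batch dimension
    with torch.no_grad():
        logits = model(x)
    return CLASSES[int(logits.argmax(dim=1))]
```

The predicted label would then be mapped to the corresponding motor/direction command in the feedback step.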

Technologies

  • PyTorch
  • OpenCV
  • NumPy/Pandas
  • Matplotlib

License

MIT License
