This repository presents an approach to enhancing human-machine interaction through real-time facial gesture recognition, combined with EEG signals for controlling prosthetic devices. The goal is to leverage brain-computer interface (BCI) technologies to create a more intuitive user experience.
- Real-Time Facial Gesture Recognition: Developed a model to detect and interpret facial gestures (such as eye blinks), enabling control of assistive technologies (see the first sketch after this list).
- EEG Signal Processing: Applied Independent Component Analysis (ICA), Principal Component Analysis (PCA), and Linear Discriminant Analysis (LDA) to extract meaningful motor execution signals from EEG data (see the second sketch after this list).
- Exploration of Motor Imagery: Investigated the potential of motor imagery for enhancing prosthetic control, providing insights into the challenges and opportunities within BCI systems.
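As a rough illustration of one common facial-gesture signal, here is a minimal blink-detection sketch based on the eye aspect ratio (EAR). MediaPipe FaceMesh, the landmark indices, and the 0.21 threshold are illustrative assumptions, not necessarily what this repository's model uses:

```python
# Minimal EAR-based blink detection sketch (assumed approach, for illustration).
import cv2
import mediapipe as mp
import numpy as np

LEFT_EYE = [33, 160, 158, 133, 153, 144]  # p1..p6 landmarks around the left eye (assumed indices)
EAR_THRESHOLD = 0.21                      # illustrative; tune per camera and user

def eye_aspect_ratio(pts):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = pts
    return (np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)) / (2.0 * np.linalg.norm(p1 - p4))

cap = cv2.VideoCapture(0)
with mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            h, w = frame.shape[:2]
            pts = np.array([[lm[i].x * w, lm[i].y * h] for i in LEFT_EYE])
            if eye_aspect_ratio(pts) < EAR_THRESHOLD:
                print("blink gesture detected")
cap.release()
```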
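And a hedged sketch of the ICA → PCA → LDA chain on epoched EEG using scikit-learn. The synthetic data, epoch shapes, and log-variance features are placeholders; a real pipeline would load band-pass-filtered epochs (for example via MNE) instead:

```python
# Sketch of an ICA -> PCA -> LDA pipeline on epoched EEG (shapes and data are placeholders).
import numpy as np
from sklearn.decomposition import FastICA, PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_epochs, n_channels, n_times = 120, 32, 256
X = rng.standard_normal((n_epochs, n_channels, n_times))  # stand-in for real EEG epochs
y = rng.integers(0, 2, size=n_epochs)                     # e.g., rest vs. movement labels

# Fit ICA once on the concatenated recording, then unmix each epoch.
ica = FastICA(n_components=16, random_state=0, max_iter=1000)
ica.fit(np.concatenate(X, axis=1).T)                      # samples x channels
sources = np.stack([ica.transform(epoch.T).T for epoch in X])

# Log-variance of each independent component is a simple per-epoch feature.
features = np.log(sources.var(axis=2))                    # (n_epochs, 16)

# PCA for dimensionality reduction, LDA as the linear classifier.
clf = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
print("5-fold CV accuracy: %.2f" % cross_val_score(clf, features, y, cv=5).mean())
```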
- Programming Languages: Python, MATLAB
To set up the project locally, clone this repository using:
```bash
git clone https://github.com/Pianissimo-3115/EEG-project
```
To train the model, refer to the Colab notebook: Google Colab Notebook (includes eye blink detection).
Feel free to explore the code; feedback and contributions are welcome.