
Wheelchair Control System Using Invisible Steering Gesture Based on LSTM

🚀 This is an original project by I Putu Krisna Erlangga, modified here to control a wheelchair.

This project is still under development. The wheelchair is controlled with an invisible steering wheel: the user makes hand gestures as if turning a wheel, an LSTM model classifies each gesture into one of five classes, and the corresponding command is sent to the wheelchair.
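The model architecture is not shown in this README. Below is a minimal sketch of the kind of LSTM gesture classifier described above, assuming MediaPipe hand keypoints as input (2 hands × 21 landmarks × x/y/z = 126 features per frame) and the five output classes; the layer sizes and feature count are assumptions, not taken from this repo:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, LSTM, Dense

SEQUENCE_LENGTH = 6    # frames per sequence (matches the dataset settings below)
NUM_FEATURES = 126     # assumption: 2 hands x 21 MediaPipe landmarks x (x, y, z)
NUM_CLASSES = 5        # the five gesture classes

model = Sequential([
    Input(shape=(SEQUENCE_LENGTH, NUM_FEATURES)),
    LSTM(64, return_sequences=True),   # pass the full sequence to the next LSTM
    LSTM(128),                         # collapse the sequence to one vector
    Dense(64, activation="relu"),
    Dense(NUM_CLASSES, activation="softmax"),  # class probabilities
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```

The softmax output can then be argmax-ed into one of the five commands sent to the wheelchair.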

🎬 Demo

(demo GIF)

🔨 Installation

Dependencies: scikit-learn, Keras, matplotlib, MediaPipe, TensorFlow, OpenCV, IPyKernel.

Please use separate files for collecting the dataset and for training, and a separate folder for the control script. venv setup for training:

  python --version
  python -m venv nama_venv
  nama_venv\Scripts\activate
  pip install opencv-python
  pip install mediapipe
  pip install numpy
  pip install matplotlib
  pip install tensorflow
  pip install seaborn
  pip install scikit-learn

Note that running the control script requires an ESP32 and the wheelchair hardware. venv setup for control:

  python --version
  python -m venv nama_venv
  nama_venv\Scripts\activate
  pip install mediapipe
  pip install opencv-python
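The README does not show how the classifier's output reaches the ESP32. A minimal sketch, assuming the ESP32 is connected over a USB serial port and expects single-byte commands; the port name, baud rate, command bytes, and the pyserial dependency are all assumptions, not part of this repo:

```python
def gesture_to_command(class_id: int) -> bytes:
    """Map a predicted class index to a hypothetical one-byte command.
    The actual byte protocol depends on the ESP32 firmware."""
    commands = {0: b"F", 1: b"B", 2: b"L", 3: b"R", 4: b"S"}  # assumed: forward/back/left/right/stop
    return commands.get(class_id, b"S")  # default to stop for safety

def send_to_wheelchair(class_id: int, port_name: str = "COM3", baud: int = 115200) -> None:
    import serial  # assumption: pyserial (pip install pyserial)
    with serial.Serial(port_name, baud, timeout=1) as port:
        port.write(gesture_to_command(class_id))
```

Defaulting unknown class indices to the stop command is a deliberate safety choice for a wheelchair.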

Collecting Dataset

Change the dataset folder name and the action classes to match your setup:

  DATA_PATH = os.path.join('p_ganti_ini_pake_namaDataset') # change this to your dataset folder name

  actions = np.array(["Follow", "Github", "Agung", "Hari", "Bos"])
  no_sequences = 50 # number of sequences per action
  sequence_length = 6 # number of frames captured per sequence
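With these settings, a common layout is to store each recorded frame as a .npy keypoint file under DATA_PATH/&lt;action&gt;/&lt;sequence&gt;/&lt;frame&gt;.npy. A small sketch of helpers for that layout (the exact folder structure is an assumption inferred from the variables above, not confirmed by this repo):

```python
import os
import numpy as np

def frame_path(data_path, action, sequence, frame_num):
    # <data_path>/<action>/<sequence>/<frame>.npy
    return os.path.join(data_path, action, str(sequence), f"{frame_num}.npy")

def save_frame(data_path, action, sequence, frame_num, keypoints):
    """Save one frame's flattened keypoint vector, creating folders as needed."""
    path = frame_path(data_path, action, sequence, frame_num)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    np.save(path, np.asarray(keypoints))
```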

Run all cells in AMBIL_DATASET_AGUNG. A camera feed like this should open:

(dataset collection screenshot)

After you have normalized the dataset, you can start training the model.
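The training notebook itself is not reproduced here. A sketch of how the saved sequences might be assembled into training arrays, assuming each frame was saved as DATA_PATH/&lt;action&gt;/&lt;sequence&gt;/&lt;frame&gt;.npy (an assumed layout):

```python
import os
import numpy as np

def load_dataset(data_path, actions, no_sequences, sequence_length):
    """Stack saved frames into X of shape (samples, sequence_length, features)
    and one-hot labels y of shape (samples, len(actions))."""
    sequences, labels = [], []
    for label, action in enumerate(actions):
        for seq in range(no_sequences):
            window = [np.load(os.path.join(data_path, action, str(seq), f"{frame}.npy"))
                      for frame in range(sequence_length)]
            sequences.append(window)
            labels.append(label)
    X = np.array(sequences)
    y = np.eye(len(actions))[labels]  # one-hot encode the class labels
    return X, y
```

The resulting X and y can be split with scikit-learn's train_test_split and fed to model.fit.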

💬 Feedback


A CNN-LSTM approach would likely perform better.
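For reference, one common way to combine the two is a Conv1D front end that extracts short-term patterns from the keypoint sequence before the LSTM models longer-range dynamics. A minimal sketch; the layer sizes and the 126-feature input are assumptions:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Conv1D, LSTM, Dense

cnn_lstm = Sequential([
    Input(shape=(6, 126)),  # (sequence_length, features), assumed
    Conv1D(64, kernel_size=3, padding="same", activation="relu"),  # local temporal features
    LSTM(64),                                # longer-range temporal dynamics
    Dense(5, activation="softmax"),          # five gesture classes
])
cnn_lstm.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```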

Authors


License

GitHub License