We have built proofs of concept for the following:
- Virtual Doodle
- Self Checkout
- Classroom
Virtual Doodle
The user moves their hand like a magician's wand to turn their imagination into real artwork. Doodles can also be saved to the doodle gallery afterward.
Classroom
This tool offers touch-free navigation for teachers using common desktops to deliver lectures in the classroom.
Grocery Self Checkout
The self-checkout provides touch-free navigation and form filling using speech and gesture recognition.
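At its core, the hand-as-wand interaction maps a detected fingertip position from camera-frame coordinates to screen coordinates, with some smoothing to keep the cursor or brush stroke steady. A minimal sketch of that idea (the frame/screen resolutions and smoothing factor below are illustrative assumptions, not values taken from the project):

```python
# Sketch: map a fingertip position from camera-frame coordinates to screen
# coordinates, with simple exponential smoothing to reduce jitter.
# The resolutions and smoothing factor are illustrative assumptions.

FRAME_W, FRAME_H = 640, 480      # assumed webcam resolution
SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution
SMOOTHING = 0.3                  # 0 = frozen cursor, 1 = no smoothing

def to_screen(x, y):
    """Linearly map a point in the camera frame to the screen."""
    return (x * SCREEN_W / FRAME_W, y * SCREEN_H / FRAME_H)

def smooth(prev, new, alpha=SMOOTHING):
    """Blend the previous position toward the new one to reduce jitter."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))

# A fingertip at the centre of the frame lands at the centre of the screen.
print(to_screen(320, 240))  # (960.0, 540.0)
```

In the actual app, the fingertip coordinate would come from a hand-landmark model such as MediaPipe Hands, and the mapped position would drive either the drawing canvas or the mouse (via AutoPy).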
- Python
- OpenCV
- Mediapipe
- Django
- HTML/CSS/Bootstrap
- AutoPy
- SpeechRecognition
Django
$ pip install django
SpeechRecognition
$ pip install SpeechRecognition
MediaPipe
$ pip install mediapipe
PyAudio
$ pip install PyAudio
AutoPy
$ pip install autopy
OpenCV
$ pip install opencv-python
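The packages above are typically listed together in a requirements.txt file so they can be installed in one step, which is what the setup instructions below rely on. A sketch of such a file (unpinned; check the repository's own requirements.txt for the exact versions it uses):

```shell
# requirements.txt (illustrative; the repository's file is authoritative)
Django
SpeechRecognition
mediapipe
PyAudio
autopy
opencv-python
```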
The first thing to do is to clone the repository:
$ git clone https://github.com/S-JZ/TouchMeNot.git
Create a virtual environment to install the dependencies in, and activate it (recent versions of virtualenv isolate from system packages by default, so the old --no-site-packages flag is no longer needed):
$ virtualenv env
$ source env/bin/activate
Then install the dependencies:
(env)$ pip install -r requirements.txt
Note the (env) prefix in front of the prompt. It indicates that this terminal session operates in a virtual environment set up by virtualenv.
Once pip has finished installing the dependencies:
(env)$ cd TouchMeNot
(env)$ python manage.py runserver
Then navigate to http://127.0.0.1:8000/.
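Once the app is running, the self-checkout fills form fields from recognised speech. A rough sketch of how a recognised transcript could be routed to form fields (the field names and trigger phrases here are assumptions for illustration, not the project's actual grammar):

```python
# Sketch: route a recognised speech transcript to self-checkout form fields.
# Field names and trigger phrases are illustrative assumptions.

FIELD_TRIGGERS = {
    "name": "my name is",
    "phone": "my number is",
    "quantity": "quantity",
}

def fill_fields(transcript):
    """Extract form-field values from a transcript, keyed by field name."""
    transcript = transcript.lower()
    form = {}
    for field, trigger in FIELD_TRIGGERS.items():
        if trigger in transcript:
            # Take the words following the trigger phrase as the value.
            form[field] = transcript.split(trigger, 1)[1].strip()
    return form

print(fill_fields("My name is Asha"))  # {'name': 'asha'}
```

In the real app, the transcript would come from the SpeechRecognition library (e.g. `Recognizer().recognize_google(...)` over microphone audio captured with PyAudio) before being parsed like this.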