- A real-time SIBI (Sistem Isyarat Bahasa Indonesia / Indonesian Sign Language System) sign language recognition system, built for my bachelor thesis.
- Built with Jupyter Notebook and OpenCV.
- Uses six different dynamic SIBI gestures as action classes/labels, namely "apa/what", "bagaimana/how", "berapa/how many", "di mana/where", "mengapa/why", and "siapa/who".
- MediaPipe is used to collect the dataset by extracting hand keypoint values, which are stored as NumPy arrays.
- A total of 180 original samples (30 per class) were expanded with four augmentation variants into 3,060 samples (510 per class).
- The model consists of three LSTM layers and three Dense layers, trained on a split of 2,616 training, 153 testing, and 291 validation samples. After 150 epochs of training, it reaches a categorical accuracy of 99.85%, a loss of 0.0059, and an overall score of 100% on the evaluation metrics.
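The data representation described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the 30-frame sequence length and the function name `flatten_keypoints` are assumptions. MediaPipe Hands reports 21 landmarks per hand, each with x, y, z coordinates, so one frame flattens to a 63-value vector, and a gesture sample is a stack of such frames:

```python
import numpy as np

SEQUENCE_LENGTH = 30   # assumed number of frames captured per gesture sample
NUM_LANDMARKS = 21     # landmarks reported by MediaPipe Hands per hand
COORDS = 3             # x, y, z coordinates per landmark

def flatten_keypoints(landmarks):
    """Flatten a (21, 3) landmark array into a 63-value feature vector.
    Returns zeros when no hand is detected (landmarks is None)."""
    if landmarks is None:
        return np.zeros(NUM_LANDMARKS * COORDS, dtype=np.float32)
    return np.asarray(landmarks, dtype=np.float32).flatten()

# One gesture sample: a sequence of per-frame keypoint vectors.
sample = np.stack([flatten_keypoints(np.random.rand(NUM_LANDMARKS, COORDS))
                   for _ in range(SEQUENCE_LENGTH)])
print(sample.shape)  # (30, 63)

# The split sizes quoted in the README add up to the augmented total:
assert 2616 + 153 + 291 == 3060 == 6 * 510
```

Sequences of this shape are what the LSTM layers consume, one 63-value keypoint vector per time step.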
- Clone the repository by running the following command in your terminal or Git Bash:
```shell
git clone https://github.com/BlingBong/SIBI-SignLanguageRecognition.git
```
- Make sure you have Jupyter Notebook installed on your computer; if not, follow the Jupyter Install instructions.
- Open your terminal in the downloaded folder.
- Run `python -m notebook` (the exact command may differ by operating system and Jupyter version).
- Wait for the command to finish; Jupyter Notebook will open in a new browser tab.
- Open `SIBI SLR.ipynb` in the Jupyter Notebook tab. You can now run the code!
If you encounter any errors or issues, refer to the repository's "Issues" section or contact the repository's maintainer for help.
| Description | Screenshot |
|---|---|
| Data Collection | ![]() |
| Real-Time Testing | ![]() |
| Model Loss Graph | ![]() |
| Model Accuracy Graph | ![]() |
