This project was completed at the Heracia Lab at UTA.
Researching and developing assistive robots for paraplegic users and users with severe motor impairments is challenging because the control interface must be hands-free. This paper focuses on developing a head-gesture interface that recognizes simple head gestures and interprets them to perform translation operations with a robot arm. A state diagram is developed to navigate between the different control states.
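As a rough illustration of how such a state-driven interface might be structured, the sketch below implements a minimal gesture-to-command state machine in Python. The gesture names (nod, shake, tilt), the state names, and the translation command strings are illustrative assumptions, not the states or gestures from the paper's actual state diagram.

```python
from enum import Enum, auto
from typing import Optional


class State(Enum):
    """Hypothetical control states; the paper's state diagram may differ."""
    IDLE = auto()         # waiting for an activation gesture
    SELECT_AXIS = auto()  # choosing which axis to translate along
    MOVE = auto()         # issuing translation commands to the arm


class HeadGestureController:
    """Maps recognized head gestures to robot-arm translation commands."""

    AXES = ["x", "y", "z"]

    def __init__(self) -> None:
        self.state = State.IDLE
        self.axis_index = 0

    def handle_gesture(self, gesture: str) -> Optional[str]:
        """Advance the state machine; return a command string or None."""
        if self.state == State.IDLE:
            if gesture == "nod":            # nod activates the interface
                self.state = State.SELECT_AXIS
        elif self.state == State.SELECT_AXIS:
            if gesture == "tilt_right":     # cycle through axes
                self.axis_index = (self.axis_index + 1) % len(self.AXES)
            elif gesture == "nod":          # confirm axis, start moving
                self.state = State.MOVE
            elif gesture == "shake":        # cancel back to idle
                self.state = State.IDLE
        elif self.state == State.MOVE:
            axis = self.AXES[self.axis_index]
            if gesture == "tilt_up":
                return f"translate +{axis}"
            if gesture == "tilt_down":
                return f"translate -{axis}"
            if gesture == "shake":          # stop and return to axis selection
                self.state = State.SELECT_AXIS
        return None


if __name__ == "__main__":
    ctrl = HeadGestureController()
    for g in ["nod", "tilt_right", "nod", "tilt_up", "shake"]:
        cmd = ctrl.handle_gesture(g)
        print(f"{g} -> {ctrl.state.name} {cmd or ''}")
```

A dedicated confirmation gesture (the nod) before entering the MOVE state is one simple way to reduce unintended arm motion from spurious detections; the paper's actual transition logic should be consulted for the states it defines.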