
Neuro-Prosthetic EEG Controlled Robotic Arm

Neurotech@Davis

Data Collection Team: Abrianna Johnson, Avni Bafna (@AvniBafna), Deiva Dham, Grace Lim, Matangi Venkatakrishnan (@MatVenkat)

Hardware Team: Adit Jain, Dhruv Sangamwar (@dhruvsangamwar), Evan Silvey (@EvanSilvey)

Software Team: Ahmed Seyam (@AhmedSey03), Prakhar Sinha (@prakhargaming), Priyal Patel (@priyalpatell)

Introduction

Our project uses the rhythmic patterns in the EEG waveforms associated with each mental command. Each command is accompanied by mental imagery of the task, drawing on mirror neuron activity for motor-related tasks. A baseline for these commands is computed from a recorded “neutral” state in which the subject does not imagine any stimulus. All commands are recorded from the same channels, shown below on our EEG headset. The user trains each command to associate it with a robotic arm movement, individualizing their own mental commands.

Methods

  • Stimulus presentation began approximately 5 seconds before the training recording, giving the subject time to be primed and to form a mental image of the task.
  • Each mental-command recording lasted 7 seconds after the stimulus was presented. Following each presentation, the subject decided whether to accept or reject that training session based on their confidence in their mental imagery of the task.
  • During the neutral state, the subject refrained from making any association with the mental commands.
  • Commands were added one after another, allowing the subject to gain confidence in each task before training a new command.

Commands

  • The "grab" command was primed with a video depicting the slow, low-arousal action of lifting the subject’s phone.
  • The "left" command was primed with a video showing the slow, low-arousal movement of the subject’s water bottle to the left.
  • The "right" command was primed with a video portraying a fast, high-arousal stabbing motion with a pen to the right.
  • The "drop" command was primed with a video featuring a cartoon piano falling down, eliciting fast, high-arousal responses.

Pipeline

Using the built-in integration of Emotiv’s software with the Node-RED toolbox, we detect four mental commands in real time: lift, drop, left, and right. Each mental command is mapped to an integer: 1 for lift, 2 for drop, 3 for left, and 4 for right. The resulting integer is sent over a serial connection to the Arduino, as sketched below.
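
As a rough sketch of this protocol (the constant names are illustrative and not taken from the repository), the mapping can be expressed on the Arduino side as:

```cpp
// Integer codes assumed to be written by the Node-RED flow to the serial port.
// Names are ours; the repository may use different identifiers.
enum MentalCommand : int {
  CMD_LIFT  = 1,
  CMD_DROP  = 2,
  CMD_LEFT  = 3,
  CMD_RIGHT = 4
};
```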

Results/Conclusion

We use an Elegoo UNO R3 microcontroller, programmed with the Arduino IDE, to manage the hardware. Servo motors drive the movement of the fingers and the wrist. A serial connection between software and hardware relays mental commands from the Emotiv headset to the servo motors. Fishing line connects the servo motors to the pulley system, controlling tension and determining when the fingers grab. The Arduino IDE uploads the program that controls the hardware from mental-command inputs. A USB connection from the Elegoo UNO R3 to the computer running Node-RED establishes the software-to-hardware link over the serial port. Upon a valid mental command, the servo motors associated with that command activate to perform the instruction on the 3D-printed hand, as sketched below.
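
A minimal sketch of the Arduino side of this path, assuming two servos on pins 9 and 10, a 9600 baud serial link, and the integer codes above (pins, angles, and baud rate are our assumptions, not values from the repository):

```cpp
// Illustrative sketch: read an integer command from the serial port and drive servos.
#include <Servo.h>

Servo fingerServo;  // tensions the fishing line that closes the fingers
Servo wristServo;   // rotates the wrist left or right

void setup() {
  Serial.begin(9600);     // must match the baud rate configured in Node-RED
  fingerServo.attach(9);  // assumed PWM pins
  wristServo.attach(10);
}

void loop() {
  if (Serial.available() > 0) {
    int command = Serial.parseInt();  // integer code sent by the Node-RED flow
    switch (command) {
      case 1: fingerServo.write(150); break;  // lift/grab: pull the fishing line
      case 2: fingerServo.write(30);  break;  // drop: release tension
      case 3: wristServo.write(45);   break;  // left: rotate wrist left
      case 4: wristServo.write(135);  break;  // right: rotate wrist right
      default: break;                         // ignore anything else (e.g. parse timeouts)
    }
  }
}
```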
