Project Owner: Jeremy Siburian
Last Updated: September 28, 2023
This documentation covers the following:
- Overview
- Hardware & Software Requirements
- Folder Navigation & Program Explanation
- How to Run Programs
- Robot Setup (Listen Node)
- Sensor Setup
- Related Links
Note:
This documentation is still under construction; some content may be missing.
Future Updates:
- Move the detailed documentation into the GitHub Wiki, so that the README contains only high-level information.
This repository contains code for controlling a Robotiq gripper mounted on an Omron TM robot using uSkin tactile sensors.
This codebase was created as part of a 6-month internship at the Plant Automation Team, Vehicle Manufacturing Engineering Japan, Daimler Trucks Asia.
Project Title:
uSkin Tactile Sensors for Adaptive Grasping & Bin Picking Solution
For the full documentation of the project, please follow the link here.
This project requires the following hardware:
- 4x6 uSkin tactile sensors from XELA Robotics
- Robotiq 2F-85 gripper
- Omron TM5-900 collaborative robot
Required Python modules:
- numpy
- matplotlib
- scipy
- scikit-learn
- keyboard
- dearpygui
- msvcrt (Windows) / getch (Linux)
- pyserial
- crcmod
- techmanpy
plus the following socket I/O modules, as specified in the XELA Robotics software manual:
- websocket-client
- websockets
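As a quick sanity check (a hypothetical helper, not part of the repository), the snippet below verifies that the imports resolve. Note that some packages are installed under a different name than they are imported: pyserial is imported as serial, scikit-learn as sklearn, and websocket-client as websocket.

```python
import importlib.util

# Import names for the modules listed above (pyserial -> serial,
# scikit-learn -> sklearn, websocket-client -> websocket);
# platform-specific modules (msvcrt/getch) are omitted.
REQUIRED = ["numpy", "matplotlib", "scipy", "sklearn", "keyboard",
            "dearpygui", "serial", "crcmod", "techmanpy",
            "websocket", "websockets"]

missing = [name for name in REQUIRED if importlib.util.find_spec(name) is None]
print("Missing modules:", ", ".join(missing) if missing else "none")
```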
Explanation of each folder in the repository:
- GripperControl --> Controls the Robotiq gripper via a serial COM port (see the sketch after this list).
- RobotControl --> Controls robot movement using the techmanpy communication driver.
- TM_Demo --> Demo programs for controlling the gripper and TM robot based on uSkin feedback.
- TwoSensorControl --> Prototype for the two-sensor clustering model.
- SensorUtils --> Clustering middleware and XELA Robotics utility programs.
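As an illustration of the kind of serial traffic GripperControl handles, here is a minimal sketch that activates a Robotiq 2F-85 over Modbus RTU using pyserial and crcmod. The COM port is a placeholder, the framing (slave ID 9, request registers at 0x03E8) follows Robotiq's Modbus RTU documentation, and the actual scripts in GripperControl may differ.

```python
import serial               # pyserial
import crcmod.predefined    # provides the predefined "modbus" CRC-16

crc16 = crcmod.predefined.mkCrcFun("modbus")

def with_crc(frame: bytes) -> bytes:
    """Append the Modbus CRC-16 (low byte first) to a raw frame."""
    crc = crc16(frame)
    return frame + bytes([crc & 0xFF, (crc >> 8) & 0xFF])

# Function 0x10 (write multiple registers) at address 0x03E8:
# setting rACT = 1 activates the gripper.
ACTIVATE = with_crc(bytes([0x09, 0x10, 0x03, 0xE8, 0x00, 0x03,
                           0x06, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00]))

with serial.Serial("COM3", 115200, timeout=1) as ser:  # placeholder port
    ser.write(ACTIVATE)
    print(ser.read(8).hex())  # acknowledgement from the gripper
```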
To combine sensor feedback, gripper control, and robot movement, two control architectures were prototyped:
- Python-Only Control (TMflow in Listen Node)
- Python + TMflow (Robot movement is controlled in TMflow)
Prerequisites:
- In the TMflow software, a Listen Node must be active.
- In the code editor, "clustering_middleware.py" must be kept running to access high-level data from the middleware.
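The interface that "clustering_middleware.py" exposes is defined in the code itself; purely to illustrate the pattern (middleware running as a local process, demo script acting as a client), a reader might look like the sketch below. The host, port, and JSON message layout are all assumptions, not the actual interface.

```python
import json
import socket

# Hypothetical client only; the real clustering_middleware.py transport,
# port, and message format may differ.
HOST, PORT = "127.0.0.1", 5005  # assumed local endpoint

with socket.create_connection((HOST, PORT), timeout=5) as sock:
    payload = sock.recv(4096)                       # one message from the middleware
    clusters = json.loads(payload.decode("utf-8"))  # assumed JSON payload
    print(clusters)
```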
Python-Only Control (TMflow in Listen Node):
In this control architecture, the robot movements are fully controlled from a Python script; TMflow is left running a Listen Node only. A minimal motion sketch follows the steps below.
Main demo program(s):
- TM_trial_no_slip.py
- TM_trial_slip.py
To run a demo/trial program, the steps are the following:
- In the TMflow software, a Listen Node must be active in order to receive robot commands from the Python script.
- In Visual Studio Code, "clustering_middleware.py" must be kept running at all times to access high-level data from the middleware.
- Once the Listen Node is active and the clustering middleware is running, the main demo/trial programs can be executed.
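For a flavor of how a Python script commands the robot through the Listen Node, below is a minimal motion sketch adapted from the techmanpy README; the robot IP is a placeholder, and the actual demo programs are more involved.

```python
import asyncio
import techmanpy

async def move():
    # Connect to the script (SCT) interface served while a Listen Node is active.
    async with techmanpy.connect_sct(robot_ip="192.168.10.2") as conn:  # placeholder IP
        # Point-to-point move to the given joint angles at 10% speed,
        # with a 200 ms acceleration duration.
        await conn.move_to_joint_angles_ptp([10, 20, 30, 40, 50, 60], 0.10, 200)

asyncio.run(move())
```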
Python + TMflow (Robot movement controlled in TMflow):
In this control architecture, the robot movement is handled entirely within TMflow. Its main purpose is to prototype picking using the TM robot's built-in vision features; a rough socket sketch follows the program list below.
Main demo program(s):
- TM_socket_test.py
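As a rough sketch of the socket communication that TM_socket_test.py exercises (the script itself is not reproduced here), the snippet below frames a command for the robot's external-script interface: a $TMSCT packet terminated by an XOR checksum, sent to the Listen Node's default port 5890. The robot IP is a placeholder, and the TM Expression Editor manual is the authoritative reference for the packet format.

```python
import socket

ROBOT_IP, PORT = "192.168.10.2", 5890  # placeholder IP; default Listen Node port

def tmsct_packet(script: str, handle_id: str = "1") -> bytes:
    """Frame a script command as $TMSCT,<len>,<id>,<script>,*<checksum>."""
    data = f"{handle_id},{script}"
    body = f"TMSCT,{len(data)},{data},"
    checksum = 0
    for byte in body.encode("ascii"):
        checksum ^= byte  # XOR checksum over everything between '$' and '*'
    return f"${body}*{checksum:02X}\r\n".encode("ascii")

with socket.create_connection((ROBOT_IP, PORT), timeout=5) as sock:
    sock.sendall(tmsct_packet("QueueTag(1)"))  # example script command
    print(sock.recv(1024).decode("ascii", errors="replace"))
```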
For a more detailed explanation of how to prepare the robot, please see the Listen Node and techmanpy documentation.
For a more detailed explanation of how to set up the sensors, please refer to the official documentation on the XELA Robotics website.
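For orientation only, reading the raw sensor stream might look like the sketch below, assuming the XELA server software is running locally and serving data over a websocket. The URL and message format are assumptions; defer to the XELA Robotics software manual for the actual interface.

```python
import json
import websocket  # websocket-client

# Assumed local XELA server endpoint; check the XELA software manual for
# the actual host, port, and message schema.
ws = websocket.create_connection("ws://127.0.0.1:5000")
try:
    frame = json.loads(ws.recv())  # one frame of taxel data (assumed JSON)
    print(frame)
finally:
    ws.close()
```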
Below is a compilation of useful links related to the project.
| Related Link | Description |
|---|---|
| techmanpy | Python communication driver for Techman Robots. |
| Listen Node (Manual) | Official documentation on the Listen Node (how to activate it, how to send external scripts, etc.). |
| TM Robot TCP/IP | YouTube tutorial video on how to set up a TCP/IP connection for TM robots. |
| XELA Sensors & Software Introduction | YouTube tutorial covering sensor setup, use of the software, integration with third-party software, and general troubleshooting. |