I'm a robotics engineer with a passion for control and AI. I have experience with a variety of robotic systems, including manipulators, mobile robots, and humanoid robots. I'm also interested in the intersection of robotics and AI, and in how machine learning can be leveraged to help robots perceive and interact with their environment.
Investigating machine learning and other techniques for object recognition and tracking with tactile sensing in robotics applications
Contributing to open-source robotics projects
Email: [email protected]
LinkedIn: Ayan Mazhitov
- Robotics: Kinematics, Dynamics, Control, Perception, Navigation
- Control Systems: Classical, modern, and nonlinear control
- Artificial Intelligence: Machine Learning, Deep Learning, Reinforcement Learning, PDDL, Description Logic (Protege)
- Programming Languages: Python, C++, MATLAB
- Tools and Frameworks: ROS, PyTorch, Gazebo, Simulink, Git, YARP
- Python libraries: sklearn, matplotlib, pandas, numpy, seaborn, FastAPI
- Snake robot: A simulated planar (2D) snake robot in Gazebo, controlled through position/effort controllers with tuned PID gains and integrated with MoveIt.
- Cartpole Deep Q-Network solution: An implementation of the Deep Q-Network (DQN) reinforcement learning algorithm, tested on OpenAI Gym's CartPole problem (a minimal training-loop sketch appears below these lists).
- Recursive Newton-Euler algorithm: A MATLAB implementation of the recursive Newton-Euler algorithm for robot dynamics, with several worked examples (the underlying two-pass recursions are summarized below these lists).
- icub_gazebo_sensitivity: Force control for the index finger of the iCub humanoid robot for tactile exploration, built on the YARP framework.
- Franka Emika Cartesian control: A ROS-based controller for commanding the Franka Emika robot in Cartesian space (an illustrative Jacobian-based sketch appears below these lists).
- Autonomous or manually controlled mobile robot: A Gazebo simulation of a mobile robot with a user interface that lets the user autonomously navigate the robot to specific coordinates, drive it manually with the keyboard, or drive it with collision-avoidance assistance (a goal-sending sketch appears below these lists).
- Mobile robot with RGB camera to inspect the environment: A Gazebo simulation of a mobile robot that uses an RGB camera to inspect an indoor environment; the robot's behavior is controlled by a state machine built with the SMACH library (a minimal SMACH sketch appears below these lists).
- Autonomous mobile robot: A ROS/Stage simulation of an autonomous mobile robot that completes a circular path without colliding with the walls.
- Human–robot handover with prior-to-pass soft/rigid object classification via tactile glove: Human–robot handovers are a challenging and fundamental aspect of physical human–robot interaction. This paper describes the design and implementation of a human–robot handover pipeline for the case in which both soft and rigid objects are passed from the human to the robot.
- Deformable object recognition using proprioceptive and exteroceptive tactile sensing: The somatosensory system combines several perception modalities and plays an important role in exploring and recognizing the environment. In this work, we show how proprioceptive (internal) and exteroceptive (external) tactile sensing can help distinguish deformable from rigid objects (an illustrative feature-fusion sketch appears below).
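
The CartPole project above is a DQN setup; as a minimal, illustrative sketch (not the repository's code), a core training loop in PyTorch against the classic `gym` CartPole-v1 API looks roughly like this. The network size, hyperparameters, and schedules here are simplified assumptions.

```python
import random
from collections import deque

import gym
import numpy as np
import torch
import torch.nn as nn
import torch.optim as optim

class QNet(nn.Module):
    """Small fully connected Q-network: state (4,) -> Q-values for 2 actions."""
    def __init__(self, obs_dim=4, n_actions=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                 nn.Linear(128, n_actions))

    def forward(self, x):
        return self.net(x)

env = gym.make("CartPole-v1")
q_net, target_net = QNet(), QNet()
target_net.load_state_dict(q_net.state_dict())
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
buffer = deque(maxlen=10_000)           # replay buffer of (s, a, r, s', done)
gamma, batch_size, eps = 0.99, 64, 1.0  # illustrative hyperparameters

for episode in range(300):
    state, done = env.reset(), False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < eps:
            action = env.action_space.sample()
        else:
            with torch.no_grad():
                action = q_net(torch.as_tensor(state, dtype=torch.float32)).argmax().item()
        next_state, reward, done, _ = env.step(action)
        buffer.append((state, action, reward, next_state, float(done)))
        state = next_state

        if len(buffer) >= batch_size:
            # Sample a minibatch and take one TD(0) gradient step.
            batch = random.sample(buffer, batch_size)
            s, a, r, s2, d = (torch.as_tensor(np.array(x), dtype=torch.float32)
                              for x in zip(*batch))
            q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
            with torch.no_grad():
                target = r + gamma * (1.0 - d) * target_net(s2).max(1).values
            loss = nn.functional.mse_loss(q, target)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    eps = max(0.05, eps * 0.99)  # decay exploration
    if episode % 10 == 0:
        target_net.load_state_dict(q_net.state_dict())  # periodic target update
```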
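The recursive Newton-Euler project implements the usual two-pass recursion; the textbook form of those passes (Craig-style notation, revolute joints, quantities expressed in the link frames) is summarized below. The MATLAB code's own frame and sign conventions may differ.

```latex
% Outward pass (base to tip, revolute joints): propagate velocities and accelerations.
\begin{aligned}
\omega_{i+1}     &= {}^{i+1}_{\;i}R\,\omega_i + \dot\theta_{i+1}\,\hat z_{i+1}\\
\dot\omega_{i+1} &= {}^{i+1}_{\;i}R\,\dot\omega_i
                    + {}^{i+1}_{\;i}R\,\omega_i \times \dot\theta_{i+1}\hat z_{i+1}
                    + \ddot\theta_{i+1}\,\hat z_{i+1}\\
\dot v_{i+1}     &= {}^{i+1}_{\;i}R\bigl(\dot\omega_i \times p_{i+1}
                    + \omega_i \times (\omega_i \times p_{i+1}) + \dot v_i\bigr)\\
\dot v_{C_{i+1}} &= \dot\omega_{i+1} \times p_{C_{i+1}}
                    + \omega_{i+1} \times (\omega_{i+1} \times p_{C_{i+1}}) + \dot v_{i+1}\\
F_{i+1}          &= m_{i+1}\,\dot v_{C_{i+1}}, \qquad
N_{i+1}           = I_{C_{i+1}}\dot\omega_{i+1}
                    + \omega_{i+1} \times I_{C_{i+1}}\omega_{i+1}
\end{aligned}

% Inward pass (tip to base): accumulate forces/moments and project onto the joint axes.
\begin{aligned}
f_i    &= {}^{\;i}_{i+1}R\,f_{i+1} + F_i\\
n_i    &= N_i + {}^{\;i}_{i+1}R\,n_{i+1} + p_{C_i} \times F_i
          + p_{i+1} \times {}^{\;i}_{i+1}R\,f_{i+1}\\
\tau_i &= n_i^{\top}\,\hat z_i
\end{aligned}
```

Gravity is typically folded in by initializing the base linear acceleration to an upward vector of magnitude g, rather than adding per-link gravity terms.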
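The Franka Emika controller itself is ROS-based; purely as an illustration of the underlying idea, here is a small numpy sketch of resolved-rate Cartesian control via the Jacobian pseudoinverse. The `jacobian` and `forward_kinematics` callbacks are hypothetical placeholders, not part of the repository.

```python
import numpy as np

def cartesian_velocity_step(q, x_desired, jacobian, forward_kinematics,
                            kp=1.0, dt=0.01):
    """One resolved-rate step: map a Cartesian position error to joint motion.

    q                     -- current joint configuration, shape (n,)
    x_desired             -- desired end-effector position, shape (3,)
    jacobian(q)           -- hypothetical callback returning the 3xn positional Jacobian
    forward_kinematics(q) -- hypothetical callback returning the current position (3,)
    """
    x = forward_kinematics(q)
    error = x_desired - x             # Cartesian position error
    v = kp * error                    # proportional Cartesian velocity command
    J = jacobian(q)
    q_dot = np.linalg.pinv(J) @ v     # least-squares joint velocities
    return q + q_dot * dt             # integrate one control step
```

Near singularities, a damped least-squares inverse is often substituted for the plain pseudoinverse to keep joint velocities bounded.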
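For the autonomous mode of the mobile-robot project, sending the robot to a coordinate typically looks like the snippet below. This sketch assumes a standard ROS `move_base` navigation stack and uses placeholder coordinates; the project's own interface may differ.

```python
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('goal_sender')

# Connect to the move_base action server provided by the navigation stack.
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

# Build a goal in the map frame; the coordinates below are placeholders.
goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0
goal.target_pose.pose.position.y = 1.5
goal.target_pose.pose.orientation.w = 1.0  # keep the default heading

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo('Navigation finished with state %s', client.get_state())
```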
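The inspection robot's behavior is coordinated with SMACH; a minimal example of the library's state-machine pattern is shown below. The state names, outcomes, and transitions are invented for illustration and do not mirror the project's actual states.

```python
import smach

class Inspect(smach.State):
    """Toy state standing in for one inspection step."""
    def __init__(self):
        smach.State.__init__(self, outcomes=['found', 'not_found'])

    def execute(self, userdata):
        # Real code would process camera data here.
        return 'found'

class MoveOn(smach.State):
    """Toy state standing in for driving to the next location."""
    def __init__(self):
        smach.State.__init__(self, outcomes=['arrived'])

    def execute(self, userdata):
        return 'arrived'

# Wire the states into a machine: inspect, move on if nothing is found, repeat.
sm = smach.StateMachine(outcomes=['done'])
with sm:
    smach.StateMachine.add('INSPECT', Inspect(),
                           transitions={'found': 'done', 'not_found': 'MOVE_ON'})
    smach.StateMachine.add('MOVE_ON', MoveOn(),
                           transitions={'arrived': 'INSPECT'})

outcome = sm.execute()
```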
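To illustrate the idea behind the deformable-object work (fusing proprioceptive and exteroceptive signals into one feature vector for a rigid-vs-deformable classifier), here is a toy sklearn sketch. The feature layout, the random placeholder data, and the choice of a random forest are assumptions made for illustration only, not the method or data from the paper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data standing in for recorded grasps:
#   proprioceptive features, e.g. joint-position deviations during a squeeze
#   exteroceptive features, e.g. tactile-array pressure statistics
n_samples = 200
proprio = rng.normal(size=(n_samples, 6))
extero = rng.normal(size=(n_samples, 12))
labels = rng.integers(0, 2, size=n_samples)   # 0 = rigid, 1 = deformable

# Fuse both modalities by simple feature concatenation.
features = np.hstack([proprio, extero])

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print('held-out accuracy:', clf.score(X_test, y_test))
```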
In my free time, I enjoy:
- Reading books on robotics and AI
- Building and tinkering with robots
- Watching sci-fi movies and TV shows
- Playing video games