Hi, developer. Thanks for sharing this great repo. I have an Xsens motion capture suit, including the Manus gloves. However, I found that I cannot derive the joint states of the fingers. If I want to modify your code to achieve this, which documentation should I read? I can't find details about the finger joint names or about the data segments of the real-time network streaming. Looking forward to your reply.
As far as we know, MVN Studio only streams the link positions, not the joint states of the Manus fingers. Unfortunately, I was not able to find documentation on the topic.
What I would do is the following:

1. augment the link and joint states to accept the finger data;
2. use an IK algorithm to retrieve the joint state, knowing the link positions.
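The second step could be sketched roughly as below. This is only an illustrative example, not the actual Manus finger model: it assumes a planar two-joint finger with made-up phalanx lengths (`L1`, `L2`) and recovers the joint angles from a fingertip position with damped least-squares IK. The real fingers have more joints and 3D kinematics, but the approach is the same.

```python
# Hypothetical sketch: recover finger joint angles from a streamed link
# (fingertip) position via damped least-squares IK.
# The planar 2-joint model and link lengths are illustrative only.
import numpy as np

L1, L2 = 0.04, 0.03  # assumed phalanx lengths in metres (not from Xsens docs)

def forward(q):
    """Fingertip position of a planar 2-joint finger."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    """Analytic Jacobian of the fingertip position w.r.t. the joint angles."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def ik(target, q0, damping=1e-3, iters=200, tol=1e-8):
    """Damped least squares: iterate q += J^T (J J^T + lambda*I)^-1 * error."""
    q = np.asarray(q0, dtype=float).copy()
    for _ in range(iters):
        error = target - forward(q)
        if np.linalg.norm(error) < tol:
            break
        J = jacobian(q)
        q += J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), error)
    return q

# Example: recover a joint state that reproduces a known fingertip position.
q_true = np.array([0.4, 0.6])
target = forward(q_true)
q_est = ik(target, q0=np.array([0.1, 0.1]))
```

Note that IK solutions are generally not unique (e.g. elbow-up vs. elbow-down configurations), so one would validate the result by checking the reproduced link position, and warm-start each frame from the previous frame's solution to keep the motion continuous.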
If I have any updates, I will let you know by replying to this issue.