
How can the output data be converted to bvh format? #3

Open
makeme-zgz opened this issue Jun 29, 2022 · 13 comments
Labels
help wanted Extra attention is needed

Comments

@makeme-zgz

I'm trying to convert the rotations and joint positions into BVH format so that I can do better visualization. I can see that there is an IK method in the motion_process.py file which might give me the local rotation information, but the result turns out not to be correct. I fed in the joint positions returned by the recover_from_ric method.

Any help or hints would be appreciated.

@EricGuo5513
Owner

EricGuo5513 commented Jun 29, 2022

Hi, I tried many times to visualize the results in BVH format but failed; it is more complicated than I supposed, so I would be glad if you can figure it out. As far as I know, BVH uses the Euler-angle rotation representation.

Use the following code:

from common.skeleton import Skeleton
import numpy as np
import os
import torch
from common.quaternion import *
from utils.paramUtil import *

# data_dir and example_id should point to one of your own samples
example_data = np.load(os.path.join(data_dir, example_id + '.npy'))
example_data = example_data.reshape(len(example_data), -1, 3)
example_data = torch.from_numpy(example_data)
skel = Skeleton(n_raw_offsets, kinematic_chain, 'cpu')
offsets = skel.get_offsets_joints(example_data[0])

face_joint_indx = [2, 1, 17, 16]
quat_params = skel.inverse_kinematics_np(example_data, face_joint_indx, smooth_forward=False)

Then quat_params are the local rotations in quaternion representation. Note that the root rotations are with respect to the Z+ direction. Hope it helps.
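Since BVH ultimately needs Euler angles, the quat_params from the IK call above still have to be converted per joint. Below is a minimal NumPy sketch under one common angle convention; BVH channel orders vary per file, so check it against the CHANNELS order of your target skeleton:

```python
import numpy as np

def quat_to_euler_zyx(q):
    # Convert a unit quaternion (w, x, y, z) into intrinsic Z-Y-X Euler
    # angles in degrees. BVH channel orders differ between files (ZXY is
    # also common), so match this to the CHANNELS line of your template.
    w, x, y, z = q
    roll = np.arctan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y))   # X
    pitch = np.arcsin(np.clip(2 * (w * y - z * x), -1.0, 1.0))        # Y
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))    # Z
    return np.degrees([yaw, pitch, roll])
```

Applying this per frame and per joint to quat_params yields the rotation channels of a BVH MOTION block, assuming the channel order matches.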

@makeme-zgz
Author

Thank you very much for the reply! A quick question about the example data: is it the global or local positions of the joints? And what would the first entry look like?

@EricGuo5513
Copy link
Owner

Hi, the example data uses the global positions of the joints, in the same format as what you obtain through the recover_from_ric method. You may also need to add these two lines:

n_raw_offsets = torch.from_numpy(t2m_raw_offsets)
kinematic_chain = t2m_kinematic_chain

Then quat_params gives you the local rotations of each bone with respect to the pre-defined offsets.
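As a quick sanity check on those pre-defined offsets, equivalent bone vectors can be recovered from any single reference pose as child-minus-parent differences. This is a rough stand-in for what get_offsets_joints computes, not the repo's implementation; the parents list is a hypothetical parent-index array:

```python
import numpy as np

def offsets_from_pose(joints, parents):
    # Bone offsets as child-minus-parent vectors of one reference pose.
    # joints: (num_joints, 3) global positions; parents[i] == -1 marks the root.
    off = np.zeros_like(joints)
    for i, p in enumerate(parents):
        if p >= 0:
            off[i] = joints[i] - joints[p]
    return off
```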

@zhuangzhuang000

Hello! Has this problem been solved? Could you share the method with me?

@foamliu

foamliu commented Oct 29, 2022

Converting the SMPL pose to BVH was easy, and I got that done yesterday.

@zhuangzhuang000

> Converting the SMPL pose to BVH was easy, and I got that done yesterday.

Hello! Could you share the method with me? Thank you very much!

@EricGuo5513
Owner

> Converting the SMPL pose to BVH was easy, and I got that done yesterday.

Hi, if possible, could you please share the code or a GitHub repo? I am very interested. Thanks.

@EricGuo5513
Owner

For those interested in converting the positions into BVH format, this script can help a lot: https://github.com/DeepMotionEditing/deep-motion-editing/blob/master/utils/InverseKinematics.py What you need to do is: 1) find an SMPL template BVH file that contains the offset information; 2) use their BVH loader to load the file into an Animation object; 3) pass the Animation object and the positions into this inverse kinematics solver to get the rotations; 4) write the animation out to a BVH file. I have succeeded with other data in a similar way. You may contact me if you have questions.
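Those four steps can be sketched roughly as below. This is written from memory of the deep-motion-editing utilities and is not verified code: the module paths and the exact BVH.load / BVH.save / BasicInverseKinematics signatures should be checked against utils/BVH.py and utils/InverseKinematics.py in that repository.

```python
def positions_to_bvh(template_bvh, joint_positions, out_bvh):
    # Sketch only: imports and signatures follow deep-motion-editing's
    # utils from memory and may differ from the actual repository.
    import BVH                                    # utils/BVH.py
    from InverseKinematics import BasicInverseKinematics

    # 1) + 2) load an SMPL-like template bvh that carries the offsets;
    # the Animation should have one frame per frame of joint_positions
    anim, names, frametime = BVH.load(template_bvh)

    # 3) solve per-frame rotations that reproduce the joint positions
    ik = BasicInverseKinematics(anim, joint_positions, silent=True)
    anim = ik()

    # 4) write the solved animation back out
    BVH.save(out_bvh, anim, names, frametime)
```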

@EricGuo5513
Owner

The BasicInverseKinematics class should already suffice. For the initial rotations at the start of optimization, you can just use the identity transformation.
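The final write-out step can also be done by hand, since BVH is plain text. Below is a minimal sketch of the file layout for a toy two-joint chain; the joint names, offsets, and frame time are illustrative only, and a real SMPL template has 22+ joints with its own offsets:

```python
def make_bvh(root_offset, child_offset, frames, frame_time=0.05):
    # frames: per-frame channel values, 9 floats each
    # (root Xpos Ypos Zpos Zrot Xrot Yrot, child Zrot Xrot Yrot)
    lines = [
        "HIERARCHY",
        "ROOT Hips",
        "{",
        "  OFFSET %f %f %f" % tuple(root_offset),
        "  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation",
        "  JOINT Spine",
        "  {",
        "    OFFSET %f %f %f" % tuple(child_offset),
        "    CHANNELS 3 Zrotation Xrotation Yrotation",
        "    End Site",
        "    {",
        "      OFFSET 0.000000 0.100000 0.000000",
        "    }",
        "  }",
        "}",
        "MOTION",
        "Frames: %d" % len(frames),
        "Frame Time: %f" % frame_time,
    ]
    for f in frames:
        lines.append(" ".join("%f" % v for v in f))
    return "\n".join(lines)
```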

@SPIKE40

SPIKE40 commented Mar 20, 2023

Hi, I am looking for a solution for converting the resulting .npy files to BVH files.

Could anyone give me detailed guidance or sample code?

I have tried many things but failed so far.

@EricGuo5513
Owner

EricGuo5513 commented Apr 17, 2023

Hi, I have received a lot of comments that our current rotation representation seems incompatible with 3D software such as Blender, and I think I understand the reason. In the IK/FK in skeleton.py, for the i-th bone we calculate the rotation of the bone itself, while in BVH we should instead store the rotation of its parent. Therefore, at line 91 you could try using the parent bone instead of the bone itself. I am not sure if it works. Here I attach our FK code and the BVH FK code; you can see the difference in how the global positions are obtained:
Our FK:

for i in range(1, len(chain)):
    R = qmul(R, quat_params[:, chain[i]])  # bone i's own rotation is folded in first
    offset_vec = offsets[:, chain[i]]
    joints[:, chain[i]] = qrot(R, offset_vec) + joints[:, chain[i-1]]

BVH FK:

for i in range(1, len(self.parents)):
    global_quats[:, i] = qmul(global_quats[:, self.parents[i]], local_quats[:, i])
    # the parent's global rotation, not bone i's, moves bone i's offset
    global_pos[:, i] = qrot(global_quats[:, self.parents[i]], offsets[:, i]) + global_pos[:, self.parents[i]]
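The two conventions can be reconciled by shifting each rotation onto the parent joint, which is what the suggestion above amounts to. A self-contained NumPy check on a toy four-joint chain (qmul/qrot are re-implemented here, so this does not use the repo's code):

```python
import numpy as np

def qmul(a, b):
    # Hamilton product of (w, x, y, z) quaternions
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([aw*bw - ax*bx - ay*by - az*bz,
                     aw*bx + ax*bw + ay*bz - az*by,
                     aw*by - ax*bz + ay*bw + az*bx,
                     aw*bz + ax*by - ay*bx + az*bw])

def qrot(q, v):
    # rotate vector v by unit quaternion q
    u, w = q[1:], q[0]
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

offsets = np.array([[0.0, 0, 0], [0, 1, 0], [0, 1, 0], [0, 1, 0]])
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 4))
q /= np.linalg.norm(q, axis=1, keepdims=True)   # random unit quaternions

# "Our" FK: joint i's own rotation is folded in before moving its offset.
pos_ours = np.zeros((4, 3))
R = q[0]
for i in range(1, 4):
    R = qmul(R, q[i])
    pos_ours[i] = qrot(R, offsets[i]) + pos_ours[i - 1]

# BVH FK: the parent's *global* rotation moves joint i's offset.
# Shift every rotation one joint up (the root absorbs q[1]) and the
# two conventions produce identical joint positions.
local = [qmul(q[0], q[1]), q[2], q[3], np.array([1.0, 0, 0, 0])]
pos_bvh = np.zeros((4, 3))
G = local[0]
for i in range(1, 4):
    pos_bvh[i] = qrot(G, offsets[i]) + pos_bvh[i - 1]
    G = qmul(G, local[i])
```

Here pos_ours and pos_bvh agree, which supports storing each bone's rotation on its parent when exporting to BVH.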

Hope this helps. I do not have time to validate this idea, but if anyone figures it out in this or any other way, I would greatly appreciate it if you could let me know. If it does not work: I know the recent work ReMoDiffuse managed to use the rotation representation in their demo, so you may refer to them.

BTW: I have updated the quaternion/Euler/cont6d conversion functions in quaternion.py, which should now be safe to use.

@EricGuo5513 EricGuo5513 added the help wanted Extra attention is needed label Apr 17, 2023
@rx-fly

rx-fly commented Apr 18, 2023

Is there any way to output the resulting animation as an FBX file? Or do you know of a way the .npy can be used in 3D software (e.g., Blender)?

@aniongithub
Contributor

aniongithub commented Sep 18, 2023

To get further on this, I tried to modify recover_from_rot to return the local quaternion rotations directly instead of computing positions via FK, using the following code.

def recover_from_rot(data, joints_num, skeleton):
    r_rot_quat, r_pos = recover_root_rot_pos(data)

    r_rot_cont6d = quaternion_to_cont6d(r_rot_quat)

    start_indx = 1 + 2 + 1 + (joints_num - 1) * 3
    end_indx = start_indx + (joints_num - 1) * 6
    cont6d_params = data[..., start_indx:end_indx]
    #     print(r_rot_cont6d.shape, cont6d_params.shape, r_pos.shape)
    cont6d_params = torch.cat([r_rot_cont6d, cont6d_params], dim=-1)
    cont6d_params = cont6d_params.view(-1, joints_num, 6)

    rotations = cont6d_to_quat(cont6d_params)
    # Don't calculate positions, return the rotations directly instead
    # positions = skeleton.forward_kinematics_cont6d(cont6d_params, r_pos)

    return r_pos.numpy(), r_rot_quat.numpy(), rotations.numpy()

I don't want to use FK/IK; I want to fetch the internal cont6d rotations and convert them directly to per-bone quaternions relative to the parent rotation. However, this produces weird rotations when I treat each entry as a per-frame, per-bone local rotation (with shape (frame_count, bone_count, 4)).
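For background, the cont6d representation stores the first two columns of each bone's rotation matrix, and recovering the full matrix is a Gram-Schmidt step; cont6d_to_quat should be doing the equivalent internally. A minimal self-contained sketch:

```python
import numpy as np

def cont6d_to_matrix(d6):
    # Gram-Schmidt the 6D rotation representation back into a 3x3
    # rotation matrix. d6 holds the first two matrix columns.
    a1 = np.asarray(d6[:3], dtype=float)
    a2 = np.asarray(d6[3:], dtype=float)
    b1 = a1 / np.linalg.norm(a1)
    a2 = a2 - np.dot(b1, a2) * b1       # remove the component along b1
    b2 = a2 / np.linalg.norm(a2)
    b3 = np.cross(b1, b2)               # third column completes the frame
    return np.stack([b1, b2, b3], axis=1)
```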

[attached image]

Could you help me with this?

7 participants