
Investigate AD (automatic differentiation) support #4

Open
diegoferigo opened this issue May 18, 2022 · 4 comments

Comments


diegoferigo commented May 18, 2022

JAXsim currently focuses only on sampling performance, exploiting jax.jit and jax.vmap. Being written in JAX, the forward step of the simulation should be differentiable (including contact dynamics, since the contact model is continuous), but this has not yet been investigated.
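
As a rough illustration of what this would look like, here is a minimal sketch showing how `jax.jit`, `jax.vmap`, and `jax.grad` compose over a rollout. The `step`, `rollout`, and `stiffness` names below are hypothetical toy stand-ins, not the JAXsim API.

```python
import jax
import jax.numpy as jnp

# Hypothetical single simulation step (a toy point mass bouncing on a
# compliant ground); stands in for the real JAXsim forward step.
def step(state, params, dt=1e-3):
    pos, vel = state
    contact_force = params["stiffness"] * jnp.maximum(0.0, -pos)
    vel = vel + dt * (-9.81 + contact_force)
    return (pos + dt * vel, vel)

def rollout(params, state0, n_steps=1000):
    def body(state, _):
        state = step(state, params)
        return state, state[0]
    _, positions = jax.lax.scan(body, state0, None, length=n_steps)
    return positions

# Sampling performance: jit + vmap over a batch of initial states.
batched_rollout = jax.jit(jax.vmap(rollout, in_axes=(None, 0)))

# Differentiability: gradient of a scalar loss w.r.t. the parameters,
# flowing through the whole rollout (contact included).
loss = lambda params, state0: jnp.sum(rollout(params, state0) ** 2)
grads = jax.grad(loss)({"stiffness": 1e3}, (jnp.array(1.0), jnp.array(0.0)))
```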

Interesting resources:


diegoferigo commented Mar 26, 2024

I obtained preliminary results on differentiating w.r.t. the hardware parameters introduced in #101 (and refined in #120).

I created a simple toy problem of a box falling for 1 second over a compliant flat terrain. I designed a quick optimization pipeline that simulates two such boxes: a nominal one with default parameters and a training one with wrong parameters. Once I collect trajectories from both, I compute the RMS error between the two trajectories, take its gradient w.r.t. a set of hardware parameters, and use it to update the parameters of the training model (a minimal sketch of this loop follows the list below).

  • I can differentiate w.r.t. the mass of the base (and only) link. The gradient-descent optimization converges to the correct mass.
  • I can differentiate w.r.t. the parameters of the 3x3 inertia tensor of the base (and only) link. The optimization is currently constrained only to produce symmetric matrices, without any further restriction that would yield a physical inertia tensor. JAX is able to compute gradients, even though the resulting inertia tensor is not physical.
  • I can differentiate w.r.t. the successor-to-child ${}^{\text{suc}[i]} \mathbf{H}_i$ fixed transform of the base link, which marks the location of the base-link frame w.r.t. the root of the model (in most cases it is an identity matrix). Also in this case JAX provides gradients, but it does not exploit the $\text{SE}(3)$ structure of the matrix, so the updated transform leaves the manifold.
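
The promised sketch of the optimization loop described above, with a hypothetical `simulate` function (a 1-DoF box on a compliant terrain) standing in for the actual JAXsim rollout:

```python
import jax
import jax.numpy as jnp

# Hypothetical differentiable simulator: height trajectory of a box dropped
# for 1 s over a compliant flat terrain, as a function of its mass.
# Stands in for the real JAXsim rollout.
def simulate(mass, z0=1.0, dt=1e-3, n_steps=1000, stiffness=5e3, damping=50.0):
    def step(state, _):
        z, dz = state
        penetration = jnp.maximum(0.0, -z)
        # spring-damper contact force, continuous at the contact boundary
        f_contact = penetration * (stiffness - damping * dz)
        dz = dz + dt * (-9.81 + f_contact / mass)
        return (z + dt * dz, dz), z
    _, trajectory = jax.lax.scan(step, (z0, 0.0), None, length=n_steps)
    return trajectory

# Nominal model with the true mass, training model with a wrong initial guess.
traj_nominal = simulate(mass=1.0)

def rms_loss(mass):
    return jnp.sqrt(jnp.mean((simulate(mass) - traj_nominal) ** 2))

# Plain gradient descent on the mass of the training model.
loss_and_grad = jax.jit(jax.value_and_grad(rms_loss))
mass, lr = jnp.array(2.0), 0.5
for _ in range(200):
    loss_value, grad = loss_and_grad(mass)
    mass = mass - lr * grad
```

The learning rate and the number of iterations are arbitrary for this toy problem; the point is only that `jax.value_and_grad` flows through the whole rollout, contact included.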

As a first attempt, this seems promising. Of course, when dealing with quantities that do not belong to $\mathbb{R}^n$, the optimization should be adapted accordingly. For $\text{SE}(3)$, I believe we can exploit jaxlie. For the inertia tensor, we probably need to find a proper parameterization of symmetric positive-definite matrices that also satisfies the triangle inequality on the principal moments. I would start by looking at a principal-axes decomposition together with the enforcement of positive principal moments of inertia (maybe storing them as logarithms).
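
A minimal sketch of these parameterizations, assuming jaxlie's `SO3`/`SE3` exponential maps (the function names below are illustrative, not an agreed-upon API):

```python
import jax.numpy as jnp
import jaxlie  # SE(3)/SO(3) Lie group operations in JAX

# Unconstrained parameterization of a 3x3 inertia tensor: principal moments
# stored as logarithms (ensures positivity) and a principal-axes rotation
# parameterized in the so(3) tangent space.
def inertia_from_params(log_moments, so3_tangent):
    moments = jnp.exp(log_moments)                # positive principal moments
    R = jaxlie.SO3.exp(so3_tangent).as_matrix()   # principal axes
    return R @ jnp.diag(moments) @ R.T            # symmetric positive-definite

# Note: positivity alone does not enforce the triangle inequality on the
# principal moments (e.g. I1 + I2 >= I3); that would need an additional
# reparameterization or projection.

# Unconstrained parameterization of the fixed SE(3) transform: optimize a
# 6D tangent vector and map it back onto the manifold with the exponential,
# so the resulting matrix never leaves SE(3).
def transform_from_params(se3_tangent):
    return jaxlie.SE3.exp(se3_tangent).as_matrix()
```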

cc @flferretti @CarlottaSartore @traversaro @S-Dafarra @DanielePucci

@DanielePucci

Super, important for many in @ami-iit/alpha-delta-tau


diegoferigo commented Jun 10, 2024

A good example comparing a derivative computed analytically with the same quantity obtained by AD is:
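
For illustration, a generic sketch of this kind of check, using a toy elastic energy (not the example referenced above):

```python
import jax
import jax.numpy as jnp

# Toy check: the gradient of the elastic energy E(q) = 0.5 * q^T K q
# is known analytically to be K q (for symmetric K).
K = jnp.array([[4.0, 1.0], [1.0, 3.0]])
energy = lambda q: 0.5 * q @ K @ q

q = jnp.array([0.3, -1.2])
grad_ad = jax.grad(energy)(q)   # automatic differentiation
grad_analytic = K @ q           # closed-form derivative

assert jnp.allclose(grad_ad, grad_analytic, atol=1e-6)
```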


diegoferigo commented Jun 10, 2024

Another example of using AD is the validation of $\dot{M}(\mathbf{q}) = C(\mathbf{q}, \boldsymbol{\nu}) + C^\top(\mathbf{q}, \boldsymbol{\nu})$ performed in:
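
A self-contained sketch of such a validation, using a toy mass matrix and the Christoffel-symbol form of $C$ (not the JAXsim implementation):

```python
import jax
import jax.numpy as jnp

# Toy configuration-dependent mass matrix (stands in for the model's M(q)).
def mass_matrix(q):
    return jnp.array([[2.0 + jnp.cos(q[1]), 0.5],
                      [0.5, 1.0 + 0.1 * q[0] ** 2]])

# Coriolis matrix built from Christoffel symbols of the first kind, which by
# construction satisfies dM/dt = C + C^T.
def coriolis_matrix(q, qdot):
    dM = jax.jacfwd(mass_matrix)(q)  # dM[i, j, k] = dM_ij / dq_k
    gamma = 0.5 * (dM + jnp.swapaxes(dM, 1, 2) - jnp.transpose(dM, (2, 0, 1)))
    return jnp.einsum("ijk,k->ij", gamma, qdot)

q = jnp.array([0.3, -0.7])
qdot = jnp.array([1.1, 0.4])

# dM/dt obtained with AD via the chain rule: dM/dt = (dM/dq) qdot.
M_dot = jnp.einsum("ijk,k->ij", jax.jacfwd(mass_matrix)(q), qdot)
C = coriolis_matrix(q, qdot)

assert jnp.allclose(M_dot, C + C.T, atol=1e-6)
```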
