Retargeting in Maya

Hello !
I have two different rigs and I would like to retarget the animation from one to the other.
The two rigs are for the same character, but they are at least 5 years apart, and I would like to only use the newest one.
Problem is, I am not an animator and I don’t really know where to begin when it comes to retargeting.

I tried HumanIK, but from my tests it doesn’t work because the scale of some bones is animated.
I tried to use the character set and the character mapper, without success.
I tried to parent-constrain the bones from the source to the target, but the orientations of the joints differ between the two rigs (in one, the “up” axis is Y; in the other it is X).

What would be the best way for me to retarget the animation?


Hi @yann.b, welcome to the community!


  • Are the proportions of the rigs identical (e.g. arm lengths, leg lengths), and do they share the same pose?
  • Are there non-1:1 controls you want to retarget, like a footRoll attribute or an automated feature such as opening wings? Are these attributes the same on each rig?
  • Do the rigs have IK/FK and space switching you’re looking to retarget too?

Essentially, the one rule with retargeting I’ve found is to have a baseline somewhere - i.e. matched proportions, a pose, similar features, etc. Values can differ between rigs; the key is building correspondence between them, even if that means building a translation rig - i.e. an intermediary that translates values from one rig, through it, to the other rig.

Hello @chalk , thank you :slight_smile:

Both rigs are for the same character (one is from 10 years ago, the other from 2 years ago), so same arm and leg lengths.
The positions of the shoulders are a bit different. Also, they have really short legs.
Here is a screenshot of the rigs

So this looks like transferring transforms - you should be able to do that using parent constraints between the target > source joints with maintainOffset. The general process: align your target or source to the other, create constraints from the target to the source, and then apply motion to the source. This way the target rides the source skeleton - you then bake the constraints down to keys.

So I would align the target to the source, or vice versa. Then I’d apply constraints on the target joints to glue them to the source, with maintainOffset on. Finally I’d apply motion to the source and bake the resulting constrained target down to keyframes.

These are the basics - it gets a lot more complicated when transferring systems, like IK, between each other.

I tried using parent constraints, but it doesn’t work because the orientations aren’t the same between the two skeletons.

Because of this, the scale does not work: the squash in my animation becomes a stretch.

Another idea, then: remove the scales from your incoming animation.

Duplicate your whole source skeleton to make a secondary skeleton, then parent-constrain it back to the original without any offset.
This secondary skeleton wouldn’t have any scales, but it would follow all the motion of the source.
Then just retarget to the secondary skeleton instead.
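To see what that scale-free duplicate buys you mathematically, here is a minimal, Maya-free sketch (the `strip_scale` helper and the example matrix are made up for illustration, and shear is ignored): it removes scale from a 4x4 transform by normalizing the axis rows, keeping rotation and translation intact.

```python
# Strip scale from a 4x4 transform by normalizing its basis rows.
# Row-vector convention, as Maya uses: rows 0-2 are the X/Y/Z axes,
# row 3 holds the translation.
import math

def strip_scale(m):
    """Return a copy of a 4x4 matrix (nested lists) with unit-length axes."""
    out = [row[:] for row in m]
    for i in range(3):
        length = math.sqrt(sum(v * v for v in m[i][:3]))
        if length > 1e-9:
            out[i][:3] = [v / length for v in m[i][:3]]
    return out

# A transform scaled 2x on every axis, translated to (5, 0, 0):
scaled = [
    [2.0, 0.0, 0.0, 0.0],
    [0.0, 2.0, 0.0, 0.0],
    [0.0, 0.0, 2.0, 0.0],
    [5.0, 0.0, 0.0, 1.0],
]
clean = strip_scale(scaled)
print(clean)  # axes are unit length again, translation untouched
```

The "secondary skeleton" trick does effectively the same thing, except Maya's constraint system does the normalizing for you, live.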

Soooo, I think I finally succeeded in retargeting my animation onto the new skeleton!

I created basic controllers for both skeletons. I parent-constrained the skeleton to the controllers of the first skeleton, then parent-constrained those controllers to the controllers of the second skeleton, then parent-constrained those to the skeleton…
I used two sets of controllers because I thought it would be better to retarget when the controllers’ transforms are at 0, 0, 0. But I don’t know if it’s really necessary. :sweat_smile:
As for the scale, I used the Connection Editor to connect the right axes.

It’s weird that Maya doesn’t offer something to retarget easily…

Thanks for your help anyway ! :slight_smile:

Retargeting is a complex problem to solve, especially as rigs vary so much. The only way Maya could offer a consistent retargeting solution out of the box would be to impose a standard rigging solution.
Technically HumanIK offers this, but depending on your version of Maya, HumanIK is flaky until 2018.

In general though, depending on your coding ability, you can get decent retargeting using matrix maths:

1. Get the reference pose of the bones in the source rig.
2. Get the reference pose of the bones in the target rig.
3. Get the offset between the two rigs’ reference poses.
4. Get the difference of each animated frame of the source rig’s bones from their reference pose, either on each frame or each keyframe.
5. Add this to the rig offset from step 3, then apply it to the target rig in its reference space.
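The steps above can be sketched outside Maya with numpy 4x4 matrices (row-vector convention, as Maya uses; all names and values here are made-up illustrations, not Maya API):

```python
# A rough, Maya-free sketch of the matrix retargeting idea.
import numpy as np

def translation(x, y, z):
    """Build a 4x4 translation matrix (row-vector convention)."""
    m = np.identity(4)
    m[3, :3] = (x, y, z)
    return m

ref_pose_source = translation(6, 0, 0)   # step 1: source bone at rest
ref_pose_target = translation(6, 1, 0)   # step 2: target bone at rest

# step 4: where the source bone sits on some animated frame
frame_source = translation(6, 0, 3)

# difference of the animated frame from the source reference pose
diff = frame_source @ np.linalg.inv(ref_pose_source)

# step 5: apply that difference in the target's reference space
retargeted = diff @ ref_pose_target

print(retargeted[3, :3])  # target keeps its own rest offset plus the motion
```

With pure translations this is easy to eyeball: the source moved (0, 0, 3) from rest, so the target ends up at its own rest position plus (0, 0, 3). Rotations and scales ride along in the same matrices.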

It’s been a while since I did this, but it worked pretty well for rigs with completely different proportions.

Unfortunately I have Maya 2017.
And from what I understand, HumanIK does not retarget scales. That is a big problem for me: the animations are cartoony and need squash and stretch.

I can code a little bit. I will write a script to batch my animations, but I got scared by the word “matrix”. :sweat_smile:
I think I will look into it regardless of my fear of “matrix”, because I may need to understand matrices in the future.
Plus, you gave me the steps to retarget using them, so I will give it a try!

Some basic pointers around matrix maths.
To “subtract” matrixA from matrixB, you need to use an inverse.
What this is in Maya I am not sure - it might be a standalone function, or it could be a method on the matrix itself.

I believe Max and Maya both work the same, so:

subtractMatrix = matrixA * matrixB.inverse()

And then to add matrixA to matrixB:
addMatrix = matrixA * matrixB
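For anyone following along, those two lines can be sanity-checked outside Maya with numpy (`@` is matrix multiplication; the variable names just mirror the snippet above):

```python
# "Subtracting" matrixB via its inverse, then "adding" it back,
# should recover matrixA exactly.
import numpy as np

matrixA = np.identity(4)
matrixA[3, :3] = (1, 2, 3)   # some transform A

matrixB = np.identity(4)
matrixB[3, :3] = (5, 0, 0)   # some transform B

subtractMatrix = matrixA @ np.linalg.inv(matrixB)   # "A minus B"
addMatrix = subtractMatrix @ matrixB                # "add B back"

print(np.allclose(addMatrix, matrixA))  # True: we recovered matrixA
```

Order matters here: matrix multiplication is not commutative, so `matrixA * matrixB.inverse()` and `matrixB.inverse() * matrixA` are generally different offsets.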

I tried using matrices and… it doesn’t work :sweat_smile:

This is what I have:

  • An array of each source bone and its corresponding target bone
  • The ref pose of the source bones and the target bones
  • The offset between the two ref poses ( ref_pose_source * ref_pose_target.inverse() )

Each frame I:

  • Get the matrix of all the source bones
  • Get the difference from the ref pose of the source bones ( matrix_source * ref_pose_source.inverse() )
  • Apply the difference to the corresponding target bone ( diff_matrix * ref_pose_target )

Is there something wrong with what I’m doing?

Two things I can suggest:

  • Put both your rigs into a matching reference pose. The easiest one is a T-pose, where the limbs of each rig are aligned to world space at 90 degrees. This gives you a neutral frame of reference to compare the two rigs without extra offsets.
  • Use a simple example to test on, rather than the whole rig. When I did this years ago I used the clavicles of my two rigs, as they had the biggest offset in the matching pose.

I can’t confirm the algorithm I suggested at the moment - I wrote that code at my previous company years ago, so I no longer have access to it, and I don’t have a Maya or Max license atm.
I will try to set up a Blender scene when I get a chance.

That’s what I did.
I created 2 bones and tried to retarget these 2 bones onto another set of bones.


They have the same rotation and scale, but the target bones have a minor translation to the left.

From what I saw, it seems like the matrices are not set properly.
My source bones go from (6, 0, 0) to (6, 0, 3) in translation,
and after retargeting the translation is (3, 0, 6).

I continued using matrices to do the retargeting.
And I was wondering:
should I get the matrices in world space or object space?
I can get either with the xform command.

You want to get the offset of nodeA from nodeB. Traditionally these are childNode and parentNode.
This is referred to as parent space in most environments ( some use the term local space, but that can get confusing, as local space is its own thing ).

To get parent space, you take the transforms ( which are always matrices ) in world space and multiply them with each other in a specific order, depending on whether they are column-major or row-major ( an extra complication there, but good to know the difference exists ).

So as I mentioned in my previous post, to get the offset of nodeA from nodeB you would multiply the transform of nodeA by the inverse of nodeB.
This gets you the transform of nodeA in parent space.

To get parent space transform back into world space, you then multiply said parent space transform by the world space transform of the parent node.

parent_space_matrix = child_node.transform * parent_node.transform.inverse()

world_space_matrix = parent_space_matrix * parent_node.transform

# Thus, world_space_matrix should be equal to child_node.transform, which is in world space:
world_space_matrix == child_node.transform

Those are the basic building blocks. You can swap the parent transform for any node to get their offset; using this you can retarget between any objects.
I am trying to work out again the algorithm I used all those years ago, but I don’t have a lot of spare time atm.
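The round trip above can be verified with plain numpy matrices (a sketch assuming Maya’s row-vector convention, where child_world = child_local * parent_world; `rot_z` is a made-up helper):

```python
# Parent space and back: child_world -> parent_space -> child_world.
import numpy as np

def rot_z(deg):
    """4x4 rotation about Z, row-vector convention."""
    r = np.radians(deg)
    m = np.identity(4)
    m[0, 0] = np.cos(r);  m[0, 1] = np.sin(r)
    m[1, 0] = -np.sin(r); m[1, 1] = np.cos(r)
    return m

parent_world = rot_z(90)
parent_world[3, :3] = (10, 0, 0)

child_world = rot_z(45)
child_world[3, :3] = (10, 2, 0)

# child in parent space: child_world * parent_world^-1
parent_space = child_world @ np.linalg.inv(parent_world)

# and back to world space: parent_space * parent_world
round_trip = parent_space @ parent_world

print(np.allclose(round_trip, child_world))  # True
```

If you get your `(6, 0, 3)` turning into `(3, 0, 6)` symptom with a setup like this, it usually means the multiplication order (or row vs column major) is flipped somewhere.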

I think the bit that I had incorrect in my initial suggestion was:

Get ref pose of bones in target rig

This needs to be the reference pose of the bones in the target rig expressed relative to the parent of the corresponding bone in the source rig, with the translation set to zero.

That way you have the correct offset to apply to the target bone based on how the source has moved around its parent.

I am not great at explaining, but I will carry on writing up the example I am working on when I can and pass you the results ( unless you figure it out before I get there :slight_smile: )

Also, hierarchy order counts if you apply transforms directly over time - e.g. if you apply the transform to the spine before the hips, the offset will be broken. Doing the retargeting through constraints/rigging mitigates this.

Oh sure, good point - order of operation matters for most scenarios.

The issue doesn’t occur if you retarget on each frame, nor if you retarget between pure FK hierarchies.
General good practice is to always retarget in hierarchy order, removing any potential for order-of-operation borking.
Thankfully, collecting nodes in hierarchy order and then iterating over them in that order isn’t a hard task.
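Collecting nodes in hierarchy order can be as simple as sorting by depth, so parents are always processed before their children. A minimal sketch (the joint names and parent map are made up for illustration):

```python
# Sort joints so every parent comes before its children.
parent_of = {
    "hips": None,
    "spine": "hips",
    "chest": "spine",
    "left_arm": "chest",
    "left_hand": "left_arm",
}

def depth(node):
    """Number of ancestors above this node (root = 0)."""
    d = 0
    while parent_of[node] is not None:
        node = parent_of[node]
        d += 1
    return d

ordered = sorted(parent_of, key=depth)
print(ordered)  # root first, leaves last
```

In Maya you could build the same parent map from `cmds.listRelatives(joint, parent=True)`, then apply transforms in this order.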

Thank you.
I did not do it in parent space, so that’s why I couldn’t make it work.

It still doesn’t work, but I know what was wrong with my code. :sweat_smile:
For the moment it’s fun - it’s like trying to solve a sudoku. :smiley:
I’ll keep you posted if I succeed with the retargeting!

So I have yet to find much time to look into this, but this GitHub project looks pretty good: GitHub - nbreeze/py-simple-anim: A basic, general-purpose module that provides classes and functions related to 3D animation and transforms. Supports animation retargeting.

It needs numpy to work, but you can at least look at the source code to see how it was done :slight_smile:

Woooh, that’s a really long script. Much more complicated than mine :sweat_smile:
I’ll take a look. Thank you :slight_smile: