Fix motion path lattice twist in Maya


The curve you generate will have a length. So if you want to place all stitches at once, you can generate a straight strip of stitches that is the same length as your curve, then create a lattice around all of those stitches. You can choose the number of divisions on your lattice based on the length of the curve and the number of stitches.

For example, if you have 5 lattice rows, you can split up the points and normals you query on the curve into increments of 0.25: query curve parameters 0, 0.25, 0.5, 0.75 and 1. Once you have created clusters for each of the rows on your lattice, you can translate and rotate those clusters based on the position and normal at the corresponding curve parameter.
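To make the spacing concrete, here is a plain-Python sketch of that parameter split (no Maya required; `rows` is just the number of lattice rows):

```python
# Evenly spaced curve parameters, one per lattice row.
# With 5 rows this yields 0.0, 0.25, 0.5, 0.75 and 1.0.
def row_parameters(rows):
    if rows < 2:
        return [0.0]
    return [i / float(rows - 1) for i in range(rows)]

print(row_parameters(5))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```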


Sounds like a plan, thank you, Joosten!


I hope that will do the trick. Another, more advanced solution would be to use a radial basis function to place the stitches on a vertex level. If you go that route, all you have to do is generate a curve; the rest is just code that sets the vertex positions directly.
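To give a rough idea of the concept (this is a simplified, Shepard-style kernel weighting rather than a full RBF solve, and the names here are made up for illustration): each known sample influences a query point with a Gaussian falloff.

```python
import math

# Toy kernel-weighted interpolation in one dimension: estimate a value
# at `x` from known samples `xs`/`ys`, weighting each sample by a
# Gaussian falloff. A true RBF setup would additionally solve a linear
# system for the weights.
def rbf_interpolate(x, xs, ys, sigma=1.0):
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * sigma ** 2)) for xi in xs]
    total = sum(weights)
    return sum(w * y for w, y in zip(weights, ys)) / total
```

The same falloff idea extends to 3D: each curve sample pushes nearby mesh vertices, with influence fading by distance.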


Never heard about it… Any links?


I have a 10-unit curve, but the position at the parameter is confusing. I thought it should be 0.0, 0.1, 0.2, 0.3, etc.

I suppose this happens because of uneven point spacing, and rebuilding would fix it, but rebuilding the curve also changes its shape dramatically, so it's not an option.


In that case you can use the API. Here is a snippet that will give you evenly spaced points on a curve:

from maya import OpenMaya

node = "curve1"
num = 10

# Get a dag path to the curve.
selectionList = OpenMaya.MSelectionList()
selectionList.add(node)
dag = OpenMaya.MDagPath()
selectionList.getDagPath(0, dag)

c = OpenMaya.MFnNurbsCurve(dag)
p = OpenMaya.MPoint()

points = []
length = c.length()
increment = 1.0 / (num - 1)

# Sample the curve at equal arc-length intervals.
for i in range(num):
    parameter = c.findParamFromLength(length * increment * i)
    c.getPointAtParam(parameter, p)
    points.append((p.x, p.y, p.z))

print(points)

Why, in increment, are you subtracting 1 from the number of points (increment = 1.0 / (num - 1))?

If I use increment = 1.0 / num I get the expected results (using round here to get rid of numbers like -1.0000001909139464):
[(0.0, 0.0, 0.0), (0.0, 0.0, -1.0), (0.0, 0.0, -2.0), (0.0, 0.0, -3.0), (0.0, 0.0, -4.0), (0.0, 0.0, -5.0), (0.0, 0.0, -6.0), (0.0, 0.0, -7.0), (0.0, 0.0, -8.0), (0.0, 0.0, -9.0)]


Don’t you want a point at the end of the curve as well?
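The difference is a fencepost detail: dividing by num leaves the last sample short of the end of the curve, while dividing by num - 1 puts the first sample at the start and the last exactly at the end. A quick plain-Python check:

```python
num = 10

# Dividing by num: 10 samples, but the last one stops short of the end.
stops_short = [i / float(num) for i in range(num)]

# Dividing by num - 1: first sample at 0.0, last exactly at 1.0.
full_span = [i / float(num - 1) for i in range(num)]

print(stops_short[-1])  # 0.9
print(full_span[-1])    # 1.0
```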


If I have a lattice (S=10, T=2, U=2), how can I get the 4 lattice points for each of the S divisions to create 10 clusters?


lattice = pm.PyNode('ffd2Lattice')
numberOfRowsS = lattice.sDivisions.get()

for i in range(numberOfRowsS):

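For what it's worth, the index bookkeeping can be sketched in plain Python (the cluster creation itself is left out; on the Maya side each (s, t, u) triple would correspond to lattice.pt[s][t][u]):

```python
# Group lattice point index triples by their S row. For divisions
# (S=10, T=2, U=2) each row owns T * U = 4 points.
def row_point_indices(s_divisions, t_divisions, u_divisions):
    rows = []
    for s in range(s_divisions):
        row = [(s, t, u)
               for t in range(t_divisions)
               for u in range(u_divisions)]
        rows.append(row)
    return rows

rows = row_point_indices(10, 2, 2)
print(len(rows))     # 10 rows, one cluster each
print(len(rows[0]))  # 4 points per row
```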
Here is what I have for now, from left to right: the source curve, a test arrow object cloned along the curve, and the lattice deformed with clusters (which is obviously wrong)

The coordinates of each cluster-arrow pair are the same, but they are located differently in space

Any thoughts on how to fix it?


Ok, the issue was that since each cluster pivot was away from the origin, I got some sort of double transformation. So before creating the clusters I scale the lattice to zero on the X axis, then create the clusters and move them to the proper position on the curve.


Amazing to see the progress on this 🙂


Although the overall result may look good enough:

the details are unacceptable…

The lattice is irregular:

And also I can't raise the lattice resolution as much as I may need; after 400 divisions the lattice starts to break my geometry. Here are 400 and 401 divisions:

If the irregularity comes from a bad algorithm for taking normals from surface vertices (and so can potentially be fixed), I have no idea how to solve the lattice resolution limit. So I am looking for new ideas for the same initial task: place stitches on a surface.


You could blend the normals between the different faces. Many of your points will have the same normal direction because they lie on the same face, causing the irregularity. If you lerp between the normals of the points that span two faces, you get a smoother result.

As for the lattice division issue: you could split it up into multiple sections, never exceeding the 400 limit, and afterwards combine the geometry together.
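The splitting can be sketched in plain Python: carve the total division count into section sizes that never exceed the observed 400 cap, then build one lattice per section and combine the results afterwards.

```python
# Split `total` divisions into sections of at most `cap` divisions each.
def split_divisions(total, cap=400):
    sections = []
    remaining = total
    while remaining > 0:
        size = min(cap, remaining)
        sections.append(size)
        remaining -= size
    return sections

print(split_divisions(1000))  # [400, 400, 200]
```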


It's not very clear to me how exactly I can blend the normals.

Currently, I have a list of point positions (generated from the curve), so for each point I search for the closest vertex on the mesh and query its normal.


If you have 5 points with a matching normal, and after that another 5 with a matching normal (because they are on the same face), you can lerp the normals based on the middle points of each face run.

So if we take the 10-point example (the blend starts and ends at the middle point of a face; before and after those middle points you would use the unblended normal of the previous or next face):

In this example we have 2 normals (face_1_normal and face_2_normal); points 3-5 are associated with face_1 and points 6-8 are associated with face_2.
The normals of the in-between points are a blend of the two normals, generating a smooth transition.

point_3: (face_1_normal * 1) + (face_2_normal * 0)
point_4: (face_1_normal * 0.8) + (face_2_normal * 0.2)
point_5: (face_1_normal * 0.6) + (face_2_normal * 0.4)
point_6: (face_1_normal * 0.4) + (face_2_normal * 0.6)
point_7: (face_1_normal * 0.2) + (face_2_normal * 0.8)
point_8: (face_1_normal * 0) + (face_2_normal * 1)
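The table above boils down to a small helper: lerp the two face normals with the given weight and renormalize, since a blend of two unit vectors is generally shorter than unit length. A plain-Python sketch:

```python
import math

# Blend two normals: w=0 returns n1, w=1 returns n2, renormalized.
def lerp_normal(n1, n2, w):
    blended = [a * (1.0 - w) + b * w for a, b in zip(n1, n2)]
    length = math.sqrt(sum(c * c for c in blended))
    return tuple(c / length for c in blended)

face_1_normal = (0.0, 1.0, 0.0)
face_2_normal = (1.0, 0.0, 0.0)
# point_4 from the table: 80% face_1, 20% face_2.
print(lerp_normal(face_1_normal, face_2_normal, 0.2))
```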


If you have 5 points with a matching normal, and after that another 5 with a matching normal (because they are on the same face)

This part is already confusing… I am not dealing with the geometry directly. I am using a NURBS curve (a converted edge loop) to generate a list of positions in space with tangents. Then for each position in space I get a vertex normal from the geometry, so I have 3 vectors: position, tangent, normal. With those vectors I modify the transformation matrix of a cluster.

E.g. I have 3 lists of vectors (normals, tangents and positions):
N = [(x1, y1, z1), (x2, y2, z2) … (x#, y#, z#)]
T = [(x1, y1, z1), (x2, y2, z2) … (x#, y#, z#)]
P = [(x1, y1, z1), (x2, y2, z2) … (x#, y#, z#)]

Normals are irregular. In such a case is it possible to blend them?


Indeed, you can group the P, T, N lists into sections where the normals are the same.
If you print your normals when you query them, you will see that they are clustered, because a lot of the points lie on the same face. You can blend the normals between the groups, generating a smooth transition between two groups rather than a sudden jump when you reach the normal of a neighbouring face. Once you have adjusted the normals, you can generate a transformation matrix and position the clusters.
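Turning a (position, tangent, normal) triple into a transformation matrix can be sketched in plain Python, with no Maya calls. The axis convention here (X = tangent, Y = normal, Z = bi-tangent) is an assumption - swap rows if your stitch geometry points down a different axis. The result is the row-major 16-float layout that cmds.xform(matrix=...) accepts:

```python
import math

def _normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Build a row-major 4x4 matrix from one P/T/N triple. The normal is
# recomputed via two cross products so the three axes are exactly
# orthogonal even when the queried normal is not.
def frame_matrix(position, tangent, normal):
    x = _normalize(tangent)
    z = _normalize(_cross(x, normal))
    y = _cross(z, x)
    return [x[0], x[1], x[2], 0.0,
            y[0], y[1], y[2], 0.0,
            z[0], z[1], z[2], 0.0,
            position[0], position[1], position[2], 1.0]
```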


Hi Kiryha,

Does this code,

for i in range(num):
    parameter = c.findParamFromLength(length * increment * i)
    c.getPointAtParam(parameter, p)

produce uniform divisions along your curve? I'm working on some pretty complex curve stuff now and had to rebuild the curve before getting parametric values along its length. Next, to build a true orientation along this curve you'll need to create a Frenet frame - essentially a moving coordinate frame along the curve that gives you the tangent, bi-tangent and normal for the orientation - fortunately Maya has awesome curve support.

import maya.cmds as cmds
import maya.api.OpenMaya as apiOM

p0 = apiOM.MVector(0, 0, 0)
p1 = apiOM.MVector(0, 1, 0)
p2 = apiOM.MVector(1, 1, 0)
p3 = apiOM.MVector(1, 0, 0)

# Initial reference normal used to bootstrap the moving frame.
normal = apiOM.MVector(-1, 0, 0)

my_curve = cmds.curve(points=[p0, p1, p2, p3], editPoint=True)

count = 10
division = 1.0 / (count - 1)

for i in range(count):
    # top (turnOnPercentage) treats the parameter as a 0-1 fraction of
    # the curve length, so i * division spans the whole curve.
    position = apiOM.MVector(cmds.pointOnCurve(my_curve, p=True, top=True, pr=i * division))
    tangent = apiOM.MVector(cmds.pointOnCurve(my_curve, nt=True, top=True, pr=i * division))

    # Rebuild the frame: bi-tangent from the tangent and the carried
    # normal, then a fresh normal (bi_tangent ^ tangent, not the other
    # way around, so the normal is not flipped every iteration).
    bi_tangent = (tangent ^ normal).normal()
    normal = (bi_tangent ^ tangent).normal()

Note this isn't complete - I'm just getting the data to build the transforms. But basically we're recalculating the normal from the bi_tangent at each point, which is itself computed from the normal carried over from the previous step - hence the moving coordinate frame.
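Isolating just the frame update in plain Python (helper names are mine), the carried normal is what prevents sudden flips: each step derives a bi-tangent from the current tangent and the previous normal, then rebuilds the normal so the frame twists gradually:

```python
import math

def _normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# Carry a normal along a list of tangents, returning one
# (normal, bi_tangent) pair per tangent sample.
def transport_frames(tangents, initial_normal):
    normal = _normalize(initial_normal)
    frames = []
    for t in tangents:
        t = _normalize(t)
        bi_tangent = _normalize(_cross(t, normal))
        normal = _normalize(_cross(bi_tangent, t))
        frames.append((normal, bi_tangent))
    return frames
```

On a straight stretch the normal stays put; on a bend it tilts just enough to remain perpendicular to the new tangent.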



Yes, I guess it does. Thanks for the info, Charles, that might be very useful (parallel transport for Maya is what I was looking for initially)!