Attaching objects to an nCloth simulation

Hi there,

What I want to do kinda works already, but I’d appreciate some insights I think I might be missing:

  • Goal: Simulated (rigid) object on an elastic string, like a cat toy
  • The simulation on the proxy object should then be transferred to the high-res object

Problem:

  • The Bullet engine does not give me the desired result (and it can drive me crazy)
  • nCloth is great, even for rigid objects, but it deforms the shape and doesn’t give me a transform (which I need in order to transfer the animation from the proxy object)

My solution (see script):

  • Attach locators to three vertices
  • Create two joints with an IK handle and a pole vector constraint, then attach the object to the joint

My questions:

  1. I might be old school: is there a better way to simulate this type of rigid body (maybe a plugin I’m unaware of)? Is it just me, or is Bullet really annoying?
  2. How would you make the elastic string? Hair solver?
  3. Is there a better, non-expression way (without plugins! - not even the Bonus Tools ones) to attach a transform to a vertex?
  4. Is there a more elegant way to transfer an nCloth simulation to a transform?

Here’s the code:

import maya.cmds as mc


def nclothToTransform(ncloth = None, vtx1 = 9, vtx2 = 11, vtx3 = 10):
    if not ncloth:
        ncloth = mc.ls(sl = True)[0]
    
    loc1 = '%s_loc1'%(ncloth)
    loc2 = '%s_loc2'%(ncloth)
    loc3 = '%s_loc3'%(ncloth)
    
    for loc in [loc1,loc2,loc3]:
        if mc.objExists(loc):
            mc.delete(loc)
    
    mc.spaceLocator(n = loc1)
    mc.spaceLocator(n = loc2)
    mc.spaceLocator(n = loc3)
    
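    # Store each vertex ID on its locator so the expression below can read it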
    mc.addAttr(loc1, ln = 'vtxNum', k = 1, at = 'double', dv = vtx1)
    mc.addAttr(loc2, ln = 'vtxNum', k = 1, at = 'double', dv = vtx2)
    mc.addAttr(loc3, ln = 'vtxNum', k = 1, at = 'double', dv = vtx3)
    
    
    # Expression: every evaluation, sample the world position of each tagged
    # vertex on the nCloth mesh and copy it onto the matching locator
    exp = ['float $pos[];',
           '$pos = `pointPosition -w %s.vtx[%s.vtxNum]`;'%(ncloth, loc1),
           '%s.translateX = $pos[0]; %s.translateY = $pos[1]; %s.translateZ = $pos[2];'%(loc1, loc1, loc1),
           '$pos = `pointPosition -w %s.vtx[%s.vtxNum]`;'%(ncloth, loc2),
           '%s.translateX = $pos[0]; %s.translateY = $pos[1]; %s.translateZ = $pos[2];'%(loc2, loc2, loc2),
           '$pos = `pointPosition -w %s.vtx[%s.vtxNum]`;'%(ncloth, loc3),
           '%s.translateX = $pos[0]; %s.translateY = $pos[1]; %s.translateZ = $pos[2];'%(loc3, loc3, loc3)]
    
    mc.expression(n = '%s_transExp'%ncloth,  s = '\n'.join(exp))
    
    mc.select( cl = True)
    
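    # Two-joint chain driven by the locators: the root follows loc1,
    # the IK handle follows loc2 and loc3 acts as the pole vector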
    j1 = mc.joint(p = [0,0,0], n ='%s_jt%03d'%(ncloth, 1))
    j2 = mc.joint(p = [0,0,0], n ='%s_jt%03d'%(ncloth, 2))
    
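    # Keep j1 pinned to loc1; the constraint on j2 is temporary, just to snap it
    # onto loc2 before the IK handle is built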
    mc.pointConstraint(loc1, j1, mo = False)
    c  = mc.pointConstraint(loc2, j2, mo = False)
    mc.delete(c)
    
    handle, effector = mc.ikHandle( sj=j1, ee=j2, p=1, w=1, n='%s_IKHandle'%ncloth, sol='ikRPsolver' )
    mc.pointConstraint(loc2, handle, mo = False)
    mc.poleVectorConstraint(loc3, handle, weight = 1)
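
A typical call looks like this (the mesh name is just a placeholder; you can also select the proxy mesh and call it with no arguments):

# run on a named proxy mesh, or select it first and omit the argument
nclothToTransform('toy_proxy', vtx1=9, vtx2=11, vtx3=10)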

Thanks!

I would rather get fired than use Bullet - nDynamics has your back for the vast majority of things, and it’s much easier to integrate back into other effects, or even just other control systems in your rig.

Sticking a transform onto an nHair curve is easy - use a pointOnCurveInfo node to find your position. If you want it to rotate and align to the curve, it’s a bit more involved - the pointOnCurveInfo gives you tangent and normal information, which you can feed into an aimConstraint node as the aim and up vectors (although it’s worth noting that using the normal as an up vector can sometimes become unpredictable, and I usually use a separate upCurve).
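
A minimal sketch of that curve rivet, assuming a hair output curve shape called hairCurveShape1 (the name and parameter value are placeholders, and the wiring below is the classic rivet.mel-style vector aim - tangent into the constraint’s target translate, normal into its world up vector - rather than driving aimVector/upVector directly):

import maya.cmds as mc

crv = 'hairCurveShape1'                       # hair output curve shape (placeholder name)
loc = mc.spaceLocator(n='curveRivet_loc')[0]

# Sample a point along the curve
poci = mc.createNode('pointOnCurveInfo', n='curveRivet_poci')
mc.connectAttr(crv + '.worldSpace[0]', poci + '.inputCurve')
mc.setAttr(poci + '.turnOnPercentage', 1)     # treat the parameter as 0-1
mc.setAttr(poci + '.parameter', 0.5)          # halfway along the curve

# Position comes straight from the sample
mc.connectAttr(poci + '.position', loc + '.translate')

# Orientation: a bare aimConstraint node used as a vector-aim solver
aim = mc.createNode('aimConstraint', n='curveRivet_aim', p=loc)
mc.setAttr(aim + '.target[0].targetWeight', 1)
mc.setAttr(aim + '.worldUpType', 3)           # 3 = vector
mc.connectAttr(poci + '.normalizedTangent', aim + '.target[0].targetTranslate')
mc.connectAttr(poci + '.normalizedNormal', aim + '.worldUpVector')
for axis in 'XYZ':
    mc.connectAttr(aim + '.constraintRotate' + axis, loc + '.rotate' + axis)

If you’d rather stay in local space, connect the curve shape’s local output instead of worldSpace[0] and keep the rivet under the same parent as the curve.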

There isn’t really an elegant way of generating the nHair itself other than clicking “create hair system”, and depending on a follicle is annoying, but if you work only with the nurbsCurve shape data (better yet only in local space), it’s very reliable.

Attaching things to nCloth can get annoying, but only if you try to do too much at the attachment stage itself. If you want a solution that will be robust to changes in topology, your best bet is to use the UV features of a follicle (and eat that meaty performance hit), but if it’s just a basic tech mesh, you can access the local-space positions of a polygon mesh directly in the controlPoints attribute. You can then connect these to locators or (my favourite) the control points of a nurbs patch, from which you can use a pointOnSurfaceInfo and generate a frame in the same way as with the curve. It’s a bit more involved, but it gives a smoother rivet on a deforming mesh.
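
Here’s a minimal sketch of the follicle flavour of that, assuming a mesh shape called proxyMeshShape (the name and UV values are placeholders; the controlPoints-to-nurbs version follows the same pattern with a pointOnSurfaceInfo instead):

import maya.cmds as mc

mesh = 'proxyMeshShape'                       # deforming nCloth output mesh (placeholder)
fol = mc.createNode('follicle', n='rivet_follicleShape')
folXform = mc.listRelatives(fol, parent=True)[0]

# Feed the deforming mesh and its world matrix into the follicle
mc.connectAttr(mesh + '.outMesh', fol + '.inputMesh')
mc.connectAttr(mesh + '.worldMatrix[0]', fol + '.inputWorldMatrix')

# Pick where on the surface to stick, in UV space
mc.setAttr(fol + '.parameterU', 0.5)
mc.setAttr(fol + '.parameterV', 0.5)

# The follicle shape computes a position and orientation; wire them back into
# its own transform so you can parent or constrain things to it
mc.connectAttr(fol + '.outTranslate', folXform + '.translate')
mc.connectAttr(fol + '.outRotate', folXform + '.rotate')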

Dynamics can be a very helpful tool, even a core part of a rig when used properly, and I’d love to see more people incorporating it. Hope this helps.

PS. The single coolest feature of nHair, to me, is that it lets you rivet either the root, the tip, both or neither, for free. Work out which dynamic settings are most important to you and wrap them up with a controller, and you have a very artist-friendly object that can achieve engaging motion in a fraction of the keyframing time.
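
(In case it’s useful: that root/tip choice lives on the follicle’s pointLock attribute - the follicle name below is a placeholder.)

import maya.cmds as mc

# 0 = no attach, 1 = base, 2 = tip, 3 = both ends
mc.setAttr('hairFollicleShape1.pointLock', 3)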

PPS. This workflow relies a lot on passing nurbs and mesh information from one shape node to another, and aside from some weird gotchas involving user viewport interaction and tweaking, I don’t know how efficient this is with the new parallel evaluation. I guarantee it’s faster than expressions, though.


This was very helpful, thank you!

Do you have any additional thoughts on simulating a proxy mesh (for rigid bodies) and then transferring the motion to the high-res mesh? For example, let’s say you simulate 10 dice rolling on a table (maybe even with all dice merged into one object to use polygon shells). With the vertex-rivet method you would choose three vertex IDs for each die, define the up vector (or, in the case of my script, create a pole vector constraint plus a bone) and then constrain the high-res object to each rivet setup. Is this the only possible workflow? It seems very… inelegant.

Cheers

It’s an interesting problem - for starters, I would definitely use polygon shells, as not only is it faster but it will give a more complete (?) simulation if all your objects are merged - air turbulence and I think fluids work better this way.

Converting between spaces never really feels elegant to do, and I’m still looking for better ways. Again, this vertex method is not robust to changes in topology or point numbering (it might be possible to use transfer attributes, but I haven’t tested it) - for this method, unless you are absolutely sure that you will never have any more or fewer dice, I would probably use follicles, either on separate UV sets or on a single set with every cube laid out together. Ten follicles in your scene will have a very small effect on performance (compared to a physics sim at least), and it’s probably the most future-proof and stable option anyway. Or a simple normal constraint might work, if you’re not bothered about performance at all.
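
As a rough sketch of that per-die follicle idea (all names, UVs and counts below are placeholders):

import maya.cmds as mc

simMesh = 'diceSimMeshShape'                             # merged proxy mesh the nCloth deforms
uvPerDie = [(0.05, 0.05), (0.15, 0.05), (0.25, 0.05)]    # one UV point inside each die's shell
hiresDice = ['die_hi_01', 'die_hi_02', 'die_hi_03']

for (u, v), hires in zip(uvPerDie, hiresDice):
    fol = mc.createNode('follicle')
    folXform = mc.listRelatives(fol, parent=True)[0]
    mc.connectAttr(simMesh + '.outMesh', fol + '.inputMesh')
    mc.connectAttr(simMesh + '.worldMatrix[0]', fol + '.inputWorldMatrix')
    mc.setAttr(fol + '.parameterU', u)
    mc.setAttr(fol + '.parameterV', v)
    mc.connectAttr(fol + '.outTranslate', folXform + '.translate')
    mc.connectAttr(fol + '.outRotate', folXform + '.rotate')
    # drive the hi-res die from the follicle transform
    mc.parentConstraint(folXform, hires, maintainOffset=True)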

That’s definitely not the only possible workflow either - go explore some others! Random Maya nodes do the craziest stuff.
