Get vertex normal in OpenMaya?

Hi guys,
Could someone help me? I get the vertex normals in Maya this way:

from maya import cmds

def getNormal(_vtx):
    vertName = 'mesh.vtx[' + str(_vtx) + ']'
    normTemp = cmds.polyNormalPerVertex(vertName, query=True, xyz=True)
    # group the flat [x, y, z, x, y, z, ...] list into (x, y, z) triples
    # (wrap in list() so it is indexable under Python 3, where zip is lazy)
    norms = list(zip(*[iter(normTemp)] * 3))
    #values for averages
    xAve = 0
    yAve = 0
    zAve = 0
    #get average vertex normal
    for i in range(len(norms)):
        xAve += norms[i][0]
        yAve += norms[i][1]
        zAve += norms[i][2]
    leng = len(norms)
    xAve = xAve / leng
    yAve = yAve / leng
    zAve = zAve / leng
    aveList = [xAve, yAve, zAve]
    return aveList
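
For reference, the averaging half of this is plain Python and can be tested outside Maya. A minimal sketch (the function name is mine, not from any Maya API):

```python
def average_normal(flat_xyz):
    """Average a flat [x, y, z, x, y, z, ...] list into a single [x, y, z].

    Mirrors what the loop above does with the flat list that
    polyNormalPerVertex returns.
    """
    # group the flat list into (x, y, z) triples; list() keeps it
    # indexable and reusable under Python 3, where zip() is lazy
    triples = list(zip(*[iter(flat_xyz)] * 3))
    count = len(triples)
    # sum each axis across all triples, then divide by the count
    return [sum(axis) / count for axis in zip(*triples)]
```

For example, `average_normal([1, 0, 0, 0, 1, 0])` gives `[0.5, 0.5, 0.0]`.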

But how can I do that with the OpenMaya API? What would be the correct syntax in om?
Is there a faster way to do it?

Thanks a lot!

Well the easiest way is probably pymel.

import pymel.core as pm
mesh = pm.selected()[0]
mesh.vtx[0].getNormal()

After that, yeah, you'd probably be jumping into the API:

import maya.api.OpenMaya as om
sel_list = om.MGlobal.getActiveSelectionList()
dag = sel_list.getDagPath(0)
verts = om.MItMeshVertex(dag)
verts.getNormal()  # averaged normal at the iterator's current vertex

Now if you want all the face-vertex normals for a given vertex, you'd call getNormals instead of getNormal, in case you want to average them yourself again for some reason.

Oh, and while pymel is the easiest way by a "how much code did I just write" kind of metric, it is also much slower than talking to the API, especially when dealing with components and dense meshes.

Hi Bob!
Thanks a lot for replying. :slight_smile:
I am trying to use the API because my code took 33 minutes to complete, since I had to run it for 20k vertices. :frowning:
I will try to understand your code; for sure Maya will process everything in a shorter time, thanks! :slight_smile:

If possible, may I ask you one more question?
How can I apply this normal direction to the orientation of a locator, so the locator points in the same direction as the normal? (in the API)

Thanks a lot for your attention, Bob!

JoĂŁo

I'm not at my work comp, but MFnMesh is your friend, I think! You can get mesh data such as normals. (I just started with the API too, btw, so don't trust me 100 percent, heh.) You can store the results in an MVector array; try to do your reading and as much as you can in a single iteration.

Post back if it helps? (I’m going to have to deal with similar issues.)
The Maya API has a bunch of "(M)aya function set" MFn classes for operating on objects.

Normals: Some of the normals in Maya are now user-settable. If the vertex normals are not set or locked, they are computed by Maya when the mesh changes. If set or locked, the normals remain frozen relative to the object unless the user unlocks them. There are 3 types of normals for a mesh:

  • Per-vertex per-polygon: Polygonal objects store a list of per-vertex per-polygon normals (similar to the vertex list). This list is what is returned by MFnMesh::getNormals and MItMeshPolygon::getNormal(index, normal). As well, MItMeshPolygon::normalIndex(vertexIndex) returns the index into the normal list for a particular vertex of the polygon.

For a cube, the list would contain 24 normals (4 vertices * 6 polygons) since the edges for a cube are hard and the per-vertex per-polygon normals cannot be shared. For a sphere or torus, which has smooth edges, the normals can be shared, and thus the normal list contains the same number of normals as vertices.

  • Per-polygon: Polygonal objects store a normal for each polygon. These normals can be accessed via MItMeshPolygon::getNormal(normal) or MFnMesh::getPolygonNormal. So a cube, which is comprised of 6 polygons, would contain 6 such normals.

  • Per-vertex

Also, re: the 2.0 API, reading the C++ documentation gives you way more info about what you need to pass into a class (sometimes objects can act in two different modes depending on what you pass in). Robert Galanakis's book also advises looking at the C++ documentation, and I found it pretty helpful.


Also something I just “discovered” is using Dash docsets!
Try this:


Download Zeal and set up a function in your text editor (Sublime or Vim or whatever); Zeal can receive queries from the command line. It can fuzzy-search, and it queries the documentation from a SQLite database.

It’s made my workflow so much smoother, because sometimes browsing the docs on the web is a bit of a pain. I like the auto-lookup of the function/word under the cursor :slight_smile:

I like it so much I’m thinking about constructing my own docset for the Python Houdini/Vex documentation haha.

Hey dive!

Thanks a lot for your explanation. It will help me a lot! :slight_smile:
For sure, if I solve this, I post it here so you can see how I did!
I just didn't get how I can apply the normal direction value to the locator orientation; the API docs are very confusing to me… I'm really a beginner in the API… :frowning:
Would it be possible for you to send me an example of this syntax?

thanks a lot!!!

Would this topic cover your needs? It's about the face normal, but for vertex normals it could be a very good reference. Edit: it actually didn't show that code, so here is an example to help out:

import maya.api.OpenMaya as om
from maya import cmds

def get_vertex_normal(mesh, vertex):
    sel = om.MSelectionList()
    sel.add(mesh)
    dag = sel.getDagPath(0)
    fn_mesh = om.MFnMesh(dag)

    # angleWeighted=False gives the plain average; use om.MSpace.kObject
    # instead of kWorld if you want object space
    return fn_mesh.getVertexNormal(vertex, False, om.MSpace.kWorld)


meshes = cmds.ls(type="mesh")
for mesh in meshes:
    print(get_vertex_normal(mesh, vertex=0))

The code example would be faster if you didn't get the API object each time per vertex, but instead queried all the normals you need for the vertices in one go, keeping the MFnMesh around.

Note that MFnMesh also has getVertexNormal, which computes the averaged vertex normal internally already; that is what my example above uses.

Per-vertex: You can also access a normal for each vertex of the mesh, independent of any polygons. Such normals are not stored in the object but are calculated by Maya upon request, as the average of all the per-polygon normals for the polygons adjacent to the vertex. These normals are what is returned by MFnMesh::getVertexNormal or MItMeshVertex::getNormal.

Hey BigRoy, thanks a lot for replying!

Your code helped me a lot to understand a bit more of the API. :slight_smile:
But now I am trying to figure out how I can align a locator to this normal value, so the locator points in the same direction as the normal.

Would it be possible to send me a syntax example of this?

Thanks a lot!

@jvvrf The normal will only give you the direction; you'll need an up vector to build a full transform. Using @BigRoyNL's code and some pseudo-code:

normal = get_vertex_normal(mesh, vtx_index)
up = apiOM.MVector(0, 1, 0)

cross = up ^ normal
tangent = normal ^ cross

pseudo_build_transform(normal, cross, tangent, position, dir_axis="x", up_axis="y")
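
The cross-product part of that pseudo-code is just vector math, so it can be sketched and tested outside Maya. All of these helper names are mine (in the real script you would feed the axes into the locator's matrix instead):

```python
import math

def cross(a, b):
    """Cross product of two 3-vectors (tuples)."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = math.sqrt(dot(v, v))
    return tuple(c / length for c in v)

def basis_from_normal(normal, up=(0.0, 1.0, 0.0)):
    """Build three mutually perpendicular unit axes from a normal.

    Degenerate if `normal` is parallel to `up`, in which case you
    would pick a different up vector.
    """
    normal = normalize(normal)
    side = normalize(cross(up, normal))       # the "cross" in the pseudo-code
    tangent = normalize(cross(normal, side))  # the "tangent"
    return normal, side, tangent
```

All three returned axes are unit length and pairwise perpendicular, which is exactly the ortho-normal basis discussed further down the thread.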

Hey chalk,

Thanks a lot for your help… but as I am a real beginner in the API, I didn't get how to apply the transformation with the last line you wrote. That syntax is what I am looking for… would you have a practical example of a function that applies the transformation as in your last line?

thanks a lot man!

I just started the API earlier this week too, so I can't help much without digging a ton, haha. I'm glad much more knowledgeable people than I am are answering here. I literally have to do a similar operation pretty soon in my script too.

-----------------------------------------------------------------------------

You could try this for a simple vector orientation match:

Let's say you want to transform the locator's Y axis to match the normal:

LocYVector = om.MVector(0, 1, 0)  # the Y axis
YourNormalVector = the normal you got from MFnMesh or an MIt iterator

om.MQuaternion(MVectorA, MVectorB)  # when you pass in two vectors A and B,
this returns the quaternion that transforms A to B, basically aligning
vector A (here 0, 1, 0) to B (the normal orientation):

rotateQuaternion = om.MQuaternion(LocYVector, YourNormalVector)
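
If you want to sanity-check the MQuaternion(A, B) idea outside Maya, the same "quaternion that takes unit vector A onto unit vector B" construction can be written in plain Python (the helper names here are mine, not Maya's):

```python
import math

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def quat_between(a, b):
    """Unit quaternion (x, y, z, w) rotating unit vector a onto b.

    Degenerate when a == -b (any perpendicular axis would do there).
    """
    x, y, z = _cross(a, b)
    w = 1.0 + _dot(a, b)
    n = math.sqrt(x * x + y * y + z * z + w * w)
    return (x / n, y / n, z / n, w / n)

def rotate_vector(v, q):
    """Rotate vector v by quaternion q, like MVector.rotateBy does."""
    qv, w = q[:3], q[3]
    t = tuple(2.0 * c for c in _cross(qv, v))
    u = _cross(qv, t)
    return tuple(v[i] + w * t[i] + u[i] for i in range(3))
```

Rotating (0, 1, 0) by quat_between((0, 1, 0), normal) lands on the normal, which is exactly the locator alignment being discussed.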

Btw, there is also MVector.rotateBy(), which returns the vector that represents the rotation of this vector by the given quaternion.

But in this case you don't want to rotate the vector, you want to rotate the locator, assuming you have the locator selected.

You can do this:
getActiveSelectionList() returns an om.MSelectionList object from your active selection (it lives on MGlobal):
locatorSL = om.MGlobal.getActiveSelectionList()

This only shows up in the Python 2.0 API docs:
selection lists have a getComponent(index) method that returns a (dagPath, MObject) tuple for the index'th item in the selection list:
locdag, locMobj = locatorSL.getComponent(0)

getComponent doesn't show up in the C++ docs I have, but it does in the Maya Python 2.0 API; I use it a lot:
getComponent(index) -> (MDagPath, MObject)

    Returns the index'th item of the list as a component, represented by
    a tuple containing an MDagPath and an MObject. If the item is just a
    DAG path without a component then MObject.kNullObj will be returned
    in the second element of the tuple. Raises TypeError if the item is
    neither a DAG path nor a component. Raises IndexError if index is
    out of range.

The C++ docs tell me MFnTransform needs an MObject constructor, and rotateBy relatively changes the rotation of the transformation using a quaternion:

om.MFnTransform(locMobj).rotateBy(rotateQuaternion, om.MSpace.kTransform)

I'm kind of busy atm, but I'm going to try this soon too… hope it works… ha

PS: there is also setActiveSelectionList(sel_list), which sets your viewport selection to a selection list when you're done with the API.

Side Note


Speaking of normals, just FYI (in your case you probably don't need a second vector, but if you want an ortho-normal basis, Chad Vernon explains it nicely):

You can take the double cross product of two vertex normals to establish a basis, assuming the vertex isn't on a totally flat surface (in which case the vertex normals would all be the same, right?). This also assumes you haven't manually edited the normals in a way that would misguide the calculation…

Note: you could use a tangent-space normal too, but you have to ensure it hasn't been edited and is the same as the vertex normal.

On a flat surface you could use the normal and a connected edge: get the connected edges, then get each edge's vertex 0 and 1, picking the one whose world position is not equal to your vertex's world position. That should always give you two distinct vectors to create an ortho-normal basis.

Look at the methods on MFnMesh and MItMeshVertex.getConnectedEdges(). An edge gives you its 2 vertex IDs when you pass in 0 and then 1; pick the point whose world position != your vertex.

Sorry, it's convoluted. You can use MFnMesh.getPoint(vertexID, om.MSpace.kWorld) to get world positions, pick the right vertex, and generate a distinct second vector.

It's a lot of bouncing back and forth between the MIt iterators and the MFn function sets; the more you can do in one iteration, getting the data you need, the faster things should go, I think.


This is what I'm using to understand the object model (the 2.0 API is a bit different but mostly similar regarding MObject pointers).

The syntax for getting your selection into a selection list and pulling out the MObject is covered more in depth above (like what Bob was doing).

I think he explains it in applied-3d-math (it's free).
An ortho-normal basis is basically a new local coordinate system where the X, Y, Z vectors are all at 90 degrees to one another, i.e. dot(X, Y) == dot(Y, Z) == dot(Z, X) == 0.
Chad explains it much more nicely in Maya here:

Say your locators were bolts: with a basis you could ensure the orientation of a bolt is predictable (the way the two axes besides the one oriented to the normal are pointing); without one you just have a single aligned vector/axis (hope that makes sense).

Hey dive!

I thank you very very much for all this explanation! I am going to check all links you sent! :slight_smile:
I tried to apply the code you sent, but I think I have to transform the normal value I have into a quaternion, or am I wrong?
This is what I wrote:

locatorSL = new_om.MGlobal.getActiveSelectionList()

locdag, locMobj = locatorSL.getComponent(0)

old_om.MFnTransform(locMobj).rotateByQuaternion(***NORMAL_TO_QUAT***)

I am trying to get a quat rotation that matches the normal…

thank you very very much!! :slight_smile:

Don't mix APIs in terms of MObjects.
I cleaned up the imports; enumerate is a tiny bit slower than range, too.

import maya.cmds as mc
import maya.api.OpenMaya as om
from time import time
# print('---------------------------------------------------------------------')

# NOTE: run this on an empty scene
def clearscene():
  mc.select(all=1)
  mc.delete()

clearscene()

def sclear():
  mc.select(cl=1)

def ssave():
  """
  save the current selection, flattened, as a tuple
  """
  return tuple(mc.ls(sl=True, fl=True))

# NOTE: Various methods are available to query an MObject's type. If you are
# unsure of the type of an MObject, use the apiType() method to get the
# MObject's type. The return value will be a type in MFn::Type. If you want to
# know if an MObject is compatible with a specific function set, pass the
# function set type to the hasFn() method.

tsphere = mc.polySphere(sx=200, sy=200, radius=30)
mc.select(tsphere)
sl = om.MGlobal.getActiveSelectionList()

# from index 0 of the selection list, get the dagpath and mobject handle
tdag, tsphere_mobj = sl.getComponent(0) #type:ignore (mdagpath, mobject)

mfn_sphere = om.MFnMesh(tdag)
# NOTE: verify the function set is attached to the sphere
# print(mfn_sphere.fullPathName())

# world space
ws = om.MSpace.kWorld

# fn getPoints returns a MPointArray of world positions
sphere_vtxs_ws = mfn_sphere.getPoints(ws)

# fn getVertexNormals() returns an MFloatVectorArray with one averaged
# normal per vertex (getNormals() returns per-face-vertex normals, whose
# indices don't line up with vertex indices in general)
sphere_normals_ws = mfn_sphere.getVertexNormals(False, ws)

YVector = om.MVector(0,1,0)

mfn_xform = om.MFnTransform(tdag)

# create an array of mobject references to locators
# (no setLength() here: append() below grows the array itself, and
# pre-sizing plus appending would double its length with null objects)
loc_mobjs = om.MObjectArray()

# create an array of dagpaths to locators
loc_dags = om.MDagPathArray()

# clear selection
sclear()
# create the locators, aggregating the selection with add=True
for i in range(len(sphere_vtxs_ws)):
  loc = mc.spaceLocator()
  mc.select(loc, add=True)
  loc_sl = om.MGlobal.getActiveSelectionList()
  ldag, lmobj = loc_sl.getComponent(0)
  loc_dags.append(ldag)
  loc_mobjs.append(lmobj)
  sclear()


start = time()
for i in range(len(sphere_normals_ws)):
  # ensure its a unit vector
  normal = sphere_normals_ws[i].normalize()
  n = om.MVector(normal.x, normal.y, normal.z)
  quaternion = om.MQuaternion(YVector, n, 1.0)
  offsetvector = om.MVector(sphere_vtxs_ws[i].x, sphere_vtxs_ws[i].y,
                            sphere_vtxs_ws[i].z)
  locdag = loc_dags[i]
  # initialize a transform function set on the locator dagpath
  locxform = om.MFnTransform(locdag)
  # rotate the locator by the quaternion
  locxform.rotateBy(quaternion, ws)
  # translate to the vertex world position
  locxform.translateBy(offsetvector, ws)
  # offset along the unit normal of the vertex
  locxform.translateBy(n, ws)
end = time()
# ~3.189 seconds for 40k locators

Thanks a lot, dive!
I will test that and get back to you asap! :slight_smile:
Thanks a lot!!!


Let me know what you're testing on, like vert count and performance; I'm curious :slight_smile:

Sure! I am creating 40k locators and aligning them to a mesh with 40k vertices…
So then I can put an object on each locator, to build a scene. :slight_smile:


Of course I don't need the 40k objects at the same time, but I want to store the positions.


You should take off the last line then; I offset the locator by the length of the normal.
Rather than create locators (I'm not sure what you're trying to do), you could store the normal and point positions in MPoint and MVector arrays like I did in the script, and recall them later.
That would make the script faster, since it's not iterating over locators and placing/orienting them.
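
The "store and recall later" idea is API-agnostic; here is a minimal Maya-free sketch of such a cache (the class name and structure are my own invention, you would fill it once from MFnMesh.getPoints / getVertexNormals):

```python
class VertexCache:
    """Hold per-vertex world positions and normals for later reuse,
    instead of baking them into thousands of live locators."""

    def __init__(self, points, normals):
        if len(points) != len(normals):
            raise ValueError("points and normals must align per vertex")
        self._points = list(points)
        self._normals = list(normals)

    def __len__(self):
        return len(self._points)

    def frame(self, index):
        """Return the (position, normal) pair for one vertex."""
        return self._points[index], self._normals[index]
```

Once filled, you can place objects at any subset of vertices later without any locators existing in the scene.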

Understood! Thanks a lot, I will do that!
My plan is:

Select all vertices > store normals in a list > align each locator to each normal
and then:
Select some locators and place objects
then:
Delete previous objects > select another group of locators and place new objects… :slight_smile:


And I plan to align each object to each locator with matchTransform… it's turning out quick.

Hmm, if that's working for you that's fine, no point over-optimizing, but I'm pretty sure creating the locators is slowing things down A LOT, and you can bypass that entirely. With the API you kind of have to think differently from viewport manipulation.

Like, you can store the normal data / point positions (I already did that in the code above), then just make your objects and perform the transformation in the API code itself.
It should be way faster.