CgFX, Error: the parameter used is invalid

I am struggling to work with CgFX in MotionBuilder because of the lack of output from its errors. If a compiler error happens then sure, you can read what went wrong, but right now it feels like I'm working against the binding to the UI.

Here is the shader, where I sample the color from a texture.

float4x4 WorldITXf : WorldInverseTranspose;
float4x4 WorldViewProjXf : WorldViewProjection;

// Exposing parameters to UI,
float3 DiffuseColor : DIFFUSE
<
  string UIName = "Diffuse Color";
  string UIWidget = "ColorPicker";
> = {0.66, 0.66, 0.66};

// The texture object will not show in MoBu, so there's no point giving it UI names etc., but we can do that on the sampler
texture DiffuseTexture <string ResourceType = "2D";>;
sampler2D DiffuseSampler <string UIName = "Diffuse Texture";> = sampler_state
{
  Texture = <DiffuseTexture>;
  MinFilter = Linear;
  MagFilter = Linear;
  AddressU = WRAP;
  AddressV = WRAP;
  AddressW = WRAP;
};

// Defining structures we will be using,

// Application => Vertex data,
struct appdata {
  float3 Position: POSITION;
  float2 UV: TEXCOORD0;
  float4 Normal: NORMAL;
  // Tangent and Binormal both require that the mesh has been UV Mapped,
  // We can pick whichever TEXCOORD we want, just make sure MoBu's widget
  // passes the correct data to the right TEXCOORD...
  float4 Tangent: TEXCOORD6;
  float4 Binormal: TEXCOORD7;
};

// Vertex => Fragment shader data
struct vert {
  float4 Position: POSITION;
  float2 UV: TEXCOORD0;
  float3 Normal: TEXCOORD1;
  float3 Tangent: TEXCOORD2;
  float3 Binormal: TEXCOORD3;
};

vert ShaderVertex(appdata IN)
{
  // Initialize the vert output to all zero,
  vert OUT = (vert)0;

  // Get the vertex position from object->world->view->projection space
  // All using one multiplication!
  float4 position = float4(IN.Position.xyz, 1.0f);
  OUT.Position = mul(WorldViewProjXf, position);

  // Just passing the UV from Application data,
  OUT.UV = IN.UV;

  // Get world normals by multiplying object space normals to
  // the world inverse transpose matrix
  OUT.Normal = normalize(mul(WorldITXf, IN.Normal).xyz);
  OUT.Tangent = normalize(mul(WorldITXf, IN.Tangent).xyz);
  OUT.Binormal = normalize(mul(WorldITXf, IN.Binormal).xyz);

  return OUT;
}

float4 ShaderPixel(vert IN) : COLOR
{
  // Hardcoding a light vector,
  float3 lightDirection0 = normalize(float3(0.5f, 0.7f, 0.3f));

  // Use the color sampled from the texture,
  float4 diffuseColor = tex2D(DiffuseSampler, IN.UV);
  // float4 diffuseColor = float4(DiffuseColor.xyz, 1.0); // Attempting to use this will cause errors

  // Simple diffuse light, surface normal dot light direction * color
  float NoL = saturate(dot(IN.Normal, lightDirection0));
  float3 color = diffuseColor.xyz * NoL;

  return float4(color, 1.0f);
}

// Techniques will be selectable from a drop-down in MotionBuilder.
technique SimpleDiffuse
{
  // This is a single pass shader, we can name the passes anything we
  // would like.
  pass FirstPass
  {
    VertexProgram = compile gp4vp ShaderVertex();
    FragmentProgram = compile gp4fp ShaderPixel();
  }
}

The shader above works; it samples a color from a texture. Happy days.

If I however attempt to use the parameter DiffuseColor instead, MotionBuilder will just throw "Error: the parameter used is invalid" message boxes my way until I kill the program with Task Manager.

If I comment out the texture and sampler, I am allowed to use the parameter DiffuseColor.

But what is causing this error? I can see how I might want both a texture and a float3 parameter, and use both in the same "technique".

Hi.
I'm not sure how you are changing the code to test, but these four lines work for me in both MotionBuilder 2020 and 2010:

float4 textureColor = tex2D(DiffuseSampler, IN.UV);
float4 diffuseColor = float4(DiffuseColor.xyz, 1.0);
float NoL = saturate(dot(IN.Normal, lightDirection0));
float3 color = diffuseColor.xyz * textureColor.xyz * NoL;
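
For context, those lines are just dropped into the same ShaderPixel from your post, so the full function in my test looks roughly like this:

float4 ShaderPixel(vert IN) : COLOR
{
  // Hardcoding a light vector,
  float3 lightDirection0 = normalize(float3(0.5f, 0.7f, 0.3f));

  // Sample the texture and read the UI color parameter side by side,
  float4 textureColor = tex2D(DiffuseSampler, IN.UV);
  float4 diffuseColor = float4(DiffuseColor.xyz, 1.0);

  // Simple diffuse light, texture tinted by the color parameter
  float NoL = saturate(dot(IN.Normal, lightDirection0));
  float3 color = diffuseColor.xyz * textureColor.xyz * NoL;

  return float4(color, 1.0f);
}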

I am changing them to use either the color input or the texture.

So this pixel shader works, using the texture input:

float4 ShaderPixel(vert IN) : COLOR
{
  // Hardcoding a light vector,
  float3 lightDirection0 = normalize(float3(0.5f, 0.7f, 0.3f));

  // Use the color sampled from the texture,
  float4 diffuseColor = tex2D(DiffuseSampler, IN.UV);

  // Simple diffuse light, surface normal dot light direction * color
  float NoL = saturate(dot(IN.Normal, lightDirection0));
  float3 color = diffuseColor.xyz * NoL;

  return float4(color, 1.0f);
}

While this pixel shader will cause the error "The parameter used is invalid":

float4 ShaderPixel(vert IN) : COLOR
{
  // Hardcoding a light vector,
  float3 lightDirection0 = normalize(float3(0.5f, 0.7f, 0.3f));

  float4 diffuseColor = float4(DiffuseColor.xyz, 1.0);

  // Simple diffuse light, surface normal dot light direction * color
  float NoL = saturate(dot(IN.Normal, lightDirection0));
  float3 color = diffuseColor.xyz * NoL;

  return float4(color, 1.0f);
}

I tried making as simple an example as I could, since I'm getting this error almost no matter which inputs I add beyond sampling a diffuse and a normal.

I'm attempting to add wrinkle maps so animation can see the results while working. Sampling three normal maps works just fine, but when I added a float parameter for the masking channels, this is the error I get.
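
Roughly the pattern I'm adding, just to show it; WrinkleWeight1, NormalSampler and WrinkleNormalSampler1 are placeholder names here, the real file declares the wrinkle normal samplers the same way as DiffuseSampler above:

// New float parameter, driven from animation, for one wrinkle mask channel,
float WrinkleWeight1
<
  string UIName = "Wrinkle Weight 1";
> = 0.0;

// ...and in the pixel shader, blending the sampled normals by that weight
float3 baseNormal = tex2D(NormalSampler, IN.UV).xyz * 2.0f - 1.0f;
float3 wrinkleNormal = tex2D(WrinkleNormalSampler1, IN.UV).xyz * 2.0f - 1.0f;
float3 blendedNormal = normalize(lerp(baseNormal, wrinkleNormal, saturate(WrinkleWeight1)));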

Just to be clear, your example of float3 color = diffuseColor.xyz * textureColor.xyz * NoL; works in 2022 also.

The fail case you posted above works fine for me. I don't have 2022 yet so I can't test there, but it works fine in both 2010 and 2020.

Have you tried changing the variable name to something else? Right now you are using the same name with only a case difference: DiffuseColor and diffuseColor.
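
E.g. something like this, just to rule it out (MaterialColor is just an arbitrary new name):

// Renamed so it can't collide with the local variable,
float3 MaterialColor : DIFFUSE
<
  string UIName = "Diffuse Color";
  string UIWidget = "ColorPicker";
> = {0.66, 0.66, 0.66};

// ...and in the pixel shader:
float4 diffuseColor = float4(MaterialColor.xyz, 1.0);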

Yeah, I tried other names, thinking maybe it was a collision with a semantic or something, but it didn't help.

I downloaded 2020 and am going to see if things work more as expected there.

Just for the sake of confirmation, this is the entire cgfx file I'm loading in right now:

float4x4 WorldITXf : WorldInverseTranspose;
float4x4 WorldViewProjXf : WorldViewProjection;

// Exposing parameters to UI,
float3 DiffuseColor : DIFFUSE
<
  string UIName = "Diffuse Color";
  string UIWidget = "ColorPicker";
> = {0.66, 0.66, 0.66};

// The texture object will not show in MoBu, so there's no point giving it UI names etc., but we can do that on the sampler
texture DiffuseTexture <string ResourceType = "2D";>;
sampler2D DiffuseSampler <string UIName = "Diffuse Texture";> = sampler_state
{
  Texture = <DiffuseTexture>;
  MinFilter = Linear;
  MagFilter = Linear;
  AddressU = WRAP;
  AddressV = WRAP;
  AddressW = WRAP;
};

// Defining structures we will be using,

// Application => Vertex data,
struct appdata {
  float3 Position: POSITION;
  float2 UV: TEXCOORD0;
  float4 Normal: NORMAL;
  // Tangent and Binormal both require that the mesh has been UV Mapped,
  // We can pick whichever TEXCOORD we want, just make sure MoBu's widget
  // passes the correct data to the right TEXCOORD...
  float4 Tangent: TEXCOORD6;
  float4 Binormal: TEXCOORD7;
};

// Vertex => Fragment shader data
struct vert {
  float4 Position: POSITION;
  float2 UV: TEXCOORD0;
  float3 Normal: TEXCOORD1;
  float3 Tangent: TEXCOORD2;
  float3 Binormal: TEXCOORD3;
};

vert ShaderVertex(appdata IN)
{
  // Initialize the vert output to all zero,
  vert OUT = (vert)0;

  // Get the vertex position from object->world->view->projection space
  // All using one multiplication!
  float4 position = float4(IN.Position.xyz, 1.0f);
  OUT.Position = mul(WorldViewProjXf, position);

  // Just passing the UV from Application data,
  OUT.UV = IN.UV;

  // Get world normals by multiplying object space normals to
  // the world inverse transpose matrix
  OUT.Normal = normalize(mul(WorldITXf, IN.Normal).xyz);
  OUT.Tangent = normalize(mul(WorldITXf, IN.Tangent).xyz);
  OUT.Binormal = normalize(mul(WorldITXf, IN.Binormal).xyz);

  return OUT;
}

float4 ShaderPixel(vert IN) : COLOR
{
  // Hardcoding a light vector,
  float3 lightDirection0 = normalize(float3(0.5f, 0.7f, 0.3f));

  float4 diffuseColor = float4(DiffuseColor.xyz, 1.0);

  // Simple diffuse light, surface normal dot light direction * color
  float NoL = saturate(dot(IN.Normal, lightDirection0));
  float3 color = diffuseColor.xyz * NoL;

  return float4(color, 1.0f);
}

// Techniques will be selectable from a drop-down in MotionBuilder.
technique SimpleDiffuse
{
  // This is a single pass shader, we can name the passes anything we
  // would like.
  pass FirstPass
  {
    VertexProgram = compile gp4vp ShaderVertex();
    FragmentProgram = compile gp4fp ShaderPixel();
  }
}

And it does not work in MotionBuilder 2022.

Creating a new cgfx shader is no issue; the issue arises when I apply it to a mesh.

Curious whether that cgfx file you listed above works for you in 2020 whenever you get it installed.

Silly question also, but are you applying it to a MoBu primitive or to custom geo? My only other thought right now is that there is actually an issue with the geo itself that triggers the crash when it is displayed through the shader.

The geo didn't seem to be the issue, but it does work in MotionBuilder 2020.

So I guess I can argue that we can't make the upgrade if animation wants working shaders.

Thanks for the help.

Yeah, sorry I can't test in 2022. I'll be upgrading soon and will have to transition all my MoBu CGFX shaders, so I guess I'll be figuring out this issue then.