[Unity] Check if model uses texture's alpha information - Optimization

Hello All,

I am trying to, on import into Unity, set the shader that each material uses based upon the information in the model. The issue is that to determine what shader the material should use, I read from the texture. Some of my models re-use parts of textures that contain alpha but do not use that alpha information. I want an efficient way to figure out if my model is using any of the alpha information on the texture it uses.

I have achieved this, BUT, it is way too slow to be practical.

Current Solution:
I am looping through all of the pixels of the texture applied to my model’s material and checking whether any semi-transparent pixels fall inside any of the UV triangles on my model. If none of them fall inside a UV triangle, I set the shader to not use transparency.

The only optimization I have tried so far is to limit the pixel information I read by sampling every 16 pixels (any higher and I think I might miss the portion that is alpha’d). I am also looking into limiting the number of UV triangles I’m looping through.
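For reference, the brute-force version looks roughly like this (just a minimal sketch of what I described above; the class name, the point-in-triangle helper, and the alpha cutoff value are illustrative, not our actual code):

[CODE]
using UnityEngine;

public static class AlphaUsageCheck
{
    // Returns true if any sampled semi-transparent pixel falls inside a UV triangle of the mesh.
    public static bool MeshUsesAlpha(Mesh mesh, Texture2D tex, int step = 16, byte alphaCutoff = 250)
    {
        Color32[] pixels = tex.GetPixels32();          // texture must be marked readable in import settings
        Vector2[] uvs = mesh.uv;
        int[] tris = mesh.triangles;

        for (int y = 0; y < tex.height; y += step)
        {
            for (int x = 0; x < tex.width; x += step)
            {
                Color32 c = pixels[y * tex.width + x];
                if (c.a >= alphaCutoff)
                    continue;                          // effectively opaque, skip

                // Pixel center in normalized UV space.
                Vector2 p = new Vector2((x + 0.5f) / tex.width, (y + 0.5f) / tex.height);

                // Check this semi-transparent pixel against every UV triangle.
                for (int i = 0; i < tris.Length; i += 3)
                {
                    if (PointInTriangle(p, uvs[tris[i]], uvs[tris[i + 1]], uvs[tris[i + 2]]))
                        return true;
                }
            }
        }
        return false;
    }

    // Standard sign test for a point inside a 2D triangle.
    static bool PointInTriangle(Vector2 p, Vector2 a, Vector2 b, Vector2 c)
    {
        float d1 = Cross(p - a, b - a);
        float d2 = Cross(p - b, c - b);
        float d3 = Cross(p - c, a - c);
        bool hasNeg = (d1 < 0) || (d2 < 0) || (d3 < 0);
        bool hasPos = (d1 > 0) || (d2 > 0) || (d3 > 0);
        return !(hasNeg && hasPos);
    }

    static float Cross(Vector2 u, Vector2 v) { return u.x * v.y - u.y * v.x; }
}
[/CODE]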

I would welcome any help with this issue even if you aren’t familiar with C# or Unity. This needs to be fast because it is part of our workflow into Unity. If you can figure out a way to reduce the time it takes to determine whether my model uses the alpha information on my texture, please let me know!

Thank you!

Also, our workflow is 3ds Max > .FBX > Unity.

You could create two materials in Maya that each use the same set of textures, and plug the alpha channel into one of them. Then apply the two materials to the model to define the solid and transparent areas.
At export time, just check the material’s transparency slot for an input.

At the end of the day you will have two draw calls anyway, so I would just map the model with two materials in Maya.

There already are separate materials. On import to Unity everything is reset, so the only way I could carry something I set in Max (we use 3ds Max, but I’m assuming your Maya solution would translate) into Unity would be to give the material a name that I read on import and set the shader based on that inside Unity. Unless you’re familiar with some other way to pass that information across. Oh, we’re also using .FBX; I forgot to mention that.

Is that what you were thinking, rgkovach123?

Thanks for the response.

If tool perf is an issue, RGKovach’s suggestion is the right one: a user knows whether s/he wants to make something transparent or not (remember that mappings change a lot due to edits, so you may end up with fluctuating materials if you do it all based on per-pixel values).

If the tool sort of works but you’d like to speed it up, try implementing a quadtree to parse the texture once. Since you have to check every UV tri against the texture, that’s a good optimization.

You could also generate a UV bounding box for each triangle and check that instead of the individual verts; but that’s probably not much savings since you have to touch all the verts to get the bounds anyway.
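Something like this, as a rough sketch (the helper names are made up; Rect is just the built-in Unity type):

[CODE]
using UnityEngine;

public static class UvBoundsHelper
{
    // UV-space bounding rectangle of one triangle.
    public static Rect TriangleUvBounds(Vector2 a, Vector2 b, Vector2 c)
    {
        float xMin = Mathf.Min(a.x, b.x, c.x);
        float yMin = Mathf.Min(a.y, b.y, c.y);
        float xMax = Mathf.Max(a.x, b.x, c.x);
        float yMax = Mathf.Max(a.y, b.y, c.y);
        return Rect.MinMaxRect(xMin, yMin, xMax, yMax);
    }

    // Cheap early-out: if the triangle's UV rect doesn't overlap the rect that
    // contains all semi-transparent pixels, the triangle can't touch any alpha.
    public static bool MightTouchAlpha(Rect triangleUvBounds, Rect alphaPixelBounds)
    {
        return triangleUvBounds.Overlaps(alphaPixelBounds);
    }
}
[/CODE]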

In general you might want to do this as a user-side tool instead of an import time automatic process – users can decide for themselves if they want to spend their time doing it manually or waiting for the numbers to crunch.

I’m no Unity expert, but this seems like something you would tag in the art package or through an intermediate tool, not at runtime. That’s probably not a helpful answer, I know. You could probably do the analysis in Max. Perhaps you could bake the alpha channel to vertex colors or something similar.

[QUOTE=Theodox;22588]If tool perf is an issue, RGKovach’s suggestion is the right one: a user knows whether s/he wants to make something transparent or not (remember that mappings change a lot due to edits, so you may end up with fluctuating materials if you do it all based on per-pixel values).

If the tool sort of works but you’d like to speed it up, try implementing a quadtree to parse the texture once. Since you have to check every UV tri against the texture, that’s a good optimization.

You could also generate a UV bounding box for each triangle and check that instead of the individual verts; but that’s probably not much savings since you have to touch all the verts to get the bounds anyway.

In general you might want to do this as a user-side tool instead of an import time automatic process – users can decide for themselves if they want to spend their time doing it manually or waiting for the numbers to crunch.[/QUOTE]

So when I implement a quadtree, the objects I’m “detecting collision” with (from the link you provided) are pixels with semi-transparent alpha? You think this is more of an optimization than going through my texture once but sampling in 16-pixel squares?

I’m thinking I will make this optional because in some instances it will speed things up, but for more complicated models it would take a long time regardless of optimization.

Are you familiar with what information you can get into .FBX? What do you mean by baking the alpha channel to vertex colors - what would that accomplish?

The idea behind a quadtree is that you group values hierarchically. Say you have a bunch of white alpha pixels, but they are mostly in one corner of the map. With a conventional search you still pay the same cost for every UV point you’re checking; you have to test it against the nearest N pixels. With a quadtree, on the other hand, you can quickly reject 3/4 of the UV points since they will hit empty quadrants of the quadtree right away - so it should be significantly faster than just checking pixels. That’s why it’s the common accelerator for things like physics collisions. If the data is randomly spread out, the acceleration will be less useful.
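A rough sketch of what that could look like in C# (all names are made up, and a real version would want tuning of the node capacity and maximum depth):

[CODE]
using System.Collections.Generic;
using UnityEngine;

// Quadtree over the UV positions of semi-transparent pixels.  Build it once per
// texture, then query it with each UV triangle's bounding rect; empty quadrants
// are rejected without ever touching individual pixels.
public class AlphaQuadtree
{
    const int MaxPointsPerNode = 32;
    const int MaxDepth = 8;

    Rect bounds;
    int depth;
    List<Vector2> points = new List<Vector2>();
    AlphaQuadtree[] children;     // null while this node is still a leaf

    public AlphaQuadtree(Rect bounds, int depth = 0)
    {
        this.bounds = bounds;
        this.depth = depth;
    }

    public void Insert(Vector2 p)
    {
        if (!bounds.Contains(p))
            return;

        if (children == null)
        {
            points.Add(p);
            if (points.Count > MaxPointsPerNode && depth < MaxDepth)
                Split();
            return;
        }

        foreach (var child in children)
            child.Insert(p);
    }

    // True if any stored point lies inside the query rect.
    public bool AnyPointIn(Rect query)
    {
        if (!bounds.Overlaps(query))
            return false;               // whole quadrant rejected at once

        if (children == null)
        {
            foreach (var p in points)
                if (query.Contains(p))
                    return true;
            return false;
        }

        foreach (var child in children)
            if (child.AnyPointIn(query))
                return true;
        return false;
    }

    void Split()
    {
        float hw = bounds.width * 0.5f;
        float hh = bounds.height * 0.5f;
        children = new[]
        {
            new AlphaQuadtree(new Rect(bounds.xMin,      bounds.yMin,      hw, hh), depth + 1),
            new AlphaQuadtree(new Rect(bounds.xMin + hw, bounds.yMin,      hw, hh), depth + 1),
            new AlphaQuadtree(new Rect(bounds.xMin,      bounds.yMin + hh, hw, hh), depth + 1),
            new AlphaQuadtree(new Rect(bounds.xMin + hw, bounds.yMin + hh, hw, hh), depth + 1)
        };
        foreach (var p in points)
            foreach (var child in children)
                child.Insert(p);
        points.Clear();
    }
}
[/CODE]

Usage would be: build one tree over Rect(0, 0, 1, 1), Insert the UV position of each semi-transparent pixel once per texture, then call AnyPointIn with each triangle’s UV bounding rect and stop at the first hit.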

Sorry, that wasn’t a good example, and I didn’t provide enough context. I was trying to say that you could include the shader decision logic into the mesh in some way.

Will your models always have the same shader/texture applied to them, or will this be dynamically applied at runtime? In other words, might the same mesh be used as the base for multiple different assets at runtime? I might have a window mesh that can be used as both a transparent and an opaque window for example.

If meshes are never dynamically assigned a shader/texture combo at runtime, then you can make your determination in Max. I was using vertex colors as an example (obviously a poor one) of where you could store that decision assuming that you can’t use something simpler such as material name, etc.

@theodox OK, that may be a more accurate and faster way to go through my textures, thank you.

@btribble Shaders will never be dynamically set. I was trying to avoid having to name my materials but I may end up having to. I wish there was a better way to get that material information across to Unity.

Every pipeline suffers from this. You have a model in a 3D package and parts of it are mapped with different materials.
Those materials need to have special properties set in the engine that the 3D package doesn’t know about.
Thus, you end up decorating the model and/or materials with some data that the exporter/importer/engine can pick up on.

I find the simplest way is to create custom attributes. I would advise against naming schemes - code based on naming conventions is a pain in the butt to maintain.

We have a custom exporter that walks a limited sub-set of a standard Maya shading network with additional custom attributes and builds a runtime shader from that data. This approach isn’t perfect, but the artists understand it well enough.

The great news is, there is! Unity has a simple event fired when assets are imported. It’s called “OnPostprocessGameObjectWithUserProperties,” and the documentation page on it can be found here:

http://docs.unity3d.com/Documentation/ScriptReference/AssetPostprocessor.OnPostprocessGameObjectWithUserProperties.html

To take advantage of this, you just need to derive a new class (maybe called “CustomAssetProcessor”) from the AssetPostprocessor class and implement that method.
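A minimal skeleton, assuming the script lives in an Editor folder (the class name is just an example; the hook signature is the one from the docs page above):

[CODE]
using UnityEngine;
using UnityEditor;

// Unity calls this for every imported GameObject that carries FBX user-defined properties.
public class CustomAssetProcessor : AssetPostprocessor
{
    void OnPostprocessGameObjectWithUserProperties(GameObject go, string[] propNames, object[] values)
    {
        for (int i = 0; i < propNames.Length; i++)
            Debug.Log(go.name + ": " + propNames[i] + " = " + values[i]);
    }
}
[/CODE]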

The FBX format stores a variety of user-added data within the file, and it can easily be parsed out with the Unity event hook. (If you export with ASCII formatting, you can open the .fbx in a text editor and find where it is storing this data. That would let you parse the information more formally by reading from the .fbx directly, though I wouldn’t necessarily recommend it for something this simple.)

I’ve worked with lots of material/shader assignment on asset import and found this strategy to be pretty sufficient. The core benefit is that the model doesn’t need to drag an extra “metadata” file around with it. On each mesh, you could store an attribute with a string value, a unique ID, or whatever it takes for Unity to be able to interpret material data. Then, you just need to develop any sort of “translation process” between Max and Unity. For example, if Unity received an object on import with a TextureID value of 0001, it would assign a blank diffuse. If it received a value of 0002, you could put a Transparent/Diffuse on it. You’re in control to make this system do what you need to do at this point and can make it as simple or as complex as needed.

In the case of your current problem, it would be as easy as creating a bool attribute on the object, named something like “hasAlpha”, and checking its value on import into Unity using the method above. Technically, you wouldn’t even need to check its value, only its existence on an object. Think of it as a sort of tag. Any object that has this FBX custom attribute would be assigned a transparent shader. You just need to fill in the above event hook to account for this incoming attribute type and act accordingly. To easily automate this process, you could add an additional step to the Max->FBX export in which Max checks all scene objects for a transparent channel on the material’s texture node and assigns the “hasAlpha” attribute automatically right before exporting.
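A sketch of what that import-side check could look like (the “hasAlpha” name matches the attribute above; the legacy “Transparent/Diffuse” shader and writing to the shared material at this point in the import are assumptions you would adapt to your own setup):

[CODE]
using UnityEngine;
using UnityEditor;

// Place in an Editor folder.  Treats the FBX custom attribute as a tag:
// its presence alone is enough to switch the object to a transparent shader.
public class HasAlphaPostprocessor : AssetPostprocessor
{
    void OnPostprocessGameObjectWithUserProperties(GameObject go, string[] propNames, object[] values)
    {
        for (int i = 0; i < propNames.Length; i++)
        {
            if (propNames[i] != "hasAlpha")
                continue;

            var renderer = go.GetComponent<Renderer>();
            if (renderer != null && renderer.sharedMaterial != null)
            {
                // Depending on your import settings the material may not be final yet,
                // so treat this as a starting point rather than a drop-in solution.
                renderer.sharedMaterial.shader = Shader.Find("Transparent/Diffuse");
            }
        }
    }
}
[/CODE]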

I think the above texture processing techniques are alright, but they would be incredibly difficult to debug and are a needlessly technical solution for something as simple as basic data wrangling through the application pipeline.

Perfect! Thank you!