HLSL Vertex Animation Texture - Lerping UV Channels

Heya! I’m a Junior Technical Artist who has a background in Programming, 3D/2D Art, and Game Design (as some context).
I need some help, please!

I need to do an exercise about animating a bird flap without rigging it, so I must use Vertex Animation Textures as a requirement and deliver the HLSL code.
I literally need to replicate this example from Twitter, but without using node-based shader editors like Unity’s Shader Graph or Unreal’s material graph.
Example

The thing is that I just know the basics of HLSL (basically, Freya Holmér’s YouTube videos, which I totally recommend).
So I understand what I have to do, but I don’t really know how to write it down. Especially the alternation between different UV channels in a loop, and how I load the different UV channels into 1 texture using the RGBA channels (using texture samplers???).

I tried to find examples related to this, but I can’t really find anything helpful.
Is there anyone out there who could give me a little push, please? It’s a really important exercise I need to deliver.

Hey there. For starters, I would like to point out that this example does use vertex position offsets, but it doesn’t actually use vertex animation textures. It uses an alternative method that projects “key poses” from the front orthographic view into the 4 available UV sets and has the shader lerp through them. That also means the UVs couldn’t be used for texturing the model, so it was colored only with vertex colors. Do you specifically need to use vertex animation textures? That setup will probably be a little different, because you will need to bake your data into a texture instead of the UV sets.

To get the animation to play in a loop, you need to pipe a time input into some cyclical function like sin(). In the example shown in that post, time is used to sample an RGBA curve asset that is fed in as a shader parameter, but you could probably mimic that by piping time through sin() 4 separate times and shifting the phase of each. Since each wave peaks at a different time, the wave closest to a value of 1 contributes the most to the final summed vertex offset, because each wave is multiplied directly with the offset pulled from its respective UV set. 4 oscillating waves → 4 UV sets.
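As a rough sketch of that idea in plain HLSL (this is my guess at it, not code from the original post: the parameter names like _FlapSpeed, the cbuffer layout, and the assumption that each UV set stores an XY key-pose offset in the front orthographic plane are all mine):

```hlsl
// Minimal sketch: blend 4 key-pose offsets stored in UV sets 0-3 using
// 4 phase-shifted waves. All names and the cbuffer layout are assumptions.

cbuffer PerMaterial
{
    float4x4 ObjectToClip;  // object-to-clip-space matrix (assumed)
    float    _Time;         // time in seconds (assumed)
    float    _FlapSpeed;    // flap cycles per second (assumed)
};

struct VSInput
{
    float4 position : POSITION;
    float2 uv0 : TEXCOORD0;  // key pose 0 offset (XY, front ortho plane)
    float2 uv1 : TEXCOORD1;  // key pose 1 offset
    float2 uv2 : TEXCOORD2;  // key pose 2 offset
    float2 uv3 : TEXCOORD3;  // key pose 3 offset
    float4 color : COLOR;    // vertex color used for shading
};

struct VSOutput
{
    float4 position : SV_Position;
    float4 color : COLOR;
};

VSOutput VSMain(VSInput input)
{
    const float TAU = 6.2831853;
    float t = _Time * _FlapSpeed * TAU;

    // Four waves peaking a quarter cycle apart; saturate() keeps only the
    // positive lobe, so at most two adjacent poses contribute at any moment.
    float w0 = saturate(sin(t));
    float w1 = saturate(sin(t - TAU * 0.25));
    float w2 = saturate(sin(t - TAU * 0.50));
    float w3 = saturate(sin(t - TAU * 0.75));

    // Weighted sum of the key-pose offsets stored in the UV sets.
    float2 offset = input.uv0 * w0 + input.uv1 * w1
                  + input.uv2 * w2 + input.uv3 * w3;

    VSOutput output;
    output.position = mul(ObjectToClip, input.position + float4(offset, 0.0, 0.0));
    output.color = input.color;
    return output;
}
```

With that weighting the waves overlap, so neighbouring poses blend into each other rather than snapping, but the exact curve shape is something you’d have to tune by eye to match the look of the RGBA curve asset in the original example.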

Again, if you specifically need to use vertex animation textures, your approach will need to be a bit different because the data will need to be baked differently. Rather than lerping between 4 “key poses”, you would read your vertex position offset directly from the texture, where the RGB of each pixel stores the exact XYZ offset of a vertex on a particular frame. You would need to sample the texture using something like…

float yCoord = frac(time / animLength) * numberOfFrames

… which would be roughly your current frame… and then the xCoord would be whatever your vert index is.
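Putting that together, a rough vertex-shader sketch might look something like this (the texture/parameter names such as _VATTex, _AnimLength, _NumFrames and _NumVerts, and the assumption that the bake uses one vertex per column and one frame per row in an order matching SV_VertexID, are all mine, not from any particular baker):

```hlsl
// Minimal VAT sampling sketch, plain HLSL. Layout assumptions: x = vertex
// index, y = frame, RGB = object-space XYZ offset for that vertex/frame.

Texture2D    _VATTex;
SamplerState _VATSampler;   // point filtering so neighbouring frames/verts don't blend

cbuffer PerMaterial
{
    float4x4 ObjectToClip;  // object-to-clip-space matrix (assumed)
    float    _Time;         // time in seconds (assumed)
    float    _AnimLength;   // seconds for one loop of the baked animation
    float    _NumFrames;    // rows baked into the VAT
    float    _NumVerts;     // columns baked into the VAT
};

struct VSInput
{
    float4 position : POSITION;
    uint   vertexID : SV_VertexID;
};

float4 VSMain(VSInput input) : SV_Position
{
    // Wrapped frame index, as in the yCoord line above.
    float frame = frac(_Time / _AnimLength) * _NumFrames;

    // Normalise to 0-1 UVs; the +0.5 samples the centre of each texel.
    float2 uv;
    uv.x = (input.vertexID + 0.5) / _NumVerts;
    uv.y = (floor(frame) + 0.5) / _NumFrames;

    // SampleLevel (not Sample) because we're in the vertex stage.
    float3 offset = _VATTex.SampleLevel(_VATSampler, uv, 0).rgb;

    return mul(ObjectToClip, input.position + float4(offset, 0.0));
}
```

If the motion looks steppy, you could also sample the next frame’s row and lerp between the two by frac(frame), but whether that is worth it depends on how densely the frames were baked.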

I’ve not worked with VATs very much before so this might not be exactly correct, but the general idea should be similar.

This blog post looks like a pretty good explanation of how VATs work and how texture sampling settings can affect them.