Working on Quest 2. Anyone have suggestions for taking LDR source data and converting it to feasible HDR data with as little effort and as much accuracy as possible? (Sorry if ‘Shaders’ is the wrong category for this.)
From my experience this is difficult, involves a lot of trial and error, and the results won’t be as accurate as a real HDR image. You can try manually authoring a multiply-mask texture that reveals the really bright bits, then in the material just multiply the original LDR texture by the mask. Your mask texture has 256 values, and you can assume that a value of 1 keeps the original source and anything brighter goes beyond the range. The math is then original_ldr_tex * hdr_mask * 255, so the brightest value you can achieve goes from the 0–1 range up to the 0–255 range. Not sure if this is enough for your purposes.
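A minimal sketch of the multiply-mask idea above, assuming both textures are sampled as normalized [0, 1] values the way a shader would see them (the function name `expand_texel` is just for illustration):

```python
def expand_texel(ldr: float, mask: float) -> float:
    """Reconstruct an HDR value from an LDR texel and a multiply mask.

    ldr:  source texel, sampled as [0, 1]
    mask: hand-authored multiply mask, sampled as [0, 1]

    A mask sample of 1/255 (the 8-bit value 1) keeps the source
    unchanged; a mask sample of 1.0 applies the maximum gain of 255,
    so the output range extends from [0, 1] up to [0, 255].
    """
    return ldr * mask * 255.0
```

In a real material this would be a one-line multiply in the fragment shader; the Python version just makes the range arithmetic easy to check.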
It looks like there are also some projects that use AI to reconstruct HDR images from LDR. Worth a look.
What’s your input LDR data? If it’s an 8-bit texture, it’s gonna be complicated, but if it’s a render you want to convert to HDR it’s pretty doable.
Can you forward info on the AI-reconstructed HDR images you mentioned?
The LDR data is a small 256×256×6 cube map that we use to generate irradiance and reflection maps at each environment load.
Is it safe to assume that the HDR values are clamped in the LDR data rather than rescaled to fit in LDR? If so, I’m not sure how you will be able to get them back.
FYI, here is an interesting demonstration of what I’m talking about (I should have gone to MIT I guess!): “Single-Image HDR Reconstruction by Learning to Reverse the Camera Pipeline”.