I'm working on an expansion for Himeko Sutori. I've been running into a lot of problems with ancient UE3 and the ancient Scaleform integration that lets the engine display ancient Flash files. The most recent problem I ran into was a difference in how UE3 and Flash assume that textures should be displayed.
I'm setting up a character sheet in Flash that will show your in-game character in the UI. Sending images from the game engine to Flash isn't that hard. But it messes up the color. On the far right is what the image is supposed to look like. On the far left is what it actually looks like when you first send the image to Flash. UE3 assumes that your textures by default are sRGB* and therefore have most of the lighting detail in the darker areas. The game converts your gamma-space sRGB image to linear space before applying lighting, and it looks right under in-game lighting. You can tell UE3 that a texture is not sRGB, but then it'll look too bright when displayed in-game, with lots of blown-out highlights and lots of wasted color depth in the highlights that should have been used in the midrange.
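For reference, the sRGB-to-linear conversion UE3 is doing under the hood follows the standard sRGB transfer functions. A rough ActionScript sketch of the math (UE3 actually does this on the GPU; these function names are just illustrative):

```actionscript
// Decode one gamma-encoded sRGB channel value (0.0-1.0) to linear light.
function srgbToLinear(s:Number):Number {
    return (s <= 0.04045) ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

// Encode a linear channel value back into sRGB gamma space.
function linearToSrgb(l:Number):Number {
    return (l <= 0.0031308) ? l * 12.92 : 1.055 * Math.pow(l, 1.0 / 2.4) - 0.055;
}
```

The curve spends most of its precision on the darker values, which is why flagging a texture as non-sRGB wastes color depth in the highlights.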
*(UE3 also tries to reduce your texture file size by creating localized palettes for groups of pixels, which works well for most use cases, but not so well for consistent color in pixel art.)
When you send an sRGB texture to Flash, Flash doesn't know that it has a gamma curve applied and needs to be converted to linear space before display, so the texture shows up too dark and oversaturated. To fix this, I could make copies of every single texture in the game: one saved as sRGB for display in the game world, and a copy saved as non-sRGB for display in the UI. I'm not going to do that.
Instead I came up with my own gamma correction in Flash ActionScript. I had to figure out how to access the bitmap's underlying data, and separate out the 32-bit integer into a bunch of 8-bit channels for red, green, blue, and alpha, and then apply a lookup table to adjust the color, and then bit-shift the results and put them back together into a single 32-bit integer. I thought the result looked pretty good. But then life slapped me in the face and I found out that although this is possible in the default Flash player, it's unsupported in the Scaleform integration. The method I use for tweaking bitmap color just doesn't exist in Scaleform.
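In rough outline, that per-pixel adjustment looked something like this. This is a simplified sketch, not my exact code: the gamma exponent is a placeholder you'd tune by eye, and `buildGammaTable`/`applyGamma` are illustrative names. (And again, this runs in the standard Flash player but not under Scaleform, where per-pixel `BitmapData` access isn't available.)

```actionscript
import flash.display.BitmapData;

// Build a 256-entry lookup table for one gamma exponent.
function buildGammaTable(gamma:Number):Array {
    var table:Array = new Array(256);
    for (var i:Number = 0; i < 256; i++) {
        table[i] = Math.round(255 * Math.pow(i / 255, gamma));
    }
    return table;
}

// Run every pixel of a BitmapData through the table, in place.
function applyGamma(bmp:BitmapData, table:Array):Void {
    for (var y:Number = 0; y < bmp.height; y++) {
        for (var x:Number = 0; x < bmp.width; x++) {
            var argb:Number = bmp.getPixel32(x, y);
            var a:Number = (argb >> 24) & 0xFF;        // alpha passes through untouched
            var r:Number = table[(argb >> 16) & 0xFF]; // remap each 8-bit channel
            var g:Number = table[(argb >> 8) & 0xFF];
            var b:Number = table[argb & 0xFF];
            // Shift the channels back into one 32-bit ARGB integer.
            bmp.setPixel32(x, y, (a << 24) | (r << 16) | (g << 8) | b);
        }
    }
}
```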
So I thought I was screwed. Then someone over at the Epic forums told me that we all just have to do the best we can with the tools we have. Scaleform will let us change an image's color by adding and multiplying. It's not as fine-tuned as a lookup table, but if it's the only option I have, I figured I'd give it a try. I set up some simple contrast reduction by multiplying the color by 0.7 and adding an offset. I just have to start with a darker background, and I can live with the result.
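In Flash terms that multiply-and-add boils down to a `ColorTransform`, which Scaleform does support. A minimal sketch: the 0.7 multiplier is the value I mentioned above, but the offset value and the `portraitClip` name are placeholders for whatever clip holds the character image.

```actionscript
import flash.geom.ColorTransform;

// Reduce contrast: multiply each color channel by 0.7, then add a flat
// offset to lift the result back toward the midtones. Alpha is left
// alone (multiplier 1, offset 0). The offset of 38 is illustrative.
var ct:ColorTransform = new ColorTransform(0.7, 0.7, 0.7, 1, 38, 38, 38, 0);
portraitClip.transform.colorTransform = ct;
```

It's a blunt instrument next to a real lookup table, since every channel gets the same linear ramp, but it's the knob Scaleform exposes.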