Mipmaps are a pretty old technique, and a full mip chain adds roughly 33% memory overhead compared to a texture with no mip levels.
For example, say you have a standard RGB8 texture of 512x512 pixels; that's 768KB, and it's known as mip level 0.
Then the remaining mip levels typically progress as follows:
level 1 -> 256x256
level 2 -> 128x128
level 3 -> 64x64
level 4 -> 32x32
level 5 -> 16x16
level 6 -> 8x8
level 7 -> 4x4
level 8 -> 2x2
level 9 -> 1x1
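The ~33% figure falls straight out of the chain above: each level is a quarter the size of the one before it, so the extra levels sum to just under a third of level 0. A quick sketch of the arithmetic (the function names here are my own, not from any API):

```python
# Sketch: memory cost of a full mip chain for the 512x512 RGB8 example above.

def mip_chain_sizes(width, height, bytes_per_pixel):
    """Yield the byte size of each mip level, halving down to 1x1."""
    while True:
        yield width * height * bytes_per_pixel
        if width == 1 and height == 1:
            break
        width = max(1, width // 2)
        height = max(1, height // 2)

sizes = list(mip_chain_sizes(512, 512, 3))  # RGB8 = 3 bytes/pixel
base = sizes[0]                             # mip level 0: 786432 bytes (768KB)
total = sum(sizes)                          # whole chain: 1048575 bytes (~1MB)
overhead = total / base - 1                 # ~0.333, i.e. the ~33% figure
```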
These levels are usually auto-generated by the driver or graphics runtime as you upload the texture.
With the DirectDraw Surface (DDS) format you can precompute and ship your own mip levels instead of relying on the driver to generate them (strictly speaking, each level's dimensions still halve, so a chain like 512x512->384x384->256x256 isn't part of the standard layout, but you control the contents of every level).
DDS also supports block compression (DXT/BCn), which makes textures roughly 4x smaller on average, at a small loss of visual quality.
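For a feel of where the "roughly 4x" comes from: DDS itself is just a container, and the savings come from the block-compressed formats it usually carries, which encode each 4x4 pixel block in a fixed number of bytes. A quick sketch of that arithmetic:

```python
# Rough arithmetic behind the "about 4x smaller" claim. The block sizes
# are the fixed sizes of the DXT/BCn formats: 8 bytes per 4x4 block for
# DXT1/BC1, 16 bytes per 4x4 block for DXT5/BC3.

BLOCK_PIXELS = 4 * 4

def bytes_per_pixel(block_bytes):
    return block_bytes / BLOCK_PIXELS

dxt1 = bytes_per_pixel(8)    # DXT1/BC1: 0.5 bytes per pixel
dxt5 = bytes_per_pixel(16)   # DXT5/BC3: 1 byte per pixel

ratio_rgb8_dxt1 = 3 / dxt1   # 24-bit RGB8 vs DXT1  -> 6x smaller
ratio_rgba8_dxt5 = 4 / dxt5  # 32-bit RGBA8 vs DXT5 -> 4x smaller
```

So the exact ratio depends on which format is picked per texture, which is part of why the compression choices matter so much.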
From what I've heard modders say, Bethesda completely fucked up the choice of compression formats for their textures, but I digress.
Mipmapping has obvious benefits:
1. It looks better on surfaces that scale or distort the texture a lot (and it reduces the need for anisotropic filtering), because the texture is prefiltered and therefore much easier and cheaper to sample.
2. When a smaller mip level can be used, it's much friendlier to your GPU's bandwidth, even if the larger mips have to be swapped out to system RAM temporarily when VRAM overflows (the driver can pull them back in as needed).
From what I understand, Bethesda has basically reinvented the wheel here.
In their BA2 archives, they store the textures with mipmaps, laid out so that they can seek directly to the mip level they want and upload that to the GPU as mip level 0.
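I don't know BA2's actual on-disk layout, but if the mips are packed tightly, largest first (the way a DDS mip chain is), then "seek directly to the mip level" is just a prefix sum over the earlier levels. A hypothetical sketch:

```python
# Hypothetical sketch of the "seek straight to a mip level" trick,
# assuming mips packed tightly, largest first, like a DDS mip chain.
# BA2's real layout may differ; the point is the offset is a prefix sum.

def mip_size(width, height, level, bytes_per_pixel):
    w = max(1, width >> level)
    h = max(1, height >> level)
    return w * h * bytes_per_pixel

def mip_offset(width, height, level, bytes_per_pixel):
    """Byte offset of `level` from the start of the mip chain."""
    return sum(mip_size(width, height, l, bytes_per_pixel)
               for l in range(level))

# For a 2048x2048 RGB8 texture, streaming in at 256x256 quality means
# seeking to level 3 and reading from there to the end of the chain,
# then uploading level 3 as the new "mip level 0".
offset = mip_offset(2048, 2048, 3, 3)
```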
Benefits of this approach
- loading times are reduced a lot: if you only need the 256x256 copy, you can upload just that level and its smaller mips instead of the full 2048x2048 texture
Negatives
- Bethesda is overriding what the graphics driver should be doing on PC platforms
- It significantly increases how often loading happens, particularly on PC, since textures have to be reloaded whenever a higher-resolution mip is needed.
I can't speak for DX12, Vulkan, or the console APIs, but if you now want to use the 512x512 surface instead of the 256x256 one, you'll need to destroy the entire texture and reload it from scratch at that mip level; if you then need a lower level, the driver will just use the lower mip automatically.
However, Bethesda's programmers are such utter morons that it wouldn't surprise me if they destroy and recreate the texture in that case too.
I don't know if the console/Vulkan/DX12 APIs let you insert new mip levels into an existing texture.
- Bethesda's programmers are utter fucking morons, I cannot emphasize this enough.
I guarantee you this is both less stable and slower than just letting the graphics driver take care of it.
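That asymmetry (dropping quality is free, raising it forces a full reload) can be modeled with a toy sketch; the class and field names here are entirely my own, not any real API:

```python
# Toy model of the asymmetry described above: dropping quality is free
# because the sampler just uses a smaller resident mip, but raising
# quality means destroying the texture and reuploading from a bigger mip.

class StreamedTexture:
    def __init__(self, top_level):
        self.top_level = top_level   # smallest mip index currently resident
        self.recreations = 0

    def request(self, level):
        if level < self.top_level:
            # Need a bigger mip than was uploaded: full destroy + reload.
            self.recreations += 1
            self.top_level = level
        # else: the driver just samples a smaller mip already resident.

tex = StreamedTexture(top_level=3)   # uploaded starting from the 256x256 level
tex.request(5)                       # lower quality: no reload needed
tex.request(0)                       # full-resolution: forces a recreate
```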
Compare this to their built-in memory allocator: IN ALL OF THEIR GAMES TO DATE, USING HACKS TO REPLACE IT WITH STANDARD C MALLOC MAKES THE GAMES MASSIVELY FASTER AND MORE STABLE[1][2][3]
[1] http://www.nexusmods.com/newvegas/mods/34832/? (New Vegas Stutter Remover; among other things, replaces the allocator with one of several options, for ludicrously increased performance and stability)
[2] http://www.nexusmods.com/skyrim/mods/50305/? (hack for Skyrim to allocate more memory at startup, because Bethesda is so fucking incompetent the game just crashes almost every time it needs to allocate more; wouldn't surprise me if this remains an outstanding issue)
[3] http://www.nexusmods.com/skyrim/mods/72725/? (provides the option to override Skyrim's allocator with standard C malloc, for massively increased stability)