Unfortunately most of the guy's benchmarks are no good.
The loading-time measurements, however, are fine.
Basically the issues boil down to:
1. He's mostly measuring GPU performance, which Denuvo's overhead (being CPU-side) barely affects.
2. He's not isolating CPU overhead, which is the one variable that actually matters here.
3. Because game-logic work is a largely fixed per-frame cost, a game's CPU requirements do not scale linearly with framerate. An extra 5% of CPU headroom can be enough to double your framerate (e.g. by crossing a vsync threshold), while losing 10% can drop a smooth 60 fps to something that can't hold 30. See the sketch after this list.
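To make that nonlinearity concrete, here's a toy model (my own sketch, not anything from his benchmarks) of how vsync turns a small difference in per-frame CPU time into a doubled or halved framerate. The `displayed_fps` helper and the millisecond figures are illustrative assumptions, not measured data:

```python
import math

def displayed_fps(cpu_frame_ms: float, refresh_hz: float = 60.0) -> float:
    """Effective framerate under vsync: a frame that misses the refresh
    window waits for the next one, so frame time is rounded UP to a
    whole number of refresh intervals."""
    interval_ms = 1000.0 / refresh_hz
    intervals = max(1, math.ceil(cpu_frame_ms / interval_ms))
    return refresh_hz / intervals

# A ~5% swing in per-frame CPU cost around the 16.67 ms budget:
for ms in (16.3, 17.1):
    print(f"{ms:.1f} ms/frame -> {displayed_fps(ms):.0f} fps")
# 16.3 ms/frame -> 60 fps
# 17.1 ms/frame -> 30 fps
```

A ~5% increase in CPU time per frame halves the displayed framerate, because the frame now misses every other refresh. That's the mechanism behind the wild swings.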
While my example is extreme, this is why his results are all over the place, and why we still don't have a firm "Denuvo's overhead is X" metric.
I have a few other posts in this thread that go into more detail.
EDIT: There was a Gears game that he accidentally did correctly. That one is quite valuable.