Our own Micoselva has a 680, and we were talking about how my card, which was supposed to trade blows with his 680, is doing 10 FPS better: with everything on ultra, in the same place I had 35 FPS and Mico had 25 FPS.
It was like that initially, but
Crispy is actually right here. After the patches and driver updates, I can run the game at 30+ on Ultra (+ post-processing on), except:
- no motion blur (I hate that shit)
- no depth of field (as above)
- no hairworks
- foliage viewing distance set to High
With low-medium settings I can get a steady 60+ FPS now, but I've gotten used to how the game looks on high/ultra, so I'm playing at peasant cinematic 30 FPS (capped with RivaTuner Statistics Server).
Yeah, that in-game 30/60 FPS lock is shite and you can see constant stuttering. I also tried setting a 30 FPS cap via RTSS with everything on ultra, but the overall difference between ultra and high is very small. Hell, even the difference between ultra and low isn't that big (probably because their lighting engine does most of the work, is exactly the same on low and ultra, and can't be scaled back due to the use of PBR).
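Side note on why an external cap like RTSS tends to pace frames more evenly than a naive in-game lock: a good limiter sleeps until the next fixed frame deadline rather than sleeping a fixed amount after each frame, so jitter in one frame gets absorbed instead of accumulating. A minimal sketch of that deadline-based idea in Python (`render_frame` is just a stand-in for the game's rendering work; the numbers are illustrative, not anything RTSS actually does internally):

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def render_frame():
    # stand-in for the game's actual rendering work
    time.sleep(0.010)  # pretend a frame takes 10 ms to draw

next_deadline = time.perf_counter()
frame_times = []
for _ in range(10):
    start = time.perf_counter()
    render_frame()
    # Sleep until the next 33.3 ms boundary instead of sleeping a
    # fixed amount. Because the deadline advances by a constant step,
    # a frame that runs long is compensated by a shorter sleep on the
    # next frame, keeping pacing even.
    next_deadline += FRAME_TIME
    remaining = next_deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)
    frame_times.append(time.perf_counter() - start)

print(sum(frame_times) / len(frame_times))  # mean frame time, close to 1/30 s
```

The in-game lock stuttering described above is what you get when the cap is enforced less carefully (e.g. tied to vsync intervals), so frames land at uneven multiples of the refresh period instead of a steady cadence.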
I am now mostly locked at 60 FPS with drops to 50, everything on high with a few things on ultra, all post-processing on (aside from sharpening), and no HairWorks (AMD also released new drivers which improved performance a bit). When I want an absolutely locked framerate I switch to the glorious PAL invention, aka 50 Hz, and then I get no drops at all on high with a few options on ultra. The biggest hits for me are foliage distance and shadows; those two can halve the framerate on ultra.
Which leads us back to my previous talk with you. Your GPU should be stronger than mine: the 680 was released later and was mostly 10-15% faster than my GPU.
You should be able to lock the game to 60 with very rare drops to 50 with everything on high...
And TW3 isn't the only game with these problems. There are multiple games where the 780 and 780 Ti are out-competed even by the 290, let alone the 290X.
Here is the deal from the latest TechPowerUp review of the R9 390, published just a week ago with the latest drivers for both AMD and Nvidia:
http://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/16.html
All of those charts are just 1920x1080, which is Nvidia's home turf; the higher you push the resolution, the better AMD GPUs perform (probably thanks to ROPs and bandwidth).
In the charts above we can see clearly that the 280X, which is a renamed 7970, often trades blows with the 780 and sometimes beats it. That shouldn't even be a thing, considering the 780 is a generation newer and almost 30% faster on paper than the 280X.
More interesting is the 2xx line, which completely destroys the Kepler line. The 290X was meant to compete with the 780 and the 290 with the 770; Nvidia then released a slightly cut-down Titan and named it the 780 Ti, to which AMD didn't have a response.
Like I said earlier, all of those charts are POST-"Kepler" update. That update did give back some FPS, but it didn't fix the problem of those cards underperforming.
Which goes back to:
- Either AMD has better hardware, which is hardly the case considering the R&D budget comparison between Nvidia and AMD (Nvidia has something like 3-4 times the R&D budget)
- Or Nvidia simply doesn't give a fuck about older cards (thus no proper driver updates), since they put out newer cards every year and hold almost 70% market share, so either way people will buy their next GPUs
So far every single thing you've said has been speculation, hearsay, and straight-up opinion. There's no hard evidence for the claims you're making, or the claims you claim other people are making.
I hope the above is satisfactory for you.
edit:
BTW, people claim patch 1.07 improved performance on PC. I haven't tested it personally yet.