tranceland: There is no single GPU that can run The Witcher 2 maxed out with ubersampling on at a good framerate (30 fps or more). Not even the GTX 680 can do it.
You need a GTX 470/560 Ti in SLI at minimum, and I can definitely tell the difference on my single GTX 470.
Everything looks much smoother, but I'm wondering if AA couldn't have been implemented in a more traditional way, with incremental steps like 2x, 4x, 8x, 16x, etc.
It seems to me that ubersampling applies far more AA than necessary, which is what kills the GPU.
I get a solid 30 fps with my GTX 680 at stock clocks. There are probably some parts of the game where it can drop, I presume, but most of the time it's 30.
I suppose "ubsersampling AA" use the Full Scene Anti-Aliasing which calculate a higher resolution of rendering & then adapt this high resolution rendering at your monitor's resolution.
That's why aliasing disappear & everything is sharp & soft.
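If that's what it's doing, the idea is just brute-force supersampling. Here's a minimal sketch in Python of the concept (not CD Projekt RED's actual implementation; render_scene is a hypothetical stand-in for whatever draws the raw frame):

```python
import numpy as np

def supersample(render_scene, width, height, scale=2):
    """Render at scale x the target size, then box-filter down to it."""
    # Draw the frame at the higher internal resolution,
    # e.g. 3840x2160 for a 1920x1080 monitor at scale=2.
    hi = render_scene(width * scale, height * scale)  # shape (h*s, w*s, 3)

    # Average each scale x scale block of samples into one output pixel;
    # this averaging is what smooths out the jagged edges.
    hi = hi.reshape(height, scale, width, scale, 3)
    return hi.mean(axis=(1, 3))

# Dummy "renderer" so the sketch is runnable: random noise as the frame.
frame = supersample(lambda w, h: np.random.rand(h, w, 3), 1920, 1080)
print(frame.shape)  # (1080, 1920, 3)
```

Note that scale=2 means rendering 4x the pixels (2x per axis), which would explain why it's so heavy on the GPU.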
Too bad we can't specify the MSAA level (2x, 4x, 8x, etc.) & the FSAA scale from 100% to 200% (where 100% is your monitor's native resolution).
ArmA 2 has all these options & it's pretty cool to be able to match the rendering load to your GPU's capabilities.
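For reference, here's roughly what such a slider does under the hood (a hypothetical helper for illustration, not ArmA 2's actual code):

```python
def internal_resolution(width, height, fsaa_pct):
    """Map an FSAA percentage (100-200%) to the internal render size."""
    s = fsaa_pct / 100.0  # 100% = native, 200% = double per axis
    return round(width * s), round(height * s)

# A 150% setting on a 1920x1080 monitor renders internally at 2880x1620,
# which is 1.5**2 = 2.25x the pixel work of native resolution.
print(internal_resolution(1920, 1080, 150))  # (2880, 1620)
```

That squared cost is why even small slider increases hit the framerate hard.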
But with the next generation of DX11 games (The Witcher 3), all of this is going to disappear.