I've been working a lot on my projects these last few days, and while profiling my own game engine I noticed something that I think everyone here should know: GEN3 support MAY seem to be working properly on your rig when in fact it is NOT!
In my particular case it seemed to work fine, but while profiling my engine I saw what looked like the display driver stalling every now and then. I initially wrote it off as a driver problem, since it did not seem to have much of an effect on performance. Then the new Neverwinter game launched and I decided to give it a spin, since I had played its beta and liked it a lot. Using the 320.00 driver, I noticed that with SLI on the game would actually start to lose frames and after a while become unplayable.
Cryptic's developers have included a very nice feature: instead of a plain fps counter, you can type /fpsgraph 1 and see an actual fps graph. There it became apparent that the driver was hitting a huge number of stalling spikes. It obviously did recover from them, but they caused the framerate to gradually drop. SLI made the problem more intense, and the higher the image quality settings, the faster it manifested.
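If you want to catch this kind of stall in your own code rather than eyeball a graph, one way is to log per-frame times and flag any frame that takes far longer than the recent average. Below is a minimal sketch of that idea; the class name, window size, and spike factor are all my own hypothetical choices, not anything from Cryptic's or NVIDIA's code.

```cpp
#include <cassert>
#include <cstddef>
#include <deque>
#include <numeric>

// Hypothetical helper: flags a frame as a "stall spike" when its frame time
// exceeds `factor` times the average of the last `window` frames.
class StallDetector {
public:
    explicit StallDetector(std::size_t window = 60, double factor = 4.0)
        : window_(window), factor_(factor) {}

    // Feed one frame time in milliseconds; returns true if it looks like a stall.
    bool addFrame(double ms) {
        bool spike = false;
        if (history_.size() >= 8) {  // need some history before judging
            double avg = std::accumulate(history_.begin(), history_.end(), 0.0)
                         / static_cast<double>(history_.size());
            spike = ms > avg * factor_;
        }
        history_.push_back(ms);
        if (history_.size() > window_)
            history_.pop_front();
        return spike;
    }

private:
    std::deque<double> history_;  // most recent frame times, oldest first
    std::size_t window_;
    double factor_;
};
```

Feeding it a steady stream of ~16.7 ms frames (60 fps) and then a single 200 ms frame will flag only the long frame, which is exactly the pattern the fps graph made visible.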
Since a few days had passed, I did not immediately relate the two incidents; it took me about two days of further investigation to tie them together. I tweaked my own code and did some more testing, and apparently this is entirely due to the GEN3 signaling. It may seem that it's working right, but it is not, even when it appears to work. The nvidia driver does a very good job of recovering from it, BUT under certain conditions the issue WILL produce framerate drops, and in theory it may freeze the PC. Yes, even if everything seems to be working properly. This has managed to "hide" on my rig for QUITE a while: it was not apparent in any game or in my own code, and if I had not decided to play Neverwinter, I wouldn't have found out until MUCH later in my engine's development cycle.
I do not have time to pull the AMD cards out of the AMD testbed PC and test them on the X79, but in theory there should be no difference at all.
As of these tests, I do not believe there is ANY case of a video card working properly with GEN3 and SB-E, only cases where the issues are masked. From now on I am going to suggest that everyone leave GEN3 disabled on this platform until they have an Ivy Bridge-E CPU.