Page 4 of 7 FirstFirst ... 2 3 4 5 6 ... LastLast
Results 31 to 40 of 62
  1. #31
    ROG Guru: White Belt Array DangerClose's Avatar
    Join Date
    Dec 2011
    Reputation
    10
    Posts
    107

    For what it's worth, the R4E and 7970s ARE working in 3.0 mode. Anandtech proved this. Remember how many years 802.11n went before it was "official" ??? The R4E and my 3 7970s are working in 3.0 mode, proved by GPUz testing as well. Just because it's not "official" does not mean it's not supported.
    *Asus Rampage IV Extreme - 3930K - 32GB Corsair Vengeance 1600 - 2x Corsair Force GT 120GB SSDs - WD VelociRaptor 600GB Steam drive - 3x 7970 Tri-Fire !! - Corsair AX1200 PSU - Thermaltake Level 10 GT case - H100 Cooler - Win7 x64 Ultimate

  2. #32
    Banned Array
    Join Date
    Feb 2012
    Reputation
    10
    Posts
    80

    Quote Originally Posted by DangerClose View Post
    For what it's worth, the R4E and 7970s ARE working in 3.0 mode. Anandtech proved this. Remember how many years 802.11n went before it was "official" ??? The R4E and my 3 7970s are working in 3.0 mode, proved by GPUz testing as well. Just because it's not "official" does not mean it's not supported.
    Thank you!

The point I was trying to get across, IMO, is that unlike ATI, it is ****txy for Nvidia to be in cahoots with Intel by purposely crippling PCIe 3.0 mode using modified GTX 680 drivers. This was done to boost Ivy Bridge-E sales when it gets released.

    Kudos to ATI for smoking the GTX 680!

  3. #33
    Banned Array
    Join Date
    Feb 2012
    Reputation
    10
    Posts
    80

    Quote Originally Posted by mdzcpa View Post
    Awesome! Then you really will have PCIe 3.0 support.!! Congrats!!
It's no big deal, as there are no performance benefits to using PCIe 3.0 with today's current GPU offerings.

  4. #34
    Banned Array
    Join Date
    Feb 2012
    Reputation
    10
    Posts
    80

    Quote Originally Posted by mdzcpa View Post
    Well, based on your links, I certainly stand corrected when it comes to Intel's X79 motherboard advertising.

But I also get the feeling that Intel had already invested their money in advertising materials developed for their Intel X79 motherboards and simply let that fly, even though they retracted PCIe 3.0 support weeks before the SB-E/X79 launch.

    As for the server chip and server chipset...whatever. Even though the Xeon processor does show PCIe 3.0, the desktop versions do not:
    http://ark.intel.com/products/63696/...ache-3_30-GHz)
    http://ark.intel.com/products/63697/...ache-3_20-GHz)
    http://ark.intel.com/products/63698/...ache-3_60-GHz)

    3960X
    3930K
    3820

    ....all show PCIe revision as 2.0 on Intel's own website.

Sorry, but you haven't convinced anyone that these chips officially support Gen 3.

    If anything, it appears the "wool pulling" might be the other way around. I'm sure Intel would prefer everyone to believe that SB-E is PCIe 3.0 without them actually having to commit to saying so.

By the way, can you possibly post anything without flames and insults? Just curious.

    Mike,
Try Intel's socket 2011 Xeon advertising. The X79 is crippled junk; there is nothing to be excited about. Intel had the motherboard manufacturers cripple the X79's four additional SATA2 ports, and Intel had Nvidia cripple the SB-E PCIe 3.0 feature. It was all done for a reason, and the reason was to set the SB-E/X79 further apart from its E5/e302 Enterprise counterparts in order to justify the higher costs.

Anyways... You and "everyone" standing behind your computer screens who is not convinced have the right to believe and do anything you want in order to satisfy yourselves.

And I am sorry that you take offence at my style of posting. I, for one, have learned never to judge a poster by the style of his posts, as a post lacks feeling and emotion, and most importantly English may not be their native tongue.

  5. #35
    ROG Guru: Blue Belt Array mdzcpa PC Specs
    MotherboardMaximus Hero XI
    Processor9900k @ 5.1Ghz All Core 1.315v AVX -2
    Memory (part number)Gskill GTZ 4133 C17 32GB (4x8GB)
    Graphics Card #1ROG Strix 2080 TI OC
    Sound CardOn Board
    MonitorROG PG348Q
    Storage #1Samsung 970 Pro M.2 512GB for OS & APPS
    Storage #2Samsung 970 Pro M.2 1TB for GAMES
    CPU CoolerROG Ryujin 360
    CasePhanteks Evolv X
    Power SupplySeasonic Prime 1000W Titanium
    Keyboard Corsair K70 LUX RGB
    Mouse Corsair M65 RGB
    Headset Corsair VOID Pro
    Mouse Pad Corsair MM300 Wide Desk Mat
    OS WIN 10 Pro
    Network RouterLinksys VELOP
    Accessory #1 ROG 751JY Laptop for Mobile Gaming
    Accessory #2 Koolance EXC-800 Chiller
    Accessory #3 25+ years of overclocking

    Join Date
    Nov 2011
    Reputation
    80
    Posts
    967

    Quote Originally Posted by Take it Upstairs View Post
    Thank you!

    The point I was trying to get across, IMO, is that unlike ATI, it is ****txy for Nvidia to be in cahoots with Intel by purposely crippling PCIe 3.0 mode using modified GTX 680 drivers. This was done to boost Ivy Bridge-E sales when it gets released.

    Kudos to ATI for smoking the GTX 680!

If that is what you were trying to get across, you should have just said so. Look at my specs!! I've been happily running 7970s in CrossFire on the RIVE at Gen 3 speeds since the day they came out. Since BIOS 1005 the RIVE runs these cards great at Gen 3.

    The point I was trying to make is that Nvidia is hiding behind the fact that SB-E is not "officially" 3.0....which unfortunately is true...officially speaking. If you read my first post in this thread you'd see this was my original point.


What I meant by "anyone" is anyone who reads this thread. It is clear as day that Intel does not hold SB-E out to be PCIe 3.0. Do I think that is BS? Yes. Is X79 in general gimped from what it was supposed to be? Yes, in several ways. So in the end we're not too far apart on this topic. The only thing my post did, the one you originally took exception to, was acknowledge the fact that Intel withdrew official PCIe 3.0 support just before launch. And that opens the door for the kind of crap Nvidia just pulled.

As far as being "offended"...no worries. I'm a battle-hardened forum warrior and have been around the block a few times. But let's be honest here, the "Mikey likes it" shot wasn't actually "style" now, was it? But I'm not losing any sleep. I just hate to see guys get all personal and worked up over this kind of stuff, as it diminishes the community as a whole.

So, we can agree to disagree on the details of the issue and move on. It's all good.

  6. #36
    ROG Guru: Blue Belt Array mdzcpa

    Join Date
    Nov 2011
    Reputation
    80
    Posts
    967

    Quote Originally Posted by mdzcpa View Post
    This must be an Nvidia issue. My AMD 7970s run PCIe 3.0 fine on the RIVE and the P9X79 Deluxe.
    My first post in this thread. LOL.

    Carry on............

  7. #37
    Banned Array
    Join Date
    Feb 2012
    Reputation
    10
    Posts
    80

    Quote Originally Posted by mdzcpa View Post
    My first post in this thread. LOL.

    Carry on............
Hmm... So Mike... How did you validate that your 7970's are running fully to the PCIe 3.0 specification? GPU-Z?
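For what it's worth, GPU-Z isn't the only option here. On Linux, the negotiated link speed and width can be read straight from sysfs; here is a minimal sketch (the device address format and sysfs paths assume a reasonably recent kernel, and `read_link` takes a placeholder PCI address, not a real one from this thread):

```python
def pcie_gen(link_speed: str) -> int:
    """Map a sysfs current_link_speed string (e.g. '8.0 GT/s PCIe') to a PCIe generation."""
    rate = float(link_speed.split()[0])
    return {2.5: 1, 5.0: 2, 8.0: 3}[rate]

def read_link(device: str) -> tuple:
    """Return (generation, lane width) for a PCI device address like '0000:01:00.0'."""
    base = "/sys/bus/pci/devices/" + device
    with open(base + "/current_link_speed") as f:
        gen = pcie_gen(f.read())
    with open(base + "/current_link_width") as f:
        width = int(f.read())
    return gen, width
```

One caveat: cards drop the link to a lower speed at idle to save power, so the check only means something while the GPU is under 3D load, where a Gen 3 x16 card should report (3, 16).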

  8. #38
    Banned Array
    Join Date
    Feb 2012
    Reputation
    10
    Posts
    80

    You see, what makes me nuts is reading stuff like this, from the OP on this thread here:
    http://www.techpowerup.com/forums/sh...d.php?t=162942

    It reads:
    NVIDIA decided against implementing Gen 3.0 support for the new GPU on X79/SNB-E systems, at the very last moment.
    but yet
    NVIDIA could be working to fix the issue.

WTF? Nvidia created the issue, and NOW they are working to FIX their own issue!
I hope they don't lose too much sleep trying to figure this one out!

Then you have Nvidia responding to the OP with a ridiculous statement that, for all of the C5's that have already been so-called validated for PCIe 3.0, all of a sudden, out of the blue, the SB-E becomes trivial for them to figure out!

Why was this not a problem for ATI, who developed 3.0 drivers that performed well? And what platform did ATI use to help validate the 7970? .............The SB-E/X79.

  9. #39
    Super Moderator - Coderat Array Nodens PC Specs
    Nodens PC Specs
    Laptop (Model)ASUS GL702VW-GC004T
    MotherboardRampage V Extreme Edition 10
    ProcessorIntel 3930K C2
    Memory (part number)CMZ16GX3M4X2133C11
    Graphics Card #1ASUS STRIX-GTX970-DC2OC-4GD5
    MonitorASUS VG278H
    Storage #13x OCZ Vertex 3 240 (RAID5 - LSI 9361-8i)
    Storage #23x WD2003FYYS (RAID5 - LSI 9361-8i)
    CPU CoolerCorsair H100i + 4x Noctua NF-P12
    CaseCoolermaster HAF-X
    Power SupplySeasonic X-1250
    Keyboard Razer Anansi
    Mouse Logitech G502
    OS Windows 10 Pro x64
    Network RouterLinksys E4200v2 (running TomatoUSB/Toastman firmware)
    Nodens's Avatar
    Join Date
    Mar 2012
    Reputation
    266
    Posts
    4,389

I think you are a bit hasty with the blaming and marketing conspiracy theory here. nVIDIA has committed to eventually enabling support for the X79/SB-E combo, which means they are not locking out Gen 3 speeds on X79/SB-E but rather delaying them. I am also willing to bet that support will be enabled before Ivy Bridge becomes widely available. Besides, they have absolutely no reason to create so much bad publicity for their company just to pitch in on an Intel marketing scheme.

Occam's razor would dictate that the issue is actually observed incompatibilities, or possible theoretical incompatibilities that have to be tested and validated. The fact that the first 3xx.xx beta drivers had it enabled shows that something was probably observed. If it were all some sinister Intel scheme, why would the first drivers have had it enabled in the first place? It does not make any sense. The correct solution is usually the one that makes the fewest assumptions.

    Being a registered developer with nVIDIA (and ATI/AMD but that's irrelevant) for quite a few years, I have had the chance to speak with several people involved in the driver process while troubleshooting game development related bugs. I can tell you that nVIDIA's process is quite streamlined and thorough. If something got blocked from the driver it's 99% because of an actual issue.

  10. #40
    ROG Junior Member Array
    Join Date
    Mar 2012
    Reputation
    10
    Posts
    4

I just received my 2x GTX 680 cards today. Since I was very much peeved by this driver issue, I ran some benchmarks with PCIe Gen 3.0 and Gen 2.0 to see if there was any difference; there was not.
My score in Heaven Benchmark was 2172 with DX11, Extreme Tessellation, Shaders High, 16x Anisotropy, 8x AA at 1920x1080 resolution.
With 2.0 at the same settings I got 2080, which falls within the margin of error.

Scores on 3DMark Vantage were P44629 on 3.0 vs P45045 on 2.0
Scores on 3DMark 11 were P15363 on 3.0 vs P15718 on 2.0

    If there is any interest, I would gladly make a new thread with proper screenshots of GPU-Z and the scores.
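Putting rough numbers on those runs (the scores are copied from this post; the arithmetic is just the relative difference between each pair, nothing more):

```python
# (Gen 3.0 score, Gen 2.0 score) pairs from the benchmark runs above.
scores = {
    "Heaven": (2172, 2080),
    "3DMark Vantage": (44629, 45045),
    "3DMark 11": (15363, 15718),
}

for name, (gen3, gen2) in scores.items():
    delta = (gen3 - gen2) / gen2 * 100
    print(f"{name}: {delta:+.1f}% for Gen 3.0 vs Gen 2.0")
```

The deltas land on both sides of zero (roughly +4.4%, -0.9%, -2.3%), which looks like run-to-run noise rather than a real bandwidth effect.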

Edit: I would note that the R4E runs these cards at x16 while in dual-SLI mode. If you ran these cards at PCIe 2.0 x8, I'm fairly certain you'd hit a bandwidth limitation. So those like me who will only run a single card or dual-SLI will be fine; those wanting to run tri-SLI or quad-SLI would probably have trouble.
    Last edited by Tsteine; 03-26-2012 at 05:20 PM.
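The back-of-the-envelope numbers behind that bandwidth point: PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, while 3.0 signals at 8 GT/s with the leaner 128b/130b encoding. A quick sketch of the theoretical per-direction throughput (protocol packet overhead ignored):

```python
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Theoretical one-direction PCIe bandwidth in GB/s, ignoring packet overhead."""
    # (transfer rate in GT/s, payload fraction left after line encoding)
    rates = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    gt_per_s, encoding = rates[gen]
    # GT/s times payload fraction gives Gb/s per lane; divide by 8 bits per byte.
    return gt_per_s * encoding * lanes / 8

print(pcie_bandwidth_gbs(2, 8))   # Gen 2.0 x8  -> 4.0 GB/s
print(pcie_bandwidth_gbs(3, 16))  # Gen 3.0 x16 -> ~15.75 GB/s
```

So a Gen 2.0 x8 slot has roughly a quarter of the bandwidth of a Gen 3.0 x16 slot, which is why lane-starved tri/quad-SLI configurations are where any difference would show up first.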

