Page 2 of 2 (Results 11 to 19 of 19)
  1. #11
    ROG Guru: Black Belt
    Korth PC Specs
    Motherboard: ASUS X99 R5E (BIOS 2101/1902)
    Processor: Haswell-EP E5-1680-3 SR20H/R2 (4.4GHz)
    Memory (part number): Vengeance LPX 4x8GB SS DDR4-3000 (CMK32GX4M4C3000C15)
    Graphics Card #1: NVIDIA Quadro GP100GL/16GB, 16xPCIe3, NVLink1 (SLI-HB)
    Graphics Card #2: NVIDIA Quadro GP100GL/16GB, 16xPCIe3, NVLink1 (SLI-HB)
    Sound Card: JDS Labs O2+ODAC (RevB), USB2 UAC1
    Monitor: ASUS PG278Q
    Storage #1: Samsung 850 PRO 512GB SSDs, 4xSATA3 RAID0
    Storage #2: Comay BladeDrive E28 3200GB SSD, 8xPCIe2
    CPU Cooler: Raijintek NEMESIS/TISIS, AS5, 2xNH-A14
    Case: Obsidian 750D (original), 6xNH-A14
    Power Supply: Zalman/FSP ZM1250 Platinum
    Headset: Pilot P51 PTT *modded*
    OS: Arch, Gentoo, Win7x64, Win10x64
    Network Router: Actiontec T3200M VDSL2 Gateway
    Accessory #1: TP-Link AC1900 Archer T9E, 1xPCIe
    Accessory #2: ASUS/Infineon SLB9635 TPM (TT1.2/FW3.19)
    Accessory #3: ASUS OC Panel I (FW0501)
    Join Date: Mar 2015
    Reputation: 152
    Posts: 2,719

    The EZ-Plug just feeds extra current into the PCIe slots so the GPUs stay more stable when the main VRMs get bogged down at peak load. It's a bit like having a bank of decoupling capacitors in circuit, but with better results, less PCB space, a lower component count, and lower cost.

    There's no valid reason not to use the EZ-Plug; it takes strain (and heat) off the main VRMs if nothing else. I assume all your motherboard ATX power inputs and GPU PCIe (12V) power inputs are connected. Titan X cards pull a lot of watts, and no matter how the power is routed through the mobo, it's all still provided by the PSU. And you're correct that the pins and traces have maximum safe power-handling limits, which is just another good reason to put more of them onboard (like the EZ-Plug does).
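    To put rough numbers on the point about connector power limits, here's a back-of-the-envelope sketch using the nominal PCI-SIG figures (75W per x16 slot, 75W per 6-pin cable, 150W per 8-pin cable); the helper name is just for illustration:

```python
# Nominal per-connector power limits (PCI-SIG spec values).
PCIE_SLOT_W = 75      # max a card may draw through the slot itself
SIX_PIN_W = 75        # 6-pin PCIe auxiliary power cable
EIGHT_PIN_W = 150     # 8-pin PCIe auxiliary power cable

def card_budget(six_pins, eight_pins):
    """Total watts a card can draw within spec: slot plus aux cables."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A Titan X has one 6-pin and one 8-pin connector.
titan_x = card_budget(six_pins=1, eight_pins=1)   # 300 W per card
two_way_sli = 2 * titan_x                         # 600 W for the pair

# Without an extra input like the EZ-Plug, both cards' slot power
# (2 x 75 W = 150 W) has to come through the 24-pin ATX connector;
# the EZ-Plug gives that slot current a separate path.
slot_draw = 2 * PCIE_SLOT_W
print(titan_x, two_way_sli, slot_draw)
```

    The exact VRM loading depends on the board, but the 150W of slot power is the piece the EZ-Plug is there to relieve.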

    And let's face it: people running 2-way SLI Titan X configurations aren't likely to buy an X99-PRO; they're going to go for the high-end mobos with superior power handling and overclocking features. The X99-PRO launched before the Titan X did, it runs 2-way SLI GTX 980 just fine, and ASUS had no reason to engineer it for insanely heavy Titan X power loads.

    The OP has 32GB installed; twin Titan X cards will run on 32GB but would generally do better with 48GB or more.

  2. #12
    ROG Guru: Orange Belt
    Join Date: Aug 2015
    Reputation: 35
    Posts: 442

    Yes, and ignore my comment about small-gauge wires; I wasn't looking at it and remembered it as a smaller plug.

    It wasn't that I was questioning it; in fact, I think it's rather critical to the stability of this board under load. I'm wondering why the R5E needs so much more than the X99-PRO. Or perhaps wondering more whether my X99-PRO is going to flake out on me at some point for lack of them?

  3. #13
    ROG Guru: Green Belt
    Qwinn PC Specs
    Motherboard: Rampage V Extreme - BIOS 3301
    Processor: 5930K @ 4.4 GHz Adaptive 1.27v / 4.2 GHz cache Offset +0.27v / Input 1.92v / LLC7
    Memory (part number): G.Skill 32GB DDR4-2666 15-15-15-35-CR2 1.2v at XMP settings
    Graphics Card #1: Gigabyte Gaming G1 980 Ti OC to 1455MHz core / 8000MHz memory at stock voltage
    Graphics Card #2: Gigabyte Gaming G1 980 Ti SLI
    Monitor: ROG Swift PG278Q
    Storage #1: Intel 750 PCIe 1.2TB
    Storage #2: SanDisk Ultra II 960GB SSD
    CPU Cooler: Thermaltake Water 3.0 Ultimate (360mm rad AIO)
    Case: Thermaltake V71 Full Tower
    Power Supply: EVGA SuperNOVA 1200 P2
    Keyboard: Saitek Eclipse II
    Mouse: Razer A5090 Master of Destiny
    OS: Windows 10 Professional 64-bit Build 10586
    Join Date: Jul 2015
    Reputation: 46
    Posts: 670

    Cekim, if you're talking only 2-way SLI, I'm sure all boards can handle it. I think those connectors are more intended for 3- or 4-way SLI. Could those boards handle that?

    Anyway, I read this thread with great interest, as I've been running for the last 6 months without these plugged in. I went ahead and plugged both of them in tonight. One result is that I *may* be more stable on my GPUs. However, they are also running hotter. The first card still maxes around 81C in Witcher 3 in 3D at 1440p on mostly ultra settings, but it's at that temp more often, throttling more, and the fans are considerably noisier, getting up to 82% speed; they never used to go that high. And this was just hanging around one of the little towns without moving; I can only imagine it'll get worse in other areas. This is at stock voltage with a +125 core / +500 memory overclock, btw, which was mostly if not perfectly stable prior to plugging these in. If it doesn't result in absolutely perfect stability over the next few days, I'll definitely be unplugging them.

    Just to confirm: if I don't raise the voltage, overclocking the frequencies shouldn't add heat, should it?
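    For what it's worth, dynamic power scales roughly as P ≈ C·V²·f, so a frequency-only overclock does add some heat, linearly with the clock, just far less than a voltage bump would. A rough sketch (the base boost clock below is an assumed illustrative figure, not from this thread):

```python
# Back-of-the-envelope dynamic-power scaling: P ~ C * V^2 * f.
# The base clock is an assumed, illustrative 980 Ti boost clock.
base_clock = 1330               # MHz (assumption)
oc_clock = base_clock + 125     # the +125 core offset from the post

# Voltage is unchanged, so relative dynamic power tracks frequency alone:
power_ratio = oc_clock / base_clock
print(f"~{(power_ratio - 1) * 100:.1f}% more dynamic power")  # ~9.4% more
```

    So a frequency-only OC is a single-digit-percent power increase here, small but not zero.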

    It's actually pretty odd, because when I first booted up it seemed like my idle temps were lower. But under load, way higher.

    EDIT: Hmmm, maybe not. One thing that happened is that my tinkering pressed GPU1 up slightly against my Intel 750, which stopped GPU1's center fan from spinning. That got me to the 81C / 80% fan mark faster than usual, which is probably why I noticed it for the first time. I resolved that, but on retesting I found the stand-there-and-stare-at-my-horse test did eventually reach those numbers again; it just took longer (5-7 minutes). Oddly, running around and actually playing seems to heat the cards less than just standing there. Since I had never really run that specific test that way before, I unplugged the Molex cable from the PSU and tested again. (I left the 4-pin CPU one in; it's way too hard to get out.) I got pretty much the same result, so at least the Molex cable doesn't seem to add heat. I can't say about the 4-pin one, but I'm presuming that one affects CPU/memory heat, not GPU heat. I didn't notice any major change in CPU heat; if anything it dropped a fraction.
    Last edited by Qwinn; 11-19-2015 at 07:31 AM.

  4. #14
    ROG Guru: Yellow Belt
    skypine27 PC Specs
    Motherboard: Rampage V Extreme
    Processor: 5960X
    Graphics Card #1: Titan X
    Graphics Card #2: Titan X
    Join Date: Dec 2014
    Reputation: 10
    Posts: 174

    I'm a Windows 10 guy with 2x Titan X in SLI as well. Working fine.

    Do you have a 1200+ watt PSU? Might try an AX1200i.
    *CPU: Intel 7980XE @ 4.4GHz (by core usage) w/ EK monoblock
    *Mobo: ASUS Rampage VI Extreme
    *RAM: 64GB DDR4-3000 G.Skill TridentZ
    *Graphics: Titan Xp w/ EK block + backplate
    *Monitor: Dell Alienware AW3418DW @ 120Hz
    *Storage OS: Samsung SM961 Pro (1TB) Windows 10
    *Storage Games Internal: 4TB 850 EVO RAID0
    *Storage External: 32TB RAID0 (external USB 3.1 box)
    *Case/PSU: Thermaltake V71 TG/RGB + 3 rads (120mm, 360mm, 420mm) + Corsair AX1200i PSU

  5. #15
    ROG Guru: Orange Belt
    Join Date: Aug 2015
    Reputation: 35
    Posts: 442

    Quote Originally Posted by Qwinn
    Cekim, if you're talking only 2-way SLI, I'm sure all boards can handle it. I think those connectors are more intended for 3- or 4-way SLI. Could those boards handle that?
    Don't know; I only have 2-way, and it seems to be fine on the X99. You may be right that it only matters with 3+ way. Or it could be that I might get another 100MHz OC out of my system with them that I can't get now (though frankly, I'm temperature-bound, as I'd really like to load up 16 threads at 100% and see nothing more than ~65C while I put them there and keep them there for days at a time for work).

    Quote Originally Posted by Qwinn
    However, they are also running hotter
    ...
    EDIT: Hmmm, maybe not. One thing that happened is that my tinkering pressed GPU1 up slightly against my Intel 750, which stopped GPU1's center fan from spinning. That got me to the 81C / 80% fan mark faster than usual, which is probably why I noticed it for the first time. I resolved that, but on retesting I found the stand-there-and-stare-at-my-horse test did eventually reach those numbers again; it just took longer (5-7 minutes). Oddly, running around and actually playing seems to heat the cards less than just standing there. Since I had never really run that specific test that way before, I unplugged the Molex cable from the PSU and tested again. (I left the 4-pin CPU one in; it's way too hard to get out.) I got pretty much the same result, so at least the Molex cable doesn't seem to add heat. I can't say about the 4-pin one, but I'm presuming that one affects CPU/memory heat, not GPU heat. I didn't notice any major change in CPU heat; if anything it dropped a fraction.
    I've found I have to be pretty aggressive with GPU fan settings; the defaults want to cook these things. I learned that when, in my 4970 case with two 980s, the default fan settings let them run to the point of thermal shutdown. I changed the GPU fan settings in GPU Tweak, and now they run at 55-65C playing 4K games and 34-35C idle.
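    A custom curve like that boils down to a piecewise-linear map from GPU temperature to fan duty. A minimal sketch of the idea (the curve points are illustrative, not GPU Tweak's defaults):

```python
# Aggressive custom fan curve: (temperature C, fan duty %) points.
# These points are illustrative, not any tool's defaults.
CURVE = [(30, 30), (50, 45), (65, 70), (75, 90), (85, 100)]

def fan_duty(temp_c):
    """Linearly interpolate fan duty (%) between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Interpolate within the segment [t0, t1].
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]   # clamp above the last point

print(fan_duty(40))  # 37.5
print(fan_duty(70))  # 80.0
```

    Steepening the segments in the 60-75C range is what trades noise for the lower load temperatures described above.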

  6. #16
    ROG Guru: Green Belt
    Qwinn PC Specs
    Motherboard: Rampage V Extreme - BIOS 3301
    Processor: 5930K @ 4.4 GHz Adaptive 1.27v / 4.2 GHz cache Offset +0.27v / Input 1.92v / LLC7
    Memory (part number): G.Skill 32GB DDR4-2666 15-15-15-35-CR2 1.2v at XMP settings
    Graphics Card #1: Gigabyte Gaming G1 980 Ti OC to 1455MHz core / 8000MHz memory at stock voltage
    Graphics Card #2: Gigabyte Gaming G1 980 Ti SLI
    Monitor: ROG Swift PG278Q
    Storage #1: Intel 750 PCIe 1.2TB
    Storage #2: SanDisk Ultra II 960GB SSD
    CPU Cooler: Thermaltake Water 3.0 Ultimate (360mm rad AIO)
    Case: Thermaltake V71 Full Tower
    Power Supply: EVGA SuperNOVA 1200 P2
    Keyboard: Saitek Eclipse II
    Mouse: Razer A5090 Master of Destiny
    OS: Windows 10 Professional 64-bit Build 10586
    Join Date: Jul 2015
    Reputation: 46
    Posts: 670

    Unfortunately, while I think I did okay in the silicon lottery, I did poorly in the fan-noise lottery. The fans on the 980 Tis start getting uncomfortably loud around 70% and *really* annoying at anything above 80%. At 100% they're shrieking.

    In my "stand there and stare at my horse" test in Witcher 3, the heat just builds for 5-10 minutes until it levels out at 81-82C and 81% fan. Not sure how I can tweak the fans to overcome that. I'd be uncomfortable letting the cards get much warmer than that (the 980 Ti spec says max temp is 92C), and spinning the fans even faster is too painful as well.

    On the bright side, I do get 55+ fps when doing that, in spectacular 3D. I'd happily bring that down to 40-45 fps to bring down the noise and heat, but I'm not sure how to do that, given I'm at stock voltage.
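    A frame-rate cap is the usual way to shed GPU load without touching voltage; driver limiters and tools like RivaTuner essentially just sleep off the unused frame-time budget each frame. A minimal sketch of the mechanism (function names are illustrative, not any tool's API):

```python
import time

TARGET_FPS = 45
FRAME_TIME = 1.0 / TARGET_FPS   # ~22 ms budget per frame

def limited_loop(render_frame, frames):
    """Call render_frame, then sleep away any leftover frame-time budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()   # the actual GPU work happens here
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            # The GPU sits idle here instead of racing ahead to 55+ fps,
            # which is where the heat and fan-noise savings come from.
            time.sleep(FRAME_TIME - elapsed)
```

    The card only works as hard as 45 fps requires, so it draws less power and runs cooler even at stock voltage.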

    I've been running these tests with the monitor set to a 144Hz refresh rate, and lately there have been rumblings about high idle temps at that rate with the last couple of drivers. Going to try a 120Hz refresh rate over the weekend and see if it makes a difference.
    Last edited by Qwinn; 11-19-2015 at 07:59 PM.

  7. #17
    ROG Guru: Orange Belt
    Join Date: Aug 2015
    Reputation: 35
    Posts: 442

    Quote Originally Posted by Qwinn
    Unfortunately, while I think I did okay in the silicon lottery, I did poorly in the fan-noise lottery. The fans on the 980 Tis start getting uncomfortably loud around 70% and *really* annoying at anything above 80%. At 100% they're shrieking.

    In my "stand there and stare at my horse" test in Witcher 3, the heat just builds for 5-10 minutes until it levels out at 81-82C and 81% fan. Not sure how I can tweak the fans to overcome that. I'd be uncomfortable letting the cards get much warmer than that (the 980 Ti spec says max temp is 92C), and spinning the fans even faster is too painful as well.

    On the bright side, I do get 55+ fps when doing that, in spectacular 3D. I'd happily bring that down to 40-45 fps to bring down the noise and heat, but I'm not sure how to do that.
    Yeah, they are quite loud even at 75%.

    I also added SP fans to the case side panel to blow directly on the cards and the CPU VRM. That helped keep the card fans from having to go nuts to keep up.

  8. #18
    ROG Guru: Green Belt
    GoNz0- PC Specs
    Laptop (Model): Dell XPS 9530
    Motherboard: Rampage V Extreme
    Processor: 5930
    Memory (part number): [Ripjaws 4] F4-2666C15Q-16GRR
    Graphics Card #1: ASUS Strix GTX 980
    CPU Cooler: EK Evo Supremacy
    Case: Corsair 800D
    Power Supply: Enermax 1250W Revolution 85
    Keyboard: Logitech 710+
    Mouse: Razer Mamba 2012
    Headset: Sennheiser PC350
    OS: Windows 8.1 Pro
    Network Router: ASUS AC68U
    Join Date: Aug 2012
    Reputation: 24
    Posts: 568

    Consider water cooling!

  9. #19
    ROG Guru: Orange Belt
    Join Date: Aug 2015
    Reputation: 35
    Posts: 442

    Quote Originally Posted by GoNz0-
    Consider water cooling!
    Eventually... My 59xx chips have been finicky beasts compared to other machines I have, so water-cooling the video cards sounds like another project right now...

    My solution to fan noise was a very large, basically climate-controlled closet behind my office (long story; the house was built that way) and a hole in the wall through which all the wires go. Problem staying solved. ;-)
