Thread: Speed issue with Raid 0
-
03-10-2018 02:18 AM #1
- Join Date: Nov 2016 · Posts: 8
My system has the following specs:
Zenith Extreme with 1950X Threadripper
128GB RAM
8 x Samsung 960 EVO NVMe drives in RAID 0
2 x ASUS Hyper M.2 x16 cards
Currently I have the two Hyper cards fully populated with all 8 drives, but I have also tried various combinations of drives on the board and the DIMM.2 card, and it's not much faster than a single drive. It works, it boots, everything looks normal; it's just not as fast as it should be. It's averaging around 4500-4800 MB/s.
I have what looks like the same config as in the der8auer video, but I'm just not getting anywhere near the speed.
I'm running out of things to try and I could use some ideas.
-
03-11-2018 12:48 AM #2
- Join Date: Jan 2018 · Posts: 14
First off, *drool* over the 8 x 960 pros lol.
What version of UEFI?
All drives on the latest FW from Samsung?
Both slots hosting the Hyper cards set to NVMe RAID in UEFI?
Pure UEFI environment or using CSM?
Checked the write cache settings in Device Manager?
Checked the array details in RaidXpert2 in Windows? It shows a lot more info about the array than UEFI.
Download HWiNFO64 and check that each drive is getting its full x4 of bandwidth. Even in a RAID you should be able to see the lanes assigned to each one. Here you can see my 2 x 950 Pros in RAID 0 getting x4 each (about 4/5 of the way down the list):
These are the common things I can think of. It sounds like they're being starved for PCIe bandwidth. Below are my speeds on 2x 512GB 950 Pros on the DIMM.2:
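To put some numbers on "starved for PCIe bandwidth", here's a rough back-of-the-envelope sketch. The per-lane figure and the 960 EVO's rated sequential read are assumptions taken from public specs, not measurements from this system:

```python
# Back-of-the-envelope ceilings for the array. Assumptions, not
# measurements: PCIe 3.0 carries roughly 985 MB/s of usable
# bandwidth per lane (after 128b/130b encoding), and Samsung rates
# the 960 EVO at about 3,200 MB/s sequential read.

PCIE3_MBPS_PER_LANE = 985   # approx. usable MB/s per PCIe 3.0 lane
LANES_PER_DRIVE = 4
DRIVE_SEQ_READ_MBPS = 3200  # 960 EVO rated sequential read (approx.)
NUM_DRIVES = 8

def link_ceiling_mbps(drives=NUM_DRIVES):
    """Aggregate PCIe link ceiling across all drives."""
    return drives * LANES_PER_DRIVE * PCIE3_MBPS_PER_LANE

def drive_ceiling_mbps(drives=NUM_DRIVES):
    """Aggregate ceiling set by the drives themselves."""
    return drives * DRIVE_SEQ_READ_MBPS

print(link_ceiling_mbps())   # raw link bandwidth across 32 lanes
print(drive_ceiling_mbps())  # ceiling if RAID scaling were perfect
```

Even with imperfect RAID scaling, 4,500-4,800 MB/s is a small fraction of either ceiling — roughly what a single drive and a half would deliver on its own.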
-
03-11-2018 01:58 AM #3
- Join Date: Nov 2016 · Posts: 8
Thanks for the reply.
They're not Pros, just EVOs, but still pretty nice drives.
Using 0902
All drives updated to the latest version
Both slots set to NVMe RAID for the Hyper cards
CSM is disabled, but I tried with it enabled and it didn't change anything
Write cache is enabled
Everything looks right in RaidXpert2
Checked HWiNFO64 and verified all drives show x4
I agree it looks like it's starved for PCIe bandwidth, but the Threadripper has 64 lanes. I'm using 32 for drives and 8 for my GPU. That should leave 24 for system use and all other devices.
And looking at your speeds, you are doing better with 2 drives than I am with 8.
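The lane budget above can be written out as a quick sanity check. The chipset/system reserve of 4 lanes is an assumption (the exact reservation varies by platform); the drive and GPU counts are this thread's actual configuration:

```python
# Lane budget for the configuration described above. The
# chipset/system reserve is an assumption, not a verified figure.

TOTAL_LANES = 64  # Threadripper 1950X

allocations = {
    "8x NVMe drives at x4 each": 32,
    "GPU at x8": 8,
    "chipset/system reserve (assumed)": 4,
}

def lanes_remaining():
    used = sum(allocations.values())
    if used > TOTAL_LANES:
        raise ValueError("lane budget over-subscribed")
    return TOTAL_LANES - used

print(lanes_remaining())  # lanes still free for everything else
```

So on paper the total budget is nowhere near exhausted, which is why the symptoms point at something other than simple lane count.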
-
03-11-2018 05:15 PM #4
- Join Date: Jan 2018 · Posts: 14
Really weird! Definitely have plenty of lanes left over. Tried swapping around slots just for good measure? You said you'd tried putting a couple of the drives on the DIMM.2 and it didn't change anything? That would eliminate a problem with at least one of the Hyper x16 cards. Hmmmm.
-
03-11-2018 05:28 PM #5
- Join Date: Nov 2016 · Posts: 8
I've had the cards in different slots and tested them individually, one at a time.
Originally, I had one on the board and 2 on the DIMM.2 but I didn't bench them. I might try that again.
I have no idea what's going on unless I'm seeing a driver or BIOS issue.
-
03-11-2018 06:33 PM #6
- Join Date: Nov 2016 · Posts: 8
I'm starting to suspect heat. I rearranged my fans and now I'm actually getting lower speeds with no other changes.
-
03-11-2018 07:34 PM #7
- Join Date: Jan 2018 · Posts: 14
Don't the cards have built-in fans? Is there enough cool airflow inside the case to supply them? If you suspect heat, I'd remove the covers, prop a couple of fans up in front of the bare cards, and run some benches.
-
03-11-2018 07:43 PM #8
- Join Date: Nov 2016 · Posts: 8
They do. That's why I thought it might be heat related: I turned off the fans and it performed worse.
I probably need a new case. I originally got this one a few years ago when I had 6 mechanical drives in it, so it's optimized to cool drives, not the slots. Probably time to invest in a new one.
I also have one configuration left I could try that moves all the drives away from the video card, but it puts all of the drives on one side of the CPU, and I'm not sure if that will work or not.
Edit for clarity:
What I mean is that each side of the CPU has 32 lanes. From my understanding, you lose 4 off the top for system use, leaving 28. That means I can't put all 8 drives on one side. I can put 2 drives back on the DIMM.2 and put the video card on the same side, but that still means I can't fit 4 drives alongside it, or I'd have to run my video card at x8, which I'm already doing in the current configuration anyway, so probably not a big deal.
Last edited by ragepaw; 03-11-2018 at 07:50 PM.
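The per-die constraint in that edit can be checked the same way; a minimal sketch that takes the post's numbers (32 lanes per die, 4 lost to system use) as given rather than as an official topology map:

```python
# Each of the 1950X's two dies provides 32 lanes; per the post,
# ~4 per die go to system use, leaving ~28 usable. These numbers
# are the poster's understanding, not a verified topology.

USABLE_PER_DIE = 32 - 4  # 28 usable lanes per die

def fits_on_one_die(device_lanes):
    """True if the listed devices fit within one die's usable lanes."""
    return sum(device_lanes) <= USABLE_PER_DIE

print(fits_on_one_die([4] * 8))        # all 8 drives: 32 > 28, no
print(fits_on_one_die([4] * 4 + [8]))  # 4 drives + GPU at x8: 24, yes
```

Which matches the conclusion in the edit: 4 drives plus the GPU at x8 fit on one die, but all 8 drives do not.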
-
03-15-2018 03:34 AM #9
- Join Date: Mar 2018 · Posts: 8
-
03-15-2018 02:42 PM #10
JustinThyme
- Join Date: Nov 2013 · Posts: 3,854
Compatible models for the Hyper card:
ROG RAMPAGE VI EXTREME
ROG RAMPAGE VI APEX
ROG STRIX X299-XE GAMING
ROG STRIX X299-E GAMING
PRIME X299-DELUXE
PRIME X299-A
TUF X299 MARK 1
TUF X299 MARK 2
There has never been official support for AMD, but an updated driver was supposed to make it possible.