
Rampage VI Extreme and Samsung 960 Pro in Raid

clindsay616
Level 7
So I just got my Rampage VI Extreme and 7900X up and running; however, I was not able to figure out how to get my two drives set up as a bootable RAID array.

I have two Samsung 960 Pro 1TB drives, and I currently have them both in the DIMM.2 slot. Every time I try to set up the RAID array, it says they are unsupported. Do I need to not have them both on the DIMM.2? Is there something else I am missing that I need to do before I set up the array?

Thanks!

smithkid
Level 7
clindsay616 wrote:
So I just got my Rampage VI Extreme and 7900X up and running; however, I was not able to figure out how to get my two drives set up as a bootable RAID array.

I have two Samsung 960 Pro 1TB drives, and I currently have them both in the DIMM.2 slot. Every time I try to set up the RAID array, it says they are unsupported. Do I need to not have them both on the DIMM.2? Is there something else I am missing that I need to do before I set up the array?

Thanks!


Pretty sure you have to do it via the chipset, and it will boot into Linux but not Windows. For Windows boot you need VROC and Intel drives. There is the ASUS Hyper M.2 card as an option, however, and it will take four 960 Pros.

https://www.slideshare.net/pertonas/hyper-m2-x16-x-intel-vroc

Moloch
Level 7
clindsay616 wrote:
So I just got my Rampage VI Extreme and 7900X up and running; however, I was not able to figure out how to get my two drives set up as a bootable RAID array.

I have two Samsung 960 Pro 1TB drives, and I currently have them both in the DIMM.2 slot. Every time I try to set up the RAID array, it says they are unsupported. Do I need to not have them both on the DIMM.2? Is there something else I am missing that I need to do before I set up the array?

Thanks!


I was able to set up RAID 0 with 2x Samsung 960 on a DIMM.2 with the R6A. Unfortunately, it appeared that the DIMM.2 slot is connected via a single x4 link rather than an x4 link for each M.2 slot, so I still had a max speed of ~3,500 MB/s.

After setting up RAID between the two different DIMM.2 cards on the Apex, I got speeds of nearly double. So I would recommend setting up RAID between the single chipset slot and one of the DIMM.2 slots on the R6E, so they each get their own dedicated x4 link (or get a PCIe card for VROC that supports non-Intel M.2; I think ASUS makes one).

Keep in mind, there are two different types of RAID now:
RSTe = VROC RAID (must use VROC/CPU lanes... might require Intel-brand M.2 drives... should be bootable) (I'm guessing this is the type you tried?)
IRST = chipset RAID (I use this type; allows any drive, but not bootable)
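
For a rough sense of why the shared link caps out around 3,500 MB/s while two dedicated links nearly double it: PCIe 3.0 moves 8 GT/s per lane with 128b/130b encoding, so a x4 link tops out just under 4 GB/s before protocol overhead. A minimal sketch of that arithmetic (the ~90% efficiency factor is an assumption, not a measurement from this board):

```python
# Back-of-the-envelope PCIe 3.0 bandwidth math for the RAID 0 configs above.
# Nominal spec numbers, not measurements from this board.

GT_PER_LANE = 8e9              # PCIe 3.0: 8 GT/s per lane
ENCODING = 128 / 130           # 128b/130b line encoding
PROTOCOL_EFFICIENCY = 0.90     # rough allowance for link/TLP overhead (assumption)

def link_bandwidth_gbs(lanes: int) -> float:
    """Usable bandwidth of a PCIe 3.0 link, in GB/s."""
    bits_per_sec = GT_PER_LANE * lanes * ENCODING * PROTOCOL_EFFICIENCY
    return bits_per_sec / 8 / 1e9

x4 = link_bandwidth_gbs(4)
print(f"one x4 link:            ~{x4:.1f} GB/s")        # ~3.5 GB/s
print(f"two drives sharing x4:  ~{x4:.1f} GB/s total")  # same ceiling, so no RAID 0 gain
print(f"two dedicated x4 links: ~{2 * x4:.1f} GB/s")    # ~7 GB/s, the 'nearly double' above
```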

Moloch wrote:
I was able to set up RAID 0 with 2x Samsung 960 on a DIMM.2 with the R6A. Unfortunately, it appeared that the DIMM.2 slot is connected via a single x4 link rather than an x4 link for each M.2 slot, so I still had a max speed of ~3,500 MB/s.

After setting up RAID between the two different DIMM.2 cards on the Apex, I got speeds of nearly double. So I would recommend setting up RAID between the single chipset slot and one of the DIMM.2 slots on the R6E, so they each get their own dedicated x4 link (or get a PCIe card for VROC that supports non-Intel M.2; I think ASUS makes one).

Keep in mind, there are two different types of RAID now:
RSTe = VROC RAID (must use VROC/CPU lanes... might require Intel-brand M.2 drives... should be bootable) (I'm guessing this is the type you tried?)
IRST = chipset RAID (I use this type; allows any drive, but not bootable)


So I have resolved not to worry about RAIDing these drives. The 1TB 960 Pro is already blazing fast. My question would be: which is faster, the DIMM.2 (connected directly to the CPU) or the PCIe x4 slot? I am planning on splitting them up so that I only have one drive in the DIMM.2 slot and one in the PCIe x4, but I was curious which was better for the OS.
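
One way to settle which slot is better for the OS drive is to measure it: put a large test file on a drive in each location and time a sequential read. A minimal sketch, assuming hypothetical file paths on drives mounted at D: and E: (note that OS caching will inflate repeat runs, so use a file larger than RAM or test each drive on a fresh boot):

```python
# Rough sequential-read timing for one large file per drive under comparison.
# File paths below are placeholders; adjust to wherever your test files live.
import time

CHUNK = 8 * 1024 * 1024  # 8 MiB reads, large enough to approach the sequential peak

def time_read(path: str) -> float:
    """Return sequential read throughput in MB/s for the file at `path`."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb", buffering=0) as f:  # unbuffered at the Python layer
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total / elapsed / 1e6

# Hypothetical placements: one drive in the DIMM.2, one in the PCIe x4 slot.
for label, path in [("DIMM.2 drive", "D:\\testfile.bin"),
                    ("PCIe x4 slot drive", "E:\\testfile.bin")]:
    print(f"{label}: ~{time_read(path):.0f} MB/s")
```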


NixNemo
Moloch wrote:
I was able to set up RAID 0 with 2x Samsung 960 on a DIMM.2 with the R6A. Unfortunately, it appeared that the DIMM.2 slot is connected via a single x4 link rather than an x4 link for each M.2 slot, so I still had a max speed of ~3,500 MB/s.

After setting up RAID between the two different DIMM.2 cards on the Apex, I got speeds of nearly double. So I would recommend setting up RAID between the single chipset slot and one of the DIMM.2 slots on the R6E, so they each get their own dedicated x4 link (or get a PCIe card for VROC that supports non-Intel M.2; I think ASUS makes one).

Keep in mind, there are two different types of RAID now:
RSTe = VROC RAID (must use VROC/CPU lanes... might require Intel-brand M.2 drives... should be bootable) (I'm guessing this is the type you tried?)
IRST = chipset RAID (I use this type; allows any drive, but not bootable)


Yes, the Apex has one DIMM.2 slot connected to the PCH and another connected directly to the CPU (I mean VROC). So you mean you had ONE Samsung 960 in each DIMM.2 and were able to set up RAID 0? Please clarify. Many thanks in advance.

NixNemo wrote:
Yes, the Apex has one DIMM.2 slot connected to the PCH and another connected directly to the CPU (I mean VROC). So you mean you had ONE Samsung 960 in each DIMM.2 and were able to set up RAID 0? Please clarify. Many thanks in advance.


If he is doing RAID 0 as a bootable drive, he is doing it through the PCH bus, which is possible, but it will cap sequential reads at ~3,500 MB/s, just like the Z170/Z270 boards. VROC RAID 0 on a bootable drive is NOT possible without using specific Intel drives. The CPU DIMM.2 has 8 PCIe lanes attached to it, so if you were to set up your Windows OS on the PCH DIMM.2, you could do a non-bootable RAID 0 drive on the CPU lanes and get full Windows RAID speeds of 6,000-7,000 MB/s sequential reads.
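
For what it's worth, the non-bootable striped volume described above doesn't have to come from the BIOS at all: Windows Storage Spaces can stripe the two CPU-attached drives at the OS level. That is a different mechanism from IRST/VROC, and the sketch below is only an illustration: the pool and volume names are made up, it must run from an Administrator session, and pooling claims the member drives (they must be empty/unpartitioned to show as poolable).

```python
# Sketch: build an OS-level striped volume with Windows Storage Spaces by
# driving PowerShell from Python. "Simple" resiliency means striping (RAID 0
# equivalent); NumberOfColumns 2 stripes across both drives.
import subprocess

PS_SCRIPT = r"""
$disks = Get-PhysicalDisk -CanPool $true
New-StoragePool -FriendlyName "NVMePool" `
    -StorageSubSystemFriendlyName "Windows Storage*" -PhysicalDisks $disks
New-VirtualDisk -StoragePoolFriendlyName "NVMePool" -FriendlyName "Stripe" `
    -ResiliencySettingName Simple -NumberOfColumns 2 -UseMaximumSize
"""

subprocess.run(["powershell", "-NoProfile", "-Command", PS_SCRIPT], check=True)
```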
Phanteks Enthoo Elite / Asus X299 Rampage VI Extreme / Intel i9-7900X / Corsair Dominator RGB 3200MHz

MSI GTX 1080 Ti / 2x Intel 900P / Samsung 970 Pro 512GB

Samsung 850 PRO 512GB / Western Digital Gold 8TB HD

Corsair AX 1200i / Corsair Platinum K95 / Asus Chakram

Acer XB321HK 4K, IPS, G-Sync Monitor / Water Cooled / Asus G571JT Laptop

rzotti
Level 7
Going over a similar question myself... Can we have two 960s on the DIMM.2, configured to the CPU, in RAID 0, non-bootable, and a third 960 in the single chipset slot, for boot only?

On the ASUS Hyper card: wouldn't using it with 960s have the same limitation as the board itself? Meaning, since Intel is locking non-Intel drives out of VROC, you could add four 960s on that card but VROC should still not work, right?


Sent from my iPhone using Tapatalk

Mavtop

Hey Rzotti,

I'm planning on doing the same thing with two Samsung 960 2TB M.2 drives.

Was originally going to RAID 0 them, but I guess non-Intel NVMe is locked out of VROC, not only on the VROC add-in card but also on the DIMM.2s?

Disappointing, but I'm waiting for Intel 900P drives to VROC them all together. The Sammies are fast enough just the way they are, but I might RAID 0 them once I move my OS to the 900P drives.

Mavtop wrote:
Hey Rzotti,

I'm planning on doing the same thing with two Samsung 960 2TB M.2 drives.

Was originally going to RAID 0 them, but I guess non-Intel NVMe is locked out of VROC, not only on the VROC add-in card but also on the DIMM.2s?

Disappointing, but I'm waiting for Intel 900P drives to VROC them all together. The Sammies are fast enough just the way they are, but I might RAID 0 them once I move my OS to the 900P drives.


Aren't the Intel 900Ps U.2 Optane drives? I don't think what you mention is supported by the hardware or the software. The motherboards generally have one U.2 port, and the rest are M.2, as are the expansion cards like the ASUS Hyper M.2 X16. Also, in general, if you have RAIDed enough M.2 SSDs, you're going to be faster than U.2 Optane, and there is really no need for it unless Optane really is that much faster. From what I have read so far, Optane is not yet a feasible/desirable solution, but maybe the 900Ps change this; we will see.
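
A rough way to frame the Optane-versus-RAIDed-NAND question: RAID 0 multiplies sequential bandwidth but does nothing for access latency, and latency is where Optane shines. A toy comparison using ballpark published figures (approximations for illustration, not benchmarks of these exact drives):

```python
# Toy comparison: striping scales sequential bandwidth, not latency.
# Figures below are ballpark public spec numbers, not measurements.

nand_seq_gbs, nand_lat_us = 3.5, 80      # ~960 Pro class NVMe NAND (approx.)
optane_seq_gbs, optane_lat_us = 2.5, 10  # ~900P class Optane (approx.)

n = 2  # NAND drives in RAID 0
print(f"2x NAND RAID 0 sequential: ~{n * nand_seq_gbs:.1f} GB/s "
      f"vs single Optane ~{optane_seq_gbs:.1f} GB/s")

# QD1 random reads are latency-bound, so striping doesn't help there:
print(f"QD1 4K random (latency-bound): NAND ~{1e6 / nand_lat_us / 1000:.0f}K IOPS "
      f"vs Optane ~{1e6 / optane_lat_us / 1000:.0f}K IOPS")
```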