R6E Intel VROC RAID1 W10 boot working for days then RAID1 boot array disappeared!

StewartT
Level 8
BACKGROUND
After months of reading, trial and error with VROC keys, various NVMe storage, and W7 & W10 testing,
I configured a RAID1 VROC CPU storage array in the BIOS.
Then, when installing W10, I did an F6 driver install of Intel RSTe to allow Windows 10 to be installed on and boot from the Intel RAID1 VROC array.
I used this setup for a few days and it appeared solid. I spent days installing and configuring software to create a useful desktop.
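In case it helps anyone debugging the same thing, below is a minimal Python sketch (assuming Python is installed; the log path is just a placeholder) that records the disk inventory Windows reports at each boot via the built-in wmic tool, so there is a before/after record if the array ever drops again. It could be scheduled as a logon task.

```python
# Minimal sketch: log the physical disks Windows sees at each boot, so that
# if the VROC volume disappears again there is a record to compare against.
# Assumes Windows 10 with the built-in "wmic" tool on PATH; the log path
# below is a placeholder, not part of my actual setup.
import datetime
import subprocess

LOG_FILE = r"C:\logs\disk_inventory.log"  # hypothetical location

def snapshot_disks() -> str:
    """Return the raw wmic listing of physical disk models, serials, and status."""
    result = subprocess.run(
        ["wmic", "diskdrive", "get", "Model,SerialNumber,Status"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

def main() -> None:
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(LOG_FILE, "a", encoding="utf-8") as log:
        log.write(f"--- boot check {stamp} ---\n{snapshot_disks()}\n")

if __name__ == "__main__":
    main()
```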

PROBLEM
This morning when I powered on the server, it entered the BIOS instead of booting Windows 10.
When I checked the Intel VROC configuration, the RAID1 array was gone!
The two Intel 7600p Pro NVMe drives are listed but no longer assigned to the array.
Everything I’m using is on the Asus / Intel VROC approved list (CPU, NVMe, VROC key, X299/VROC/RSTe).

QUESTIONS
What happened to the RAID1 array configuration in the BIOS?

Now what?
Do I re-create the RAID1 VROC array in the BIOS?
The BIOS states everything will be lost if I recreate the array.

In the future, what happens if a BIOS update is done?
Does the RAID array configuration get impacted?
I assume not, because that would mean an OS re-install with every BIOS update.

Any idea what caused the RAID1 array to be lost this time?
Is this a known issue?

BACKUP
Yes, I have a valid backup created using Paragon.
I assume I can recreate the RAID1 array in the BIOS, boot the Paragon recovery media, load the RSTe drivers and recover to the prior state.
I ask the questions above because I didn't expect to need this backup with a VROC RAID1 array.
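One way to double-check a restore like this (just a sketch, nothing Paragon-specific) is to hash a few key directories to a manifest before recovery and again afterwards, then diff the two files. The paths below are placeholders.

```python
# Sketch only: build a SHA-256 manifest of a directory tree before recovery,
# re-run it after the restore, and diff the two manifests to confirm the
# data came back intact. Paths and file names here are placeholders.
import hashlib
import os
import sys

def hash_file(path: str) -> str:
    """Stream a file through SHA-256 so large files are not loaded into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(root: str, out_path: str) -> None:
    """Write 'hash  relative_path' lines for every file under root."""
    with open(out_path, "w", encoding="utf-8") as out:
        for dirpath, _dirnames, filenames in os.walk(root):
            for name in sorted(filenames):
                full = os.path.join(dirpath, name)
                rel = os.path.relpath(full, root)
                try:
                    out.write(f"{hash_file(full)}  {rel}\n")
                except OSError:
                    out.write(f"UNREADABLE  {rel}\n")

if __name__ == "__main__":
    # Example: python manifest.py D:\ProjectData manifest_before.txt
    build_manifest(sys.argv[1], sys.argv[2])
```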

JustinThyme
Level 13
Which BIOS are you on? I'm asking because some won't show the drives or the array in the BIOS if you created the array at the F6 step. Mine was that way on early BIOS versions. Look at the boot order and it's not listed correctly there either; it just says Windows Boot Manager. It still boots just fine and shows up in the Intel RSTe utility as it should.

Can you share more about your build?
Exactly which drives, and where are they installed?

If you updated the BIOS after creating the array, all the settings go back to default. Simply put the settings back where they were before and it should pick the array right up.
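If you want to double-check how that entry is labelled from inside Windows rather than from the BIOS screen, something like the following (a rough Python sketch around the built-in bcdedit tool, run from an elevated prompt) will dump the firmware boot entries:

```python
# Rough sketch: dump the UEFI firmware boot entries from inside Windows to see
# how the RAID volume's boot entry is actually labelled (often it shows up
# only as "Windows Boot Manager"). bcdedit requires an elevated prompt.
import subprocess

def firmware_boot_entries() -> str:
    result = subprocess.run(
        ["bcdedit", "/enum", "firmware"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    print(firmware_boot_entries())
```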



“Two things are infinite: the universe and human stupidity, I'm not sure about the former” ~ Albert Einstein

Getting RAID to work consistently on domestic gaming rigs is just not worth the hassle, especially now with fast SSD and NVMe drives. In everyday use you will notice very little difference from a single drive. It will only show up in benchmarking, if you can stand the hassle.
Many years ago I played around with RAID with HDDs on SuperMicro server boards with LSI hardware RAID cards and also software RAID. I lost count of the number of times I lost or broke the RAID through updates etc. In the end I just gave up on RAID and stuck to VelociRaptors, which were fast enough.
RAID is fine in the server world, mostly, due to the infrastructure, sparsely spaced updates and 24/7 operation. Lately it has become more of a problem, with updates breaking RAID on servers, but the server environment is more geared up for it with enterprise backup methods.
Do yourself a favour and ditch the idea of RAID on gaming rigs; just set up with single NVMe drives. You won't regret it.

This is the fourth, maybe fifth, Asus ROG board I've used over the past 10+ years in server builds.
It has a great community, and most issues get resolved after being reported.

I'm not a gamer (take a look at the video adapter :-).
I could have selected any one of a number of server motherboards at this price point (Intel, Supermicro, etc.);
still, Asus ROG motherboards have a lot to offer.

I need the features and functions ROG boards provide.
In most cases, they have been excellent, feature-rich, stable motherboards
that use higher-end parts that hold up under heavy use.

Both (02x) Intel 7600p NVMe drives are installed on the ROG DIMM.2 (riser) to use the same controller; the BIOS is set to CPU.

Riicckk wrote:
Getting RAID to work consistently on domestic gaming rigs is just not worth the hassle, especially now with fast SSD and NVMe drives. In everyday use you will notice very little difference from a single drive. It will only show up in benchmarking, if you can stand the hassle.
Many years ago I played around with RAID with HDDs on SuperMicro server boards with LSI hardware RAID cards and also software RAID. I lost count of the number of times I lost or broke the RAID through updates etc. In the end I just gave up on RAID and stuck to VelociRaptors, which were fast enough.
RAID is fine in the server world, mostly, due to the infrastructure, sparsely spaced updates and 24/7 operation. Lately it has become more of a problem, with updates breaking RAID on servers, but the server environment is more geared up for it with enterprise backup methods.
Do yourself a favour and ditch the idea of RAID on gaming rigs; just set up with single NVMe drives. You won't regret it.


Really? I've had RAID in about every imaginable combination running on "domestic" rigs for the last 30 years. You are right that it's not so much about speed these days, especially when there is still a DMI bottleneck for non-Intel drives downstream of the PCH. However, there are many uses for RAID other than striping. Trust me when I say RAID doesn't get broken by updates in enterprise applications. Drives fail from time to time, and that's why there is RAID mirroring in the first place. A drive dies, you put in a new one and walk away, and the array rebuilds itself.

On to speed in RAID 0: no single NVMe drive can crank out 10,000 MB/s sequential reads while also getting 4K random speeds of 200-300 MB/s.

Now on to the OP's dilemma.
You can't RAID two drives on the DIMM.2.
You can RAID DIMM.2_2 with a PCIe drive, or DIMM.2_1 with an M.2 drive under the armour. The RAID with a PCIe drive and DIMM.2_2 only just became plausible in VROC with the 905P 380GB Intel drives. If I were just going after file space that needed to be fast, I'd install a Hyper M.2 X16 card with four of the 905P drives, but that's gonna run you two large. You can also do it with the 2.5-inch U.2 drives, leaving the cover off and using U.2-to-M.2 adapters, to get 10,000 MB/s sequential reads all day long.



“Two things are infinite: the universe and human stupidity, I'm not sure about the former” ~ Albert Einstein

Thank you for the reply.

You can't RAID two drives on the DIMM.2.
You can RAID DIMM.2_2 with a PCIe drive, or DIMM.2_1 with an M.2 drive under the armour. The RAID with a PCIe drive and DIMM.2_2 only just became plausible in VROC with the 905P 380GB Intel drives. If I were just going after file space that needed to be fast, I'd install a Hyper M.2 X16 card with four of the 905P drives, but that's gonna run you two large. You can also do it with the 2.5-inch U.2 drives, leaving the cover off and using U.2-to-M.2 adapters, to get 10,000 MB/s sequential reads all day long.


Respectfully, what is this statement based on?
i.e., completed testing and/or documentation you've discovered?


To be clear, the R6E BIOS does allow 02x Intel 7600p Pro NVMe drives installed in the DIMM.2 NVMe riser to be seen and configured into a RAID1 VROC array. Intel VROC documents contain a list of approved non-Intel and Intel NVMe drives, such as the 905P and others (please see the test findings below).
W10 can then be installed on this RAID1 array, and the array configured as the boot device, using an F6 driver install of RSTe.

As stated, I had one "unexplained" boot failure of the Intel VROC RAID1 array.
From your prior post, it sounds like VROC has improved in the latest BIOS update?

When I checked the state of the RAID1 array in the BIOS, after the boot failure, the RAID1 array configuration was gone.
I had not entered or changed anything in the BIOS prior to this configuration loss.

I recreated the RAID1 array, and the server has booted every day as expected for a week.
Clearly, no one likes RAID1 array configurations disappearing from a BIOS.

COMPLETED VROC TESTING
I tested various R6E NVMe configurations that I expected to work, only to find later
that some configurations were restricted (based on Intel X299 VROC documents).

Thus far I've tested using 02x Samsung 970 Pro NVMe drives, 02x Intel 760p NVMe drives, and 02x Intel 7600p Pro NVMe drives.
FINDINGS -
a) The DIMM.2 is only able to access 01x of 02x installed NVMe drives when the NVMe is not Intel branded (only 01x of the 02x Samsung NVMe drives could be seen).
Intel VROC documents contain a list of approved non-Intel NVMe drives that I expect would not have this limit.
I assume 01x non-Intel NVMe could be placed in the DIMM.2, 01x non-Intel NVMe could be installed on the MB, or additional non-Intel NVMe drives added via PCIe adapters.
b) Without the Intel feature key installed, 02x Intel 760p NVMe drives can be seen on the DIMM.2 riser and configured as RAID0. I tested this.
I've read this RAID0 array cannot be used as a boot device due to Intel restrictions. I've not tested this, but assume the stated limit exists.
c) With the Intel feature key installed, 02x Intel 7600p Pro NVMe drives can be seen on the DIMM.2 riser and configured as RAID0 or RAID1. I tested this.
I set up a RAID1 array and installed W10 on this array as the server boot device. I assume a RAID0 array could also be configured as a W10 boot device.
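As a sanity check on finding (c), the array's presentation can also be confirmed from inside Windows. With the RSTe driver owning the volume, my expectation (an assumption on my part, not something from the Intel documents) is a single RAID volume entry rather than two separate 7600p entries. A rough Python sketch using the built-in PowerShell Storage cmdlets:

```python
# Sketch: list physical disks to confirm that Windows presents the two 7600p
# members as one RSTe/VROC volume rather than two separate NVMe disks.
# Uses the built-in PowerShell Storage cmdlets; naming will vary by driver version.
import subprocess

def physical_disks() -> str:
    return subprocess.run(
        ["powershell", "-NoProfile", "-Command",
         "Get-PhysicalDisk | Select-Object FriendlyName, MediaType, Size, HealthStatus | Format-Table -AutoSize"],
        capture_output=True, text=True, check=True,
    ).stdout

if __name__ == "__main__":
    print(physical_disks())
    # With the RAID1 healthy, the assumption is a single RAID volume entry
    # here instead of two individual 7600p entries.
```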

WINDOWS 7
I attempted creating W7 install media with NVMe support, then using an F6 RSTe install.
W7 is not able to see the RAID array, which prevents W7 installation on the array.

This appears to be an RSTe vs. RST issue.

NEXT STEPS
I hope to meet with Intel to cover X299 VROC in greater detail in January.
I'll have a few additional weeks of use completed by then.