  1. #1
    ROG Member
    Join Date
    Oct 2018
    Reputation
    11
    Posts
    6

    R6E Intel VROC RAID1 W10 boot working for days then RAID1 boot array disappeared!

    BACKGROUND
    After months of reading and of trial and error with VROC keys, various NVMe storage, and W7 & W10 testing,
    I configured a RAID1 VROC CPU storage array in the BIOS.
    Then, when installing W10, I did an F6 driver install of Intel RSTe so that Windows 10 could be installed on, and boot from, the Intel RAID1 VROC array.
    I used this setup for a few days and it appeared solid. I spent days installing and configuring software to create a useful desktop.

    PROBLEM
    This morning when I powered on the server, it entered the BIOS instead of booting Windows 10.
    When I checked the Intel VROC configuration, the RAID1 array was gone!
    The two Intel 7600p Pro NVMe drives are listed but are no longer assigned to the array.
    Everything I'm using is on the Asus / Intel VROC approved list (CPU, NVMe, VROC key, X299/VROC/RSTe).

    QUESTIONS
    What happened to the RAID1 array configuration in the BIOS?!

    Now what?
    Do I re-create the RAID1 VROC array in the BIOS?
    The BIOS states everything will be lost if I recreate the array.

    In the future, what happens if a BIOS update is done?
    Does the RAID array configuration get impacted?
    I assume no, because this would mean an OS re-install with every BIOS update.

    Any idea what caused the RAID1 array to be lost this time?
    Is this a known issue?

    BACKUP
    Yes, I have a valid backup created using Paragon.
    I assume I can recreate the RAID1 array in the BIOS, boot the Paragon recovery media, load the RSTe drivers and recover to the prior state.
    I ask the questions above because I didn't expect to need this backup with a VROC RAID1 array.
    Last edited by StewartT; 12-06-2018 at 09:34 PM.

  2. #2
    Banned JustinThyme
    JustinThyme PC Specs
    Laptop (Model): G752VY-DH72
    Motherboard: Rampage VI Extreme
    Processor: i9 9940X
    Memory (part number): 64GB DDR4 8x8 Corsair Dominator Platinum 3800 MHz @ C17
    Graphics Card #1: ASUS Strix 2080Ti O11G @ 2.1GHz
    Graphics Card #2: ASUS Strix 2080Ti O11G @ 2.1GHz
    Graphics Card #3: ROG NVLink
    Graphics Card #4: Have to feed animals
    Sound Card: External Audioengine D1 24 bit 192kbps DAC
    Monitor: ASUS PG348Q @ 100Hz
    Storage #1: Intel 905P 480GB U.2 flavor
    Storage #2: Samsung 850 EVO 1TB x2 in RAID 0, 960 PRO 1TB DIMM.2_1
    CPU Cooler: HeatKiller IV PRO and VRM blocks, dual D5 PWM serial, 2x 480, 1x 360 rads
    Case: Phanteks Enthoo Elite, 8x LL120 PWM, 3x LL140 PWM, 12x SP120 PWM, 1x AF140 PWM
    Power Supply: Corsair AX 1500i
    Keyboard: ASUS Claymore
    Mouse: ASUS Spatha, Logitech MX Master
    Headset: Sennheiser HD 700
    Mouse Pad: ASUS ROG Sheath
    Headset/Speakers: Audioengine A5+ with SVS SB-1000 Sub
    OS: Win10 Pro 1809
    Network Router: NetGear NightHawk X10
    Accessory #1: NetGear ProSafe 10GbE Switch
    Accessory #2: QNAP TVS-682 NAS modded with i7 CPU

    Join Date
    Nov 2013
    Reputation
    144
    Posts
    3,858

    Which BIOS are you on? I'm asking because some won't show the drives or the array in the BIOS if you created the array at the F6 stage. Mine was that way on early BIOSes. Look at the boot order and it's not listed correctly there either, it just says Windows Boot Manager. It still boots just fine and shows up in the RSTe utility as it should.

    Can you share more about your build?
    Exactly which drives and where they are installed?

    If you updated the BIOS after creating the array, all the settings go back to default. Simply put the settings back where they were before and it should pick the array right back up.
    Last edited by JustinThyme; 12-07-2018 at 04:45 AM.

  3. #3
    ROG Enthusiast Array
    Join Date
    Oct 2015
    Reputation
    10
    Posts
    42

    Getting RAID to work consistently on domestic gaming rigs is just not worth the hassle, especially now with fast SSDs and NVMe drives. In everyday use you won't notice much difference from a single drive. It will only show up in benchmarking, if you can stand the hassle.
    Many years ago I played around with RAID on HDDs, on SuperMicro server boards with LSI hardware RAID cards and also software RAID. I lost count of the number of times I lost or broke the RAID through updates, etc. In the end I just gave up on RAID and stuck to VelociRaptors, which were fast enough.
    RAID is fine in the server world, mostly, due to the infrastructure, sparsely spaced updates and 24/7 operation. Lately it has become more of a problem, with updates breaking RAID on servers, but the server environment is better geared up for it, with enterprise backup methods.
    Do yourself a favour and ditch the idea of RAID on gaming rigs; just set up with single NVMe drives. You won't regret it.

  4. #4
    ROG Member

    Hardware & software details

    This is the fourth, maybe fifth, Asus ROG board I've used over the past 10+ years in server builds.
    It has a great community; most issues get resolved after being reported.

    I'm not a gamer (take a look at the video adapter :-).
    I could have selected any one of a number of server motherboards at this price point (Intel, Supermicro, etc.);
    still, Asus ROG motherboards have a lot to offer.

    I need the features and functions ROG boards provide.
    In most cases, they have been excellent, feature-rich, stable motherboards
    that use higher-end parts that hold up under heavy use.

    Both (02x) Intel 7600p NVMe's are installed on the ROG DIMM.2 (riser) to use the same controller; the BIOS is set to CPU.

    [Attached images: sully-cfg01.jpg, sully-cfg02.jpg]

    Last edited by StewartT; 12-20-2018 at 12:10 AM.

  5. #5
    Banned JustinThyme

    Originally Posted by Riicckk:
    Getting RAID to work consistently on domestic gaming rigs is just not worth the hassle, especially now with fast SSDs and NVMe drives. In everyday use you won't notice much difference from a single drive. It will only show up in benchmarking, if you can stand the hassle.
    Many years ago I played around with RAID on HDDs, on SuperMicro server boards with LSI hardware RAID cards and also software RAID. I lost count of the number of times I lost or broke the RAID through updates, etc. In the end I just gave up on RAID and stuck to VelociRaptors, which were fast enough.
    RAID is fine in the server world, mostly, due to the infrastructure, sparsely spaced updates and 24/7 operation. Lately it has become more of a problem, with updates breaking RAID on servers, but the server environment is better geared up for it, with enterprise backup methods.
    Do yourself a favour and ditch the idea of RAID on gaming rigs; just set up with single NVMe drives. You won't regret it.
    Really? I've had RAID in about every imaginable combination running on "domestic" rigs for the last 30 years. You are right that it's not so much about speed these days, especially when there is still a DMI bottleneck for non-Intel drives downstream of the PCH. HOWEVER, there are many uses for RAID other than striping. Trust me when I say RAID doesn't get broken by updates in enterprise applications. Drives fail from time to time, and that's why there is RAID mirroring in the first place. A drive dies, you put in a new one and walk away; the array rebuilds itself.
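    The mirroring behaviour described above can be sketched as a toy model (illustration only; the class and method names are invented for this sketch, and real VROC mirroring lives in firmware and the driver, not in Python):

    ```python
    # Toy RAID1 model: every write lands on both members, reads can be served
    # by any healthy member, and a replaced member is rebuilt by copying the
    # survivor. This is a conceptual sketch, not how any real controller works.

    class ToyRaid1:
        def __init__(self, blocks):
            self.members = [[None] * blocks, [None] * blocks]  # two mirrors

        def write(self, lba, data):
            for m in self.members:
                if m is not None:
                    m[lba] = data          # mirror: the write hits both drives

        def read(self, lba):
            for m in self.members:
                if m is not None:
                    return m[lba]          # any surviving member serves reads
            raise IOError("array failed: no healthy members")

        def fail(self, idx):
            self.members[idx] = None       # simulate a dead drive

        def replace(self, idx):
            survivor = next(m for m in self.members if m is not None)
            self.members[idx] = list(survivor)  # rebuild = copy the survivor
    ```

    In this model, one member can die and reads still succeed from the other; after `replace()`, the new member holds the same data again, which is the "put in a new one and walk away" behaviour.
    
    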

    On to speed in RAID 0: no single NVMe by itself can crank out 10,000 MB/s sequential reads while also getting 4K random speeds of 200-300 MB/s.
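    To make the striping point concrete, here is a minimal sketch of how RAID0 spreads logical blocks across members (a simplified model with invented function names; real controllers use larger, configurable stripe sizes):

    ```python
    # Simplified RAID0 stripe mapping: logical block i lands on drive i % n.
    # A long sequential run engages every member, which is why sequential
    # throughput roughly scales with drive count, while a single small random
    # read still lands on just one drive and sees single-drive 4K speed.

    def stripe(lba, n_drives):
        """Map a logical block number to (drive index, offset on that drive)."""
        return lba % n_drives, lba // n_drives

    def drives_touched(first_lba, n_blocks, n_drives):
        """Count how many members service a transfer of n_blocks blocks."""
        return len({stripe(first_lba + i, n_drives)[0] for i in range(n_blocks)})
    ```

    A long sequential read touches both members of a two-drive array (`drives_touched(0, 8, 2)` is 2), while a lone small random read touches one (`drives_touched(5, 1, 2)` is 1), which is why RAID0 barely moves 4K random numbers.
    
    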

    Now on to the OP's dilemma.
    You can't RAID up two drives on the DIMM.2.
    You can RAID DIMM.2_2 with a PCIe drive, or DIMM.2_1 with an M.2 drive under the armour. The RAID with a PCIe drive and the DIMM.2_2 just became plausible in VROC with the 905P 380GB Intel drives. If I were just going after file space that needed to be fast, I'd install a Hyper M.2 x16 card with 4 of the 905P drives, but that's gonna run you 2 large. You can also do it with the 2.5-inch U.2 drives, leaving the cover off and using U.2-to-M.2 adapters, to get 10,000 MB/s sequential reads all day long.

  6. #6
    ROG Member

    RAID1 VROC fingers crossed or install the latest BIOS?

    Thank you for the reply.

    You can't RAID up two drives on the DIMM.2.
    You can RAID DIMM.2_2 with a PCIe drive, or DIMM.2_1 with an M.2 drive under the armour. The RAID with a PCIe drive and the DIMM.2_2 just became plausible in VROC with the 905P 380GB Intel drives. If I were just going after file space that needed to be fast, I'd install a Hyper M.2 x16 card with 4 of the 905P drives, but that's gonna run you 2 large. You can also do it with the 2.5-inch U.2 drives, leaving the cover off and using U.2-to-M.2 adapters, to get 10,000 MB/s sequential reads all day long.
    Respectfully, what is this statement based on?
    i.e. completed testing and/or documentation discovered?


    To be clear, the R6E BIOS does allow 02x Intel 7600p Pro NVMe's installed in the DIMM.2 NVMe riser to be seen & configured into a RAID1 VROC array. Intel VROC documents contain a list of approved non-Intel and Intel NVMe's, such as the 905p and others (please see test findings below).
    This RAID1 array can then be installed to and configured as a boot device for W10 using an F6 driver install of RSTe.

    As stated, I had one "unexplained" boot failure of the Intel VROC RAID1 array.
    From your prior post, it sounds like VROC has improved in the latest BIOS update?

    When I checked the state of the RAID1 array in the BIOS, after the boot failure, the RAID1 array configuration was gone.
    I had not entered or changed anything in the BIOS prior to this configuration loss.

    I recreated the RAID1 array, and the server has booted every day as expected for a week.
    Clearly, no one likes RAID1 array configurations disappearing from a BIOS.

    COMPLETED VROC TESTING
    I tested various R6E NVMe configurations that I expected to work, only to find later that
    some configurations were restricted (based on Intel X299 VROC documents).

    Thus far I've tested using 02x Samsung 970 Pro NVMe's, 02x Intel 760p NVMe's, and 02x Intel 7600p Pro NVMe's.
    FINDINGS -
    a) The DIMM.2 is only able to access 01x of 02x installed NVMe's when the NVMe is not Intel branded (only 01x of the 02x Samsung NVMe's could be seen).
    Intel VROC documents contain a list of approved non-Intel NVMe's that I expect would not have this limit.
    I assume 01x non-Intel NVMe could be placed in the DIMM.2, 01x non-Intel NVMe could be installed on the MB, or additional non-Intel NVMe's added via PCIe adapters.
    b) Without the Intel feature key installed, 02x Intel 760p NVMe's can be seen on the DIMM.2 riser and configured as RAID0. I tested this.
    I've read this RAID0 array cannot be used as a boot device due to Intel restrictions. I've not tested this but assume this stated limit exists.
    c) With the Intel feature key installed, 02x Intel 7600p Pro NVMe's can be seen on the DIMM.2 riser and configured as RAID0 or RAID1. I tested this.
    I set up a RAID1 array and installed W10 on this array as the server boot device. I assume a RAID0 array could also be configured as a W10 boot device.

    WINDOWS 7
    I attempted creating W7 install media with NVMe support, then using an F6 RSTe install.
    W7 is not able to see the RAID array, which prevents W7 installation on the array.

    This appears to be an RSTe vs. RST issue.

    NEXT STEPS
    I hope to meet with Intel in January to cover X299 VROC in greater detail.
    By then I'll have a few additional weeks of use completed.
    Last edited by StewartT; 12-20-2018 at 09:04 AM.
