Page 1 of 2
Results 1 to 10 of 12
  1. #1
ROG Member
    Join Date
    Mar 2019
    Reputation
    10
    Posts
    9

    Bootable VROC RAID 10, VMD domains, and PCIe lanes

    Intended Hardware/RAID Setup:

As I understand it from this picture, bootable RAID arrays support a maximum of 4 drives because they must reside within a single VMD domain, while non-bootable arrays can span multiple VMD domains.

[Attachment: slide-3.png, 232.9 KB]

I don't know how the VMD domains are distributed on this board, so I don't know whether the arrays can be configured this way:

    Onboard:
    M.2_1 (lower M.2 slot, too short for a 905P) Data drive 7
    M.2_2 (upper M.2 slot, shares bandwidth with the U.2 port and may be PCH only) No drive

    DIMM.2:
    M.2_1 Boot drive 1
    M.2_2 Boot drive 2

    M.2 x16 Card 1:
    M.2_1 Boot drive 3
    M.2_2 Boot drive 4
    M.2_3 Data drive 1
    M.2_4 Data drive 2

    M.2 x16 Card 2:
    M.2_1 Data drive 3
    M.2_2 Data drive 4
    M.2_3 Data drive 5
    M.2_4 Data drive 6

    Will this configuration work?
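One way to reason about the bootable-array question is to check whether all the boot drives land in a single VMD domain. The sketch below is a minimal model of that rule; the slot-to-domain numbers are pure assumptions for illustration (the real mapping is board-specific and has to be confirmed in the BIOS), as is the idea that each x16 card sits in its own domain.

```python
# Hypothetical slot-to-VMD-domain map. These domain numbers are
# ASSUMPTIONS for illustration; the actual mapping depends on the board.
SLOT_DOMAIN = {
    "DIMM.2_M.2_1": 0, "DIMM.2_M.2_2": 0,
    "Card1_M.2_1": 1, "Card1_M.2_2": 1,
    "Card1_M.2_3": 1, "Card1_M.2_4": 1,
}

def bootable(slots):
    """Per the slide: a bootable VROC array must sit in ONE VMD domain."""
    domains = {SLOT_DOMAIN[s] for s in slots}
    return len(domains) == 1

# The proposed boot array mixes DIMM.2 and card slots:
boot_array = ["DIMM.2_M.2_1", "DIMM.2_M.2_2", "Card1_M.2_1", "Card1_M.2_2"]
print(bootable(boot_array))  # False: it spans domains 0 and 1 as assumed here
```

Under these assumed domains, a 4-drive boot array split across the DIMM.2 and a Hyper card would span two domains and therefore would not be bootable; all four boot drives would need to sit on the same card/domain.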

Also, the CPU supports 44 PCIe lanes and there will be an x16 graphics card. How many 970 EVOs would need to be dropped to keep the graphics card running at a full x16?
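The lane question comes down to simple arithmetic. This is a back-of-the-envelope sketch, assuming each NVMe drive on a Hyper-style card gets dedicated x4 CPU lanes; note that some boards also reserve a few CPU lanes for onboard devices, which would lower the count further.

```python
CPU_LANES = 44        # 44-lane Skylake-X SKUs (e.g. 7900X/7980XE)
GPU_LANES = 16        # keep the graphics card at a full x16
LANES_PER_NVME = 4    # each CPU-attached NVMe drive uses an x4 link

# Rough budget: lanes left over after the GPU, divided among drives.
remaining = CPU_LANES - GPU_LANES
max_drives = remaining // LANES_PER_NVME
print(max_drives)  # 7 drives fit alongside an x16 GPU under these assumptions
```

So at most 7 CPU-attached x4 drives can coexist with an x16 GPU on a 44-lane CPU; any lanes the board carves out for onboard slots or other devices reduce that number.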

  2. #2
ROG Guru: Orange Belt
G75rog PC Specs:
Motherboard: R VI Apex
Processor: i9-7900X
Memory (part number): 64GB G.Skill Trident Z
Graphics Card #1: AMD Radeon VII, EKWB water block
Graphics Card #2: Waste of PCIe lanes in an NVMe environment
Sound Card: Stock
Monitor: LG 34" Ultra Wide 3440x1440
Storage #1: 4x 960 Pro 2TB, 2x 970 Evo 2TB
Storage #2: 1x 960 Pro 1TB, 1x 970 Pro 1TB
CPU Cooler: EKWB CPU, Swiftech res/pump, 400mm rad
Case: Thermaltake View 91
Power Supply: NZXT Hale 90 1200W
Keyboard: Logitech G910
Mouse: Razer Ouroboros
OS: W10 64 1809, 1903, Ubuntu Linux
Network Router: Netgear R9000 + 4x R9000 APs, 10Gb switch
Accessory #1: Hyper M.2 X16 V2
Accessory #2: 10Gb network via Intel 550 NIC
Accessory #3: 2x 10Gb Asustor NAS 7000 series

    Join Date
    Jan 2013
    Reputation
    13
    Posts
    253

You'll have to drop one Hyper X16 card in order to run x16 on the other one. If you run a Hyper X16 at x8 you will only see 2 drives.

You would be better served running all the 905Ps in the Hyper X16 and putting the data drives on the DIMM.2.

The ASUS X299 Sage WS or X299 Sage/10G uses PLX 8747 PCIe switches and would allow x16 to the GPU and both Hyper X16 cards.

imho


I currently run the R6 Apex with 8 onboard 1 & 2 TB Samsungs without RAID.

  3. #3
ROG Guru: Orange Belt
    Join Date
    Oct 2017
    Reputation
    10
    Posts
    257

Quote Originally Posted by G75rog
You'll have to drop one Hyper X16 card in order to run x16 on the other one. If you run a Hyper X16 at x8 you will only see 2 drives.

You would be better served running all the 905Ps in the Hyper X16 and putting the data drives on the DIMM.2.

The ASUS X299 Sage WS or X299 Sage/10G uses PLX 8747 PCIe switches and would allow x16 to the GPU and both Hyper X16 cards.

imho


I currently run the R6 Apex with 8 onboard 1 & 2 TB Samsungs without RAID.
I run 2 Hyper X16s, both at x16 with the lanes divided into four x4 links. I had to put my GPU in an x8 slot to accomplish this.

  4. #4
ROG Guru: Orange Belt
    Join Date
    Oct 2017
    Reputation
    10
    Posts
    257

One note: I use Intel 900P SSDs, 8 of them. Not sure about the Samsungs working; maybe with a VROC key?

  5. #5
ROG Guru: Orange Belt
    Join Date
    Oct 2017
    Reputation
    10
    Posts
    257

BTW, the controllers worked this way for me. There are 3: Controller 0 is the 1st Hyper X16, Controller 1 is the 2nd Hyper X16, and Controller 2 is the DIMM.2 card (I have a single Samsung 960 Pro on that card).

  6. #6
ROG Guru: Orange Belt
    Join Date
    Oct 2017
    Reputation
    10
    Posts
    257

I am not sure your config will work, but if it does you will be spanning controllers, and I know you cannot boot off spanned controllers; at least it is not supported. So I don't think your configuration will work, or at least you shouldn't do it. Even if you found a way to get the spanned controllers to boot (maybe there is a hack), your OS won't upgrade properly in the future. It will be painful.

  7. #7
ROG Member
    Join Date
    Mar 2019
    Reputation
    10
    Posts
    9

Thanks, guys. I would like to keep the graphics card at x16, so it seems my options for the non-bootable data array are either to use fewer drives and keep this motherboard, or to switch to the X299 Sage/10G and keep all of the drives but lose a few extras that this motherboard has.

I prefer this motherboard, so I think I'll go with that and perhaps build the non-bootable data array out of SATA drives in a non-VROC RAID.

  8. #8
ROG Member Jahmen
    Join Date
    May 2017
    Reputation
    10
    Posts
    9

I was going to ask how you could physically fit two Hyper M.2 cards into the board's PCIe slots alongside the graphics card. Then I saw this was posted back in 2017, when graphics cards were smaller.

My ROG Rampage VI Extreme Encore motherboard with the ASUS 2080 Ti graphics card can't handle the physical configuration for 2 Hyper M.2 cards.
The best arrangement is the Hyper card in the 1st PCIe slot (the x16 slot running at x8) and the graphics card in the 2nd PCIe slot.
The graphics card won't fit in the 3rd (bottom) PCIe slot.

You can't put the graphics card in the top (1st) slot with a Hyper M.2 card under it in the 2nd PCIe slot, because the Hyper card covers the graphics card's intake fans.

Does anyone know how to set up a bootable RAID 10 on the CPU with a hardware VROC key module, the Hyper M.2 card, and 4x 7600p series SSDs?
Yeah, I figure probably not. I've been asking ASUS Support for that information for a couple of weeks now.

Feel free to help out and post those steps if you know them, and if you do, please include the driver & application file steps.

The ASUS RAID guide has some steps, but not those for the hardware key BIOS setup or the driver & application files.

There appears to be no consensus yet on exactly which of those key files (RSTe or VROC) to use for the hardware key.
    Last edited by Jahmen; 02-10-2020 at 07:03 PM.
    Go do crazy some place else, we're all stocked up here!

  9. #9
New ROGer
    Join Date
    Feb 2019
    Reputation
    10
    Posts
    31

Quote Originally Posted by Jahmen
I was going to ask how you could physically fit two Hyper M.2 cards into the board's PCIe slots alongside the graphics card. Then I saw this was posted back in 2017, when graphics cards were smaller.

My ROG Rampage VI Extreme Encore motherboard with the ASUS 2080 Ti graphics card can't handle the physical configuration for 2 Hyper M.2 cards.
The best arrangement is the Hyper card in the 1st PCIe slot (the x16 slot running at x8) and the graphics card in the 2nd PCIe slot.
The graphics card won't fit in the 3rd (bottom) PCIe slot.

You can't put the graphics card in the top (1st) slot with a Hyper M.2 card under it in the 2nd PCIe slot, because the Hyper card covers the graphics card's intake fans.

Does anyone know how to set up a bootable RAID 10 on the CPU with a hardware VROC key module, the Hyper M.2 card, and 4x 7600p series SSDs?
Yeah, I figure probably not. I've been asking ASUS Support for that information for a couple of weeks now.

Feel free to help out and post those steps if you know them, and if you do, please include the driver & application file steps.

The ASUS RAID guide has some steps, but not those for the hardware key BIOS setup or the driver & application files.

There appears to be no consensus yet on exactly which of those key files (RSTe or VROC) to use for the hardware key.


So I have this setup:
ASUS ROG RAMPAGE VI EXTREME
1x ASUS HYPER M.2 X16 CARD V1
4x Intel Optane 900P connected to the ASUS HYPER M.2 X16 CARD
1x EVGA 2080 Ti graphics card
1x Intel VROC Intel-SSD-Only key (VROCISSDMOD) installed
+ a number of other SSDs, etc. I even tried the U.2 port with a 6.4TB Intel P4600, which also works, BUT you then cannot use the onboard M.2 slot (riser card) in CPU mode; you need to route the M.2 slot through the chipset's shared x4 lanes. I experimented with these configurations back and forth with various BIOS settings, but I no longer use the U.2 port; might as well route everything through the ASUS HYPER M.2 X16 CARD and keep using the M.2 slot.

I've had both of these configurations set up and working:
1. Graphics card in slot 3 and Hyper X16 card in slot 1 (I had to use this setup initially because my CPU air cooler was too big and conflicted with any card in PCIe slot 1).
2. Graphics card in slot 1 and Hyper X16 card in slot 3 (I delidded my 7980XE and switched to water cooling, which let me put the graphics card back in PCIe slot 1).


Both configurations work, but 2 is better in my view.

I currently have configuration 2 running 24/7; it is the best setup for performance, but both are confirmed working 24/7.

I see no reason why it would not work to add an additional ASUS HYPER M.2 X16 CARD in PCIe slot 4, BUT if you do, you can only use 8 of its 16 lanes. The reason is that you simply do not have enough PCIe lanes to the CPU, even with a 7980XE (44 is the max).

Simple PCIe lane math: 16 lanes (slot 1, for the graphics card) + 16 lanes (slot 3, for the Hyper X16 card) + 8 lanes (slot 4, for the 2nd Hyper X16 card) = 40, plus another 4 lanes for the chipset etc. (cannot change that) = 44, which is all the 7980XE supports.
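The lane math above can be sanity-checked in a few lines. The slot widths and the 4-lane chipset reservation are taken from the post as stated (assumptions, not verified board facts), and the drive count uses the earlier observation that a Hyper M.2 card exposes one drive per x4 lane group, so an x8 link shows only 2 of its 4 drives.

```python
# Slot widths as described in the post (assumed values).
slots = {"gpu@slot1": 16, "hyper@slot3": 16, "hyper@slot4": 8}
chipset = 4  # lanes the poster attributes to the chipset etc.

# The allocation exactly fills the 7980XE's 44-lane budget.
assert sum(slots.values()) + chipset == 44

# One NVMe drive is visible per x4 group on each Hyper card:
drives_visible = sum(w // 4 for name, w in slots.items() if "hyper" in name)
print(drives_visible)  # 6 drives usable across the two cards (4 + 2)
```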

So that's the best you can do on an R6E with a 7980XE.
    Last edited by Int8bldr; 02-11-2020 at 04:21 AM.

  10. #10
ROG Enthusiast
Aysberg PC Specs:
Motherboard: ASUS ROG RAMPAGE VI EXTREME
Processor: Intel Core i9-7980XE
Memory (part number): 4 x F4-3200C14-16GTZ
Graphics Card #1: ASUS ROG STRIX GTX 1080 Ti GAMING OC
Sound Card: Creative SB Zx
Monitor: 1x LG 34GK950G / 2x HP DreamColor Z27x
Storage #1: 2x Samsung NVMe 970 EVO 2TB / 3x Samsung NVMe 960 EVO 1TB
Storage #2: 4x SanDisk SDSSDHII960G
CPU Cooler: cuplex kryos NEXT
Case: Lian Li PC-A75X
Power Supply: Corsair HX1200i
Keyboard: Corsair K95 RGB
Mouse: Roccat KONE XTD Optical
Headset: AKG K712 + ModMic 5
OS: Windows 10 Pro x64 1809
Network Router: MikroTik CRS317-1G-16S+RM
Accessory #1: Mellanox ConnectX-3
Accessory #2: Asus Hyper M.2 X16 Card

    Join Date
    Nov 2013
    Reputation
    18
    Posts
    65

Just out of curiosity, what are you doing with such a setup? Is this just a "because I can" scenario? If I needed that massive storage performance, I would simply switch platforms and could achieve more with much less hassle.

I am running an Omega with 2x 2080 Ti, doing 3D rendering and post-production, and a RAID 0 of two Samsung NVMe drives can handle the load and is still not at its limit. The OS boots from another NVMe, and the rest is stored on classic SSDs or even a spinning drive.

