Channel: ASRock Forums

Intel Motherboards : RAID Setup Menu

Author: parsec
Subject: RAID Setup Menu
Posted: 01 Jan 2017 at 11:29am

Originally posted by eComposer:

Thanks Parsec,

Interestingly, I had followed all the steps listed above (except I used a 64 KB stripe size per Tweaktown, and did not clear the CMOS).

I'm not sure what is going wrong, it just won't give the RAID 0 array as an option to install Windows to...

From what you're saying though, RAID 0 has performance penalties compared to just using a single SSD.

I'm mixing music with multiple channels (often 50+ depending on using "stems"), and each has multiple plugins running (including many high end emulations and samples etc, effects, amp simulators, console emulations etc etc), and I/O is key to avoid drop outs, distortion and other issues detracting from the real time audio output.

Would you suggest just using a single PCIe 3.0 x4 SSD as the boot drive vs RAID 0, as I/O is the objective here to support the best mixing conditions I can achieve? (Essentially looking for the best streaming config available.)

I thought RAID 0 using the M.2 PCIe 3.0 x4 slots leveraging NVMe via the ASRock Z170 Extreme7+ was supposed to deliver the best I/O, but your comment seems to indicate the reverse (that is, non-RAID beats RAID 0).

FYI:  I'm using the Thunderbolt 2 ASRock card with a high end Thunderbolt audio interface to minimize latency etc (especially when recording multiple tracks "real time" and monitoring vs mixing).

RAID 0 or not? - This would be helpful to know, since I've held off a full update to build a new config from scratch, aiming to reinstall everything on RAID 0 (with frequent backups set up to guard against RAID 0 failure). Maybe avoiding RAID 0 is the better option then?


That's strange that the RAID 0 array can't be seen by the Win 10 installer. I went through the procedure again, and I don't think I left anything out. The RAID 0 array won't be shown as a drive until after the IRST "F6" driver is installed. Can you do a "Refresh" in the main Custom installation screen, to look for drives again?

I did not mention that you must format the RAID 0 array after the F6 driver is loaded. All you do is click the New button and the installer will format the RAID array correctly, as GPT and all required partitions.

Also do not remove the USB drive with the F6 driver from the PC until the Windows installation is complete. It's been a while since I've used a RAID 0 array of 950 Pros. But I know it works. The 960 should be no different than a 950 in RAID for a Windows installation.

When the Z170 Extreme7+ board was first released, we had a thread in this forum about creating and using 950 Pros in RAID 0 arrays. Several forum members and I worked out the details ourselves. One guy had three 950 Pros in RAID 0. At that time, with the very first IRST driver (14.0...) that supported NVMe SSDs in RAID with Z170 boards, the difference between the benchmark results of two vs three SSDs was minimal, at best 500 MB/s faster for sequential read speed. That guy was disappointed, but we never could improve the results. Also, anything less than a 64K stripe size would result in terrible benchmark results with 950 Pros in RAID 0. At that time with IRST version 14, we all agreed the 128K stripe size was the best. If that has changed with newer versions of IRST, great.
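To make the stripe size discussion concrete, here's a small sketch (my own illustration, not anything from Intel's IRST tooling) of how RAID 0 maps a logical address onto the member drives for a given stripe size. The constants are just example values:

```python
# Sketch of RAID 0 striping: logical offset -> (drive, offset on drive).
# Example values only; a real array's metadata layout differs.

STRIPE_SIZE = 64 * 1024   # 64 KiB stripe, the minimum we found usable
NUM_DRIVES = 2            # two-drive array

def locate(logical_offset):
    """Return (drive_index, drive_offset) for a logical byte offset."""
    stripe_index = logical_offset // STRIPE_SIZE
    drive = stripe_index % NUM_DRIVES             # stripes rotate across drives
    stripe_on_drive = stripe_index // NUM_DRIVES  # full rotations so far
    offset = stripe_on_drive * STRIPE_SIZE + (logical_offset % STRIPE_SIZE)
    return drive, offset

# A 256 KiB sequential read alternates between the two drives:
for chunk in range(4):
    drive, off = locate(chunk * STRIPE_SIZE)
    print(f"stripe {chunk}: drive {drive}, offset {off // 1024} KiB")
```

The point: only transfers larger than one stripe get split across drives at all, which is part of why small stripe sizes hurt - small I/Os still land on a single drive, but every request pays the RAID layer's overhead.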

Personally, I always configure a full UEFI booting installation, meaning CSM set to Disabled. The only problem with that is your video source must be GOP compatible, a UEFI booting protocol. Older video cards (pre-Nvidia 700 series) may not be GOP compatible without a VBIOS update. No idea about ATI/AMD video cards. EVGA 600 series cards needed a VBIOS update to be GOP compatible, but it worked. Intel integrated graphics is GOP compatible since Sandy Bridge.

Regarding the articles about how fast and great RAID 0 arrays of NVMe SSDs are: By all means, be my guest and use them! The only way to really know what they are like is to have and use them.

I'll make one comment about the articles, the graphs in particular. Yes, you can see the clear difference in the benchmark tests with the RAID 0 arrays, with their multi-hundreds of thousands of IOPS. But check the axis of the graphs labeled Queue Depth (QD). QD is the number of outstanding IO requests waiting to be serviced by the drive or RAID array. NVMe SSDs have even better high-QD performance, and better 4K random read performance, than SATA SSDs.

It is well known that in home PC usage, since even a single SSD is so fast, the number of outstanding IO requests is rarely, if ever, above four. That is called a QD of 4, or QD = 4, and that statistic was established with SATA SSDs.

Notice in the test graphs, the maximum is QD=32 for IOPS, and QD=16 for latency. Unless you are hosting a website on your PC, or running database queries against millions of records, you'll never be doing IO at even QD=4. In short, yes, the performance potential is there, but most people never use it. I can't predict what benefits you will get from your usage case, but do you think you will ever use the ~200,000 IOPS (IO operations per second) of a single NVMe SSD? Do we need the 400,000+ IOPS of the RAID 0 array?
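As a back-of-envelope check (my own numbers for illustration, not from any specific benchmark), here is what those IOPS figures mean in bandwidth terms at a 4 KiB transfer size, and roughly what queue depth it takes to sustain them:

```python
# Rough arithmetic: IOPS -> bandwidth, and the QD needed to sustain it.
# The IOPS and latency figures are illustrative round numbers.

IO_SIZE = 4 * 1024  # 4 KiB random reads, the usual benchmark transfer size

def bandwidth_mb_s(iops):
    """Bandwidth in MB/s implied by an IOPS figure at IO_SIZE per request."""
    return iops * IO_SIZE / 1_000_000

single_ssd = 200_000  # ~200K IOPS, single NVMe SSD at high QD
raid0_pair = 400_000  # ~400K+ IOPS, two-drive RAID 0 array

print(f"single SSD: {bandwidth_mb_s(single_ssd):.0f} MB/s")
print(f"RAID 0:     {bandwidth_mb_s(raid0_pair):.0f} MB/s")

# Little's law: concurrency = throughput * latency. At ~100 microseconds
# per IO, keeping 200K IOPS busy requires about 20 requests in flight --
# far above the QD of 1-4 typical of desktop workloads.
required_qd = single_ssd * 100e-6
print(f"QD needed to sustain 200K IOPS: {required_qd:.0f}")
```

Which is the point above in numbers: a desktop workload sitting at QD 1-4 simply never generates enough outstanding requests to cash in the RAID array's extra IOPS.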

I'm also very certain that a RAID 0 array of NVMe SSDs will not boot Windows faster than a single identical NVMe SSD. The same is true of SATA SSDs in RAID 0. To check for yourself: from a cold start/boot, run Task Manager, click the Startup tab, and check the Last BIOS Time at the top right.
