I recently started setting up a homegrown OpenSolaris NAS with a large RAIDZ2
pool, and found its RAIDZ2 performance severely lacking - downright atrocious,
in fact. As originally set up:

* Asus M4A785-M motherboard
* Phenom II X2 550 Black CPU
* JMB363-based PCIe x1 SATA card (2 ports)
* SiI3132-based PCIe x1 SATA card (2 ports)
* Six on-board SATA ports

Two 500 GB drives (one Seagate, one WD) serve as the root pool, and have 
performed admirably. The other eight 500 GB drives (4 Seagate, 4 WD, in a 
RAIDZ2 configuration) performed quite poorly, with lots of long freezeups and 
no error messages. Even streaming a 48 kHz/24-bit FLAC via CIFS would 
occasionally freeze for 5-10 seconds, with no other load on the file server. 
Such freezeups became far more likely with other activity - forget about 
streaming video if a scrub was going on, for instance. These pauses were NOT 
accompanied by any CPU activity, but they showed up clearly when I watched the 
array's activity in GKrellM.
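
For anyone chasing similar stalls, per-disk latency is worth watching alongside 
the pool - a rough sketch of what I mean (the pool name "tank" below is just a 
placeholder for your own RAIDZ2 pool):

    # per-disk extended stats (service times, %busy), refreshed every 5 seconds
    iostat -xn 5

    # per-vdev ops and bandwidth for the pool itself
    zpool iostat -v tank 5

Disks sitting idle or showing huge service times during a stall point at the 
controller path rather than the drives themselves.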

I started to get the feeling that I was running into a bad I/O bottleneck. I 
don't know how many PCIe lanes are being used by the onboard ports, and I'm now 
of the opinion that two-port PCIe x1 SATA cards are a Very Bad Idea for 
OpenSolaris. Today, I replaced the motley assortment of controllers with an 
Intel SASUC8I to handle the RAIDZ2 array, leaving the root pool on two of the 
onboard ports. Having already had a heart-attack moment last week after 
rearranging drives, *this* time I knew to do a "zpool export" before powering 
the system down. :O
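
In case it saves anyone else the same scare, the swap boils down to this 
("tank" again stands in for the real pool name):

    # before powering down to swap controllers
    zpool export tank

    # after booting with the new controller installed
    zpool import tank

Exporting first means the pool imports cleanly no matter which controller or 
port each disk lands on afterwards.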

The card worked out-of-the-box, with no extra configuration required. WOW, what 
a difference! I tried a minor stress-test: viewing some 720p HD video on one 
system via NFS, while streaming music via CIFS to my XP desktop. Not a single 
pause or stutter - smooth as silk. Just for kicks, I upped the ante and started 
a scrub on the RAIDZ2. No problem! Finally, it works like it should!
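
For reference, kicking off the scrub while the streams kept running is a 
one-liner (pool name is a placeholder):

    # start a background scrub of the RAIDZ2 pool
    zpool scrub tank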

The scrub is going about twice as fast overall, with none of the herky-jerky 
action I was getting with the mix-and-match SATA interfaces.
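
The speed comparison is just from watching the progress line that zpool status 
prints while the scrub runs:

    # shows scrub progress, estimated completion time, and any errors found
    zpool status tank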

An interesting note about the SASUC8I - the name "Intel" doesn't appear 
anywhere on the card. It's basically a repackaged LSI SAS3081E-R (it's even 
labeled as such on the card itself and on the antistatic bag), and it came as 
just a card in a box with an additional low-profile bracket for 1U cases - no 
driver CD or cables. I knew that it didn't come with cables, and ordered them 
separately. If I had ordered the LSI kit with cables from the same supplier, it 
would have cost about $80 more than getting the SASUC8I and cables separately.

If you're building a NAS, and have a PCIe x8 or x16 slot handy, this card is 
well worth it. Leave the two-port cheapies for workstations.