Another pool - different array, different host, different workload.
And again, the summed read throughput across all disks in the pool is
roughly 10x higher than the throughput reported for the pool itself.

Any idea?

bash-3.00# zpool iostat -v 1
                                           capacity     operations    bandwidth
pool                                     used  avail   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
nfs-s5-1                                4.32T  16.1T    304    127  11.9M   506K
  raidz                                 4.32T  16.1T    304    127  11.9M   506K
    c4t600C0FF00000000009258F2411CF3D01d0      -      -    148     48  8.48M  67.8K
    c4t600C0FF00000000009258F6FA45D3801d0      -      -    148     48  8.48M  67.9K
    c4t600C0FF00000000009258F1820617F01d0      -      -    146     48  8.46M  67.9K
    c4t600C0FF00000000009258F24546FAC01d0      -      -    146     48  8.45M  67.9K
    c4t600C0FF00000000009258F5949030301d0      -      -    146     48  8.46M  67.9K
    c4t600C0FF00000000009258F24E8AADD01d0      -      -    146     48  8.45M  67.9K
    c4t600C0FF00000000009258F5FD5023B01d0      -      -    146     48  8.46M  67.9K
    c4t600C0FF00000000009258F17E7007801d0      -      -    146     48  8.46M  67.9K
    c4t600C0FF00000000009258F598F6BE701d0      -      -    146     48  8.46M  67.8K
--------------------------------------  -----  -----  -----  -----  -----  -----

                                           capacity     operations    bandwidth
pool                                     used  avail   read  write   read  write
--------------------------------------  -----  -----  -----  -----  -----  -----
nfs-s5-1                                4.32T  16.1T    508     72  34.5M   282K
  raidz                                 4.32T  16.1T    508     72  34.5M   282K
    c4t600C0FF00000000009258F2411CF3D01d0      -      -    254     25  14.1M  37.6K
    c4t600C0FF00000000009258F6FA45D3801d0      -      -    248     24  13.8M  38.1K
    c4t600C0FF00000000009258F1820617F01d0      -      -    247     26  13.9M  37.6K
    c4t600C0FF00000000009258F24546FAC01d0      -      -    240     26  13.8M  37.8K
    c4t600C0FF00000000009258F5949030301d0      -      -    243     25  14.0M  37.3K
    c4t600C0FF00000000009258F24E8AADD01d0      -      -    246     26  13.8M  38.3K
    c4t600C0FF00000000009258F5FD5023B01d0      -      -    242     25  13.6M  38.6K
    c4t600C0FF00000000009258F17E7007801d0      -      -    238     27  13.5M  39.4K
    c4t600C0FF00000000009258F598F6BE701d0      -      -    258     27  14.6M  39.7K
--------------------------------------  -----  -----  -----  -----  -----  -----

^C
bash-3.00#
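For what it's worth, the disparity can be checked by hand. The following snippet (not part of the original post, just a quick sanity check) sums the per-disk read bandwidth from the first `zpool iostat` sample above and compares it with the pool-level figure; in that particular one-second sample the aggregate is in the 6-7x range, and the second sample shows a smaller but still large gap:

```python
# Per-disk read bandwidth (MB/s) copied from the first sample above.
per_disk_reads = [8.48, 8.48, 8.46, 8.45, 8.46, 8.45, 8.46, 8.46, 8.46]
# Pool-level read bandwidth (MB/s) from the same sample.
pool_read = 11.9

total = sum(per_disk_reads)          # aggregate disk-level reads
ratio = total / pool_read            # how much more the disks read than the pool delivers
print(f"disks: {total:.2f}M  pool: {pool_read}M  ratio: {ratio:.1f}x")
```

The ratio varies per sample, but every sample shows the disks collectively reading far more data than the pool hands back to applications.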
 
 
This message posted from opensolaris.org
_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss
