I'm hoping someone can help me understand a ZFS data corruption symptom. We have a zpool with checksums turned off. zpool status shows that data corruption occurred. The application using the pool at the time reported a read error, and zpool status (see below) shows 2 read errors on a device. The thing that is confusing to me is how ZFS determines that data corruption exists when reading data from a pool with checksums turned off.
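Incidentally, every RANGE in the persistent-error list below spans exactly 131072 bytes, which I believe matches the default 128 KiB ZFS recordsize, so each entry appears to correspond to one damaged record. A quick check using a handful of the reported ranges:

```python
# A few of the persistent-error ranges from the `zpool status -xv`
# output below (start and end byte offsets within each object).
ranges = [
    (2428895232, 2429026304),  # object 17
    (2463629312, 2463760384),  # object 17
    (2397700096, 2397831168),  # object 18
    (2432434176, 2432565248),  # object 18
    (2418933760, 2419064832),  # object 19
]

# Every range spans exactly 128 KiB (131072 bytes), the ZFS default
# recordsize, i.e. one record per persistent-error entry.
for start, end in ranges:
    assert end - start == 128 * 1024

print("each range covers exactly one 128 KiB record")
```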
Also, I'm wondering about the persistent errors in the output below. Since no specific file or directory is mentioned, does this indicate pool metadata is corrupt? Thanks for any help interpreting the output...

# zpool status -xv
  pool: zpool1
 state: ONLINE
status: One or more devices has experienced an error resulting in data
        corruption. Applications may be affected.
action: Restore the file in question if possible. Otherwise restore the
        entire pool from backup.
   see: http://www.sun.com/msg/ZFS-8000-8A
 scrub: none requested
config:

        NAME                                     STATE     READ WRITE CKSUM
        zpool1                                   ONLINE       2     0     0
          c4t60A9800043346859444A476B2D48446Fd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484352d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484236d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D482D6Cd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483951d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483836d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D48366Bd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483551d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483435d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D48326Bd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483150d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D483035d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D47796Ad0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D477850d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D477734d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D47756Ad0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D47744Fd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D477333d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D477169d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D47704Ed0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D476F33d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D476D68d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D476C4Ed0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D476B32d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D476968d0  ONLINE       0     0     0
          c4t60A98000433468656834476B2D453974d0  ONLINE       0     0     0
          c4t60A98000433468656834476B2D454142d0  ONLINE       0     0     0
          c4t60A98000433468656834476B2D454255d0  ONLINE       0     0     0
          c4t60A98000433468656834476B2D45436Dd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D487346d0  ONLINE       2     0     0
          c4t60A9800043346859444A476B2D487175d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D48705Ad0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486F45d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486D74d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486C5Ad0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486B44d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486974d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486859d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486744d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486573d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486459d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486343d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D486173d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D482F58d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485A43d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485872d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485758d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485642d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485471d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485357d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485241d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D485071d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484F56d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484E41d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484C70d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484B56d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484A2Dd0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484870d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D484755d0  ONLINE       0     0     0
          c4t60A9800043346859444A476B2D48462Dd0  ONLINE       0     0     0

errors: The following persistent errors have been detected:

          DATASET  OBJECT  RANGE
          zpool1   17      2428895232-2429026304
          zpool1   17      2429026304-2429157376
          zpool1   17      2429157376-2429288448
          zpool1   17      2429288448-2429419520
          zpool1   17      2429419520-2429550592
          zpool1   17      2463629312-2463760384
          zpool1   17      2463760384-2463891456
          zpool1   17      2463891456-2464022528
          zpool1   17      2464022528-2464153600
          zpool1   17      2464153600-2464284672
          zpool1   18      2397700096-2397831168
          zpool1   18      2397831168-2397962240
          zpool1   18      2397962240-2398093312
          zpool1   18      2398093312-2398224384
          zpool1   18      2398224384-2398355456
          zpool1   18      2432434176-2432565248
          zpool1   18      2432565248-2432696320
          zpool1   18      2432696320-2432827392
          zpool1   18      2432827392-2432958464
          zpool1   18      2432958464-2433089536
          zpool1   19      2418933760-2419064832
          zpool1   19      2419064832-2419195904
          zpool1   19      2419195904-2419326976
          zpool1   19      2419326976-2419458048
          zpool1   19      2453798912-2453929984
          zpool1   19      2453929984-2454061056
          zpool1   19      2454061056-2454192128
          zpool1   19      2454192128-2454323200

This message posted from opensolaris.org
_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss