On 08/11/10 14:33, Jim Meyering wrote:
> Looks like I got very lucky here and hit a number of nanoseconds
> that happened to be a multiple of 100,000:
> 
>     $ for i in $(seq 1000); do touch -d '1970-01-01 18:43:33.5000000000' 2; \
>         t=$(stat -c "%.W %.X %.Y %.Z" 2); \
>         test $(echo "$t"|wc -c) -lt 57 && echo "$t"; done
>     0.000000 63813.500000 63813.500000 1289224045.731146
>     0.0000 63813.5000 63813.5000 1289224047.8224
>     [Exit 1]
> 
> I realize this is due to the way the precision estimation
> heuristic works.  Wondering if there's a less-surprising
> way to do that.
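
For reference, a minimal sketch of one plausible form
of that heuristic (my assumption, not necessarily the
exact stat logic): take the precision as 9 minus the
number of trailing zeros in the nanoseconds field:

    # Hypothetical model: precision = significant sub-second digits.
    ns=731146000   # nanoseconds from the first output line above
    p=9
    while [ $p -gt 0 ] && [ $((ns % 10)) -eq 0 ]; do
      ns=$((ns / 10))
      p=$((p - 1))
    done
    echo "estimated precision: $p"   # prints 6, matching ".731146"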

You could snap the estimated precision to milli,
micro, or nano, though that would just mean the
surprise happens less often.
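
For example, a snap of that sort under the same
assumed model (snap is just an illustrative name):

    # Snap an estimated precision up to the nearest of 0, 3, 6, or 9,
    # so e.g. 4 significant digits would still print as microseconds.
    snap() {
      if [ "$1" -le 0 ]; then echo 0
      elif [ "$1" -le 3 ]; then echo 3
      elif [ "$1" -le 6 ]; then echo 6
      else echo 9
      fi
    }
    snap 4   # prints 6; the ".8224" case above would become ".822400"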

> Now, I'm thinking that this variable precision feature would be better
> if it were somehow optional, rather than the default for %.X.
> Consistency and reproducibility are more important here.

You could touch -d '0.123456789' stat.prec.test
at program start to probe the resolution, but that
wouldn't always work: the directory might not be
writable, and read and write support for time stamp
resolutions can differ. :(
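
Concretely, the probe would look something like this,
using only existing touch/stat options (stat.prec.test
is just the scratch name from above):

    # Write a known fraction, read it back, and see how many of the
    # nine digits survive the round trip; clean up afterwards.
    touch -d '1970-01-01 00:00:00.123456789' stat.prec.test &&
      stat -c '%y' stat.prec.test &&   # e.g. "...00.123000000" => ms storage
      rm -f stat.prec.test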

You could sample X preexisting files/dirs on the
same file system, and stop once Y samples in a row
have not increased the estimated precision. Combined
with snapping to milli/micro/nano, that would usually
work, though it's starting to get too hacky IMHO
while still not being general.
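
A rough sketch of that sampling loop (X, Y, and the
directory are placeholders):

    # Sample up to X existing entries; stop once Y in a row add nothing.
    X=20 Y=5 best=0 stale=0 n=0
    for f in /some/dir/*; do
      n=$((n + 1)); [ "$n" -gt "$X" ] && break
      ns=$(stat -c '%y' "$f" | sed 's/.*\.\([0-9]*\) .*/\1/')
      sig=${ns%"${ns##*[1-9]}"}    # strip trailing zeros
      p=${#sig}                    # significant sub-second digits
      if [ "$p" -gt "$best" ]; then best=$p stale=0
      else stale=$((stale + 1)); fi
      [ "$stale" -ge "$Y" ] && break
    done
    echo "estimated precision: $best"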

I guess we're back to defaulting to 9 digits for %.Y
and using %#.Y to mean auto precision?
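
That would keep output reproducible by default; note
the '#' flag here is only proposed syntax, not
something stat supports at this point:

    stat -c '%.Y' file    # would always print 9 fractional digits
    stat -c '%#.Y' file   # proposed: auto (trimmed) precision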

cheers,
Pádraig.


