Where I work, we're used to dealing with time-series data, and this really 
sounds like developers making the mistake of assuming that there are always 
60 seconds in a minute. Working with time is not as trivial as it sounds, 
once you start rolling up data views and interpolating within and between 
steps. In other words, I doubt that Java or the JVM is directly to blame 
here; it sounds more like erroneous assumptions, but detail on the matter is 
limited so far.
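As a small illustration of that assumption (my own sketch, not tied to any of the affected systems): the June 2012 leap second meant the UTC minute 2012-06-30T23:59 contained 61 seconds, yet the Java time scale, like Unix time, ignores leap seconds, so any bucketing or interpolation built on "one minute = 60 seconds" silently disagrees with elapsed real time. This uses `java.time`, which didn't exist in 2012, purely for demonstration:

```java
import java.time.Duration;
import java.time.Instant;

public class LeapSecondSketch {
    public static void main(String[] args) {
        // The leap second 2012-06-30T23:59:60Z is invisible on this
        // time scale: Instant cannot even represent it.
        Instant before = Instant.parse("2012-06-30T23:59:00Z");
        Instant after  = Instant.parse("2012-07-01T00:00:00Z");

        // Naive rollup assumption: the minute spans exactly 60 seconds.
        long naiveSeconds = Duration.between(before, after).getSeconds();

        // Prints 60, although 61 real (SI) seconds elapsed on UTC clocks.
        System.out.println(naiveSeconds);
    }
}
```

Code written against this scale never sees the extra second; the trouble starts when some other layer (the kernel clock, NTP, a data source) does account for it and the two views have to be reconciled.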


On Tuesday, July 3, 2012 11:13:57 AM UTC+2, fabrizio.giudici wrote:
>
> > I wonder if this is also related to the leap second which caused Reddit 
> > (using Cassandra, on the JVM), Mozilla (using Hadoop, also based on the 
> > JVM) and FourSquare, LinkedIn, StumbleUpon and Gawker to show hiccups. 
>
> ... which somewhat surprised me, not because I can't figure out how a leap 
> second can break things, but because there have been at least two dozen 
> leap seconds since 1970, so I presume a good deal of leap seconds in the 
> last ten years, when Linux and Java already had a relevant share. So, 
> what's the news? Did something new really happen, or did things break in 
> the past too, but without this kind of news coverage? 
>

-- 
You received this message because you are subscribed to the Google Groups "Java Posse" group.
To view this discussion on the web visit https://groups.google.com/d/msg/javaposse/-/b3F7A9_2Nr8J.
To post to this group, send email to javaposse@googlegroups.com.
To unsubscribe from this group, send email to javaposse+unsubscr...@googlegroups.com.
For more options, visit this group at http://groups.google.com/group/javaposse?hl=en.