But it is hard to know how long customers stay on the release they most
recently downloaded.

Cheers

On Thu, Apr 30, 2015 at 2:26 PM, Sree V <sree_at_ch...@yahoo.com.invalid>
wrote:

> If there is any possibility of getting the download counts, then we can use
> them as an EOS criterion as well. Say, if download counts drop below 30% (or
> another number) of the lifetime high, the release qualifies for EOS.
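>
> Expressed as code, the proposed rule is roughly the sketch below (the 30%
> threshold and all names are placeholders, not an existing API):
>
>     public class EosCheck {
>         // A release qualifies for end-of-support once its current download
>         // count drops below 30% (tunable) of its lifetime peak.
>         static boolean qualifiesForEos(long currentDownloads, long lifetimePeak) {
>             return currentDownloads < 0.30 * lifetimePeak;
>         }
>
>         public static void main(String[] args) {
>             System.out.println(qualifiesForEos(2500, 10000)); // true: 25% of peak
>         }
>     }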
>
> Thanking you.
>
> With Regards
> Sree
>
>
>      On Thursday, April 30, 2015 2:22 PM, Sree V
> <sree_at_ch...@yahoo.com.INVALID> wrote:
>
>
> Hi Team,
>
> Should we take this opportunity to lay out and evangelize a pattern for the
> EOL of dependencies? I propose we follow the official EOL of Java, Python,
> Scala, ... and add, say, 6-12-24 months depending on popularity. For Java 6
> that would look like:
>
> Java 6 official EOL: Feb 2013
> Add 6-12 months: Aug 2013 - Feb 2014 becomes the official End of Support
> for Java 6 in Spark
> Announce 3-6 months prior to EOS.
>
> Thanking you.
>
> With Regards
> Sree
>
>
>     On Thursday, April 30, 2015 1:41 PM, Marcelo Vanzin <
> van...@cloudera.com> wrote:
>
>
>  As for the idea, I'm +1. Spark is the only reason I still have jdk6
> around - exactly because I don't want to cause the issue that started
> this discussion (inadvertently using JDK7 APIs). And as has been
> pointed out, even J7 is about to go EOL real soon.
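>
> To make that concrete - just an illustrative guess at the kind of code
> involved, not the actual change that triggered this thread - java.nio.file
> below is new in Java 7, so it compiles fine on JDK7 but blows up with
> NoClassDefFoundError on a Java 6 runtime:
>
>     import java.nio.file.Files;
>     import java.nio.file.Paths;
>
>     public class Jdk7Only {
>         public static void main(String[] args) throws Exception {
>             // java.nio.file does not exist in Java 6, so a 1.6 JRE
>             // fails here with NoClassDefFoundError for Files/Paths.
>             byte[] bytes = Files.readAllBytes(Paths.get(args[0]));
>             System.out.println(bytes.length + " bytes read");
>         }
>     }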
>
> Even Hadoop is moving away (I think 2.7 will be j7-only). Hive 1.1 is
> already j7-only. And when Hadoop moves away from something, it's an
> event worthy of headlines. They're still on Jetty 6!
>
> As for PySpark, https://github.com/apache/spark/pull/5580 should get
> rid of the last incompatibility with large assemblies, by keeping the
> Python files in separate archives. If we remove support for Java 6,
> then we don't need to worry about the size of the assembly anymore.
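>
> For anyone who hasn't hit the assembly issue: Java 7's zip classes switch to
> the Zip64 format once an archive passes 65,535 entries, and Java 6's
> java.util.zip cannot read such archives. A minimal standalone sketch (not
> Spark code) that produces one:
>
>     import java.io.FileOutputStream;
>     import java.util.zip.ZipEntry;
>     import java.util.zip.ZipOutputStream;
>
>     public class Zip64Demo {
>         public static void main(String[] args) throws Exception {
>             ZipOutputStream zos =
>                 new ZipOutputStream(new FileOutputStream("big.zip"));
>             // Past 65,535 entries the archive needs Zip64 extensions,
>             // which only Java 7+ can write and read.
>             for (int i = 0; i < 70000; i++) {
>                 zos.putNextEntry(new ZipEntry("entry-" + i + ".txt"));
>                 zos.closeEntry();
>             }
>             zos.close();
>         }
>     }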
>
> On Thu, Apr 30, 2015 at 1:32 PM, Sean Owen <so...@cloudera.com> wrote:
> > I'm firmly in favor of this.
> >
> > It would also fix https://issues.apache.org/jira/browse/SPARK-7009 and
> > avoid any more of the long-standing 64K file limit thing that's still
> > a problem for PySpark.
>
> --
> Marcelo
