On Sat, Jan 9, 2016 at 1:48 PM, Sean Owen wrote:
> (For similar reasons I personally don't favor supporting Java 7 or
> Scala 2.10 in Spark 2.x.)
That reflects my sentiments as well. Thanks Sean for bringing that up!
Jacek
Chiming in late, but my take on this line of argument is: these
companies are welcome to keep using Spark 1.x. If anything the
argument here is about how long to maintain 1.x, and indeed, it's
going to go dormant quite soon.
But using RHEL 6 (or any older version of any platform) and not
wanting
+1
Companies that use the stock Python 2.6 shipped with Red Hat will need
to upgrade or install a fresh version, which takes a total of 3.5
minutes, so no issues ...
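If support is dropped, PySpark could fail fast with a clear message instead of breaking obscurely later. Below is a minimal sketch of such a startup version guard; the function name, minimum version, and message are illustrative, not Spark's actual check:

```python
import sys

def require_python_27():
    """Raise if the interpreter predates Python 2.7.

    A sketch of a fail-fast startup guard; returns None when the
    running interpreter is new enough.
    """
    if sys.version_info < (2, 7):
        raise RuntimeError(
            "This Spark release requires Python 2.7 or newer; "
            "found %s" % sys.version.split()[0])

require_python_27()  # no-op on any modern interpreter
```

A guard like this costs nothing at runtime and turns an eventual SyntaxError deep inside the library into an actionable one-line error at launch.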
On Tue, Jan 5, 2016 at 2:17 AM, Reynold Xin wrote:
> Does anybody here care about us dropping support for Python 2.6 in Spark
> 2.0?
>
> Python 2.6 is ancient
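For context on why supporting 2.6 is a burden: several everyday constructs only arrived in Python 2.7, so any code using them breaks on 2.6. A small illustration (each of these is a SyntaxError or ImportError on 2.6):

```python
# Dict and set comprehensions: SyntaxError on Python 2.6
squares = {n: n * n for n in range(4)}
evens = {n for n in range(10) if n % 2 == 0}

# Auto-numbered format fields: "{}" required an explicit
# index like "{0}" on Python 2.6
msg = "{} {}".format("hello", "world")

# argparse only entered the standard library in Python 2.7
import argparse
parser = argparse.ArgumentParser(description="demo")
```

Keeping 2.6 alive means either avoiding all of the above in the codebase or carrying compatibility shims, on top of the extra test-matrix entry.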
Hi,
https://github.com/apache/spark/pull/10674
Please review and merge at your convenience. Thanks!
Pozdrawiam,
Jacek
Jacek Laskowski | https://medium.com/@jaceklaskowski/
Mastering Apache Spark
==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jace
Figured it out and reported
https://issues.apache.org/jira/browse/SPARK-12736. Fix is coming...
Pozdrawiam,
Jacek
Jacek Laskowski | https://medium.com/@jaceklaskowski/
Mastering Apache Spark
==> https://jaceklaskowski.gitbooks.io/mastering-apache-spark/
Follow me at https://twitter.com/jaceklaskow
> On 7 Jan 2016, at 19:55, Juliet Hougland wrote:
>
> @Reynold Xin @Josh Rosen: What is the current maintenance burden of supporting
> Python 2.6? What libraries no longer support Python 2.6, and where does
> Spark use them?
>
Generally the cost comes in the test matrix: one more thing to test.
Hi,
I think the change is related:
https://github.com/apache/spark/commit/659fd9d04b988d48960eac4f352ca37066f43f5c
as it touches the dependency in pom.xml.
Pozdrawiam,
Jacek
Jacek Laskowski | https://medium.com/@jaceklaskowski/
Mastering Apache Spark
==> https://jaceklaskowski.gitbooks.io/master
Hi,
With today's sources I'm facing "NoClassDefFoundError:
org/spark-project/guava/collect/Maps" while starting the standalone
Master using ./sbin/start-master.sh.
Is anyone working on it? Should I file an issue?
Spark Command: /Library/Java/JavaVirtualMachines/Current/Contents/Home/bin/java
-cp
/Users/jacek
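One quick way to triage a NoClassDefFoundError like the one above is to check whether the shaded class is actually packaged in the jar on the classpath. A small sketch that scans a jar's entries (the example jar path in the comment is hypothetical):

```python
import zipfile

def find_class_entries(jar, needle):
    """Return the entries in a jar whose path contains `needle`.

    Jar files are plain zip archives, so zipfile can list them.
    Useful for checking whether a shaded class such as
    org/spark-project/guava/collect/Maps is really packaged.
    """
    with zipfile.ZipFile(jar) as zf:
        return [name for name in zf.namelist() if needle in name]

# Example (jar path is hypothetical):
# hits = find_class_entries("assembly/target/spark-assembly.jar",
#                           "org/spark-project/guava/collect/Maps")
# print(hits or "class missing from the assembly jar")
```

An empty result would point at a packaging/shading problem in the build rather than a runtime classpath typo.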