Re: Use Hadoop-3.2 as a default Hadoop profile in 3.0.0?

2020-07-11 Thread gpongracz
As someone who mainly operates in AWS, it would be very welcome to have the option to use an updated version of Hadoop with PySpark sourced from PyPI. Acknowledging the issues of backwards compatibility... The most vexing issue is the lack of ability to use s3a STS, i.e. org.apache.hadoop.fs.s3a.Te
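The class name above is cut off in the archive. For context, a minimal sketch of the kind of s3a STS setup being discussed, assuming the hadoop-aws `TemporaryAWSCredentialsProvider` and placeholder credential values (the `...` values are illustrative, not from the message):

```
# core-site.xml / Hadoop configuration properties for s3a with STS session credentials
fs.s3a.aws.credentials.provider = org.apache.hadoop.fs.s3a.TemporaryAWSCredentialsProvider
fs.s3a.access.key    = <temporary access key>
fs.s3a.secret.key    = <temporary secret key>
fs.s3a.session.token = <STS session token>
```

Support for the session-token property depends on the hadoop-aws version bundled with Spark, which is the crux of the poster's request for a newer default Hadoop profile.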

Starting work on last Scala 2.13 updates

2020-07-11 Thread Sean Owen
I call attention to https://github.com/apache/spark/pull/28971, which represents part 1 of several changes that form the last large change for Scala 2.13, aside from REPL updates: dealing with the fact that default collection types will be immutable. The goal of course is to cross-compile without se
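The collection change being referenced can be sketched as follows (the names here are illustrative, not taken from the PR): in Scala 2.13, `scala.Seq` is an alias for `scala.collection.immutable.Seq`, so code that passed a mutable `Buffer` where a `Seq` was expected no longer compiles without an explicit conversion. Adding `.toSeq` is one idiom that compiles on both 2.12 and 2.13.

```scala
import scala.collection.mutable

object DefaultCollections {
  // Under 2.12, scala.Seq was collection.Seq, a supertype of mutable.Buffer,
  // so `buf` alone would have sufficed. Under 2.13, Seq is immutable.Seq,
  // so an explicit copy/conversion is needed; .toSeq works on both versions.
  def asSeq(buf: mutable.Buffer[Int]): Seq[Int] = buf.toSeq

  def main(args: Array[String]): Unit = {
    val buf = mutable.Buffer(1, 2, 3)
    val s   = asSeq(buf)
    assert(s == Seq(1, 2, 3))
    // Mutating the buffer afterwards does not affect the immutable copy (2.13).
    buf += 4
    println(s.length)
  }
}
```

This is the kind of source-compatible change the PR applies across the codebase so a single source tree cross-compiles on both Scala versions.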