Thanks all! (And thanks Matei for the developer link!) I was able to
build using Maven[1], but `./sbt/sbt assembly` results in build errors.
(I'm not familiar enough with the build to know why; in the past sbt
worked for me and Maven did not.)
I was able to run the master version of pyspark, which
Dear List,
The version of pyspark on master has a lot of nice new features, e.g.
SequenceFile reading and pickle I/O:
https://github.com/apache/spark/blob/master/python/pyspark/context.py#L353
I downloaded the recent 1.0.1 release and was surprised to see that the
distribution did not include these features.
You should expect master to compile and run: patches aren't merged unless
they build and pass tests on Jenkins.
You shouldn't expect new features to be added to stable code in maintenance
releases (e.g. 1.0.1).
AFAIK, we're still on track with Spark 1.1.0 development, which means that
it should
Yeah, we try to have a regular 3-month release cycle; see
https://cwiki.apache.org/confluence/display/SPARK/Wiki+Homepage for the current
window.
Matei
On Jul 16, 2014, at 4:21 PM, Mark Hamstra m...@clearstorydata.com wrote:
You should expect master to compile and run: patches aren't merged
You should try cleaning and then building. We have recently hit a bug in
the Scala compiler that sometimes causes non-clean builds to fail.
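For what it's worth, a clean rebuild from the top of the source tree might look like this (a sketch; the exact targets and flags depend on your checkout and the build docs for your Spark version):

```shell
# Remove previous build output, then rebuild the assembly jar from
# scratch, so no stale class files from the earlier failed build remain.
./sbt/sbt clean assembly

# Or, if building with Maven instead (tests skipped to save time):
mvn clean package -DskipTests
```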
On Wed, Jul 16, 2014 at 7:56 PM, Matei Zaharia matei.zaha...@gmail.com
wrote:
Yeah, we try to have a regular 3 month release cycle; see