Yeah, we try to keep a regular 3-month release cycle; see 
https://cwiki.apache.org/confluence/display/SPARK/Wiki+Homepage for the current 
window.

Matei

On Jul 16, 2014, at 4:21 PM, Mark Hamstra <m...@clearstorydata.com> wrote:

> You should expect master to compile and run: patches aren't merged unless 
> they build and pass tests on Jenkins.
> 
> You shouldn't expect new features to be added to stable code in maintenance 
> releases (e.g. 1.0.1).
> 
> AFAIK, we're still on track with Spark 1.1.0 development, which means that it 
> should be released sometime in the second half of next month (or shortly 
> thereafter).
> 
> 
> On Wed, Jul 16, 2014 at 4:03 PM, Paul Wais <pw...@yelp.com> wrote:
> Dear List,
> 
> The version of pyspark on master has a lot of nice new features, e.g. 
> SequenceFile reading and pickle i/o: 
> https://github.com/apache/spark/blob/master/python/pyspark/context.py#L353
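> 
> For reference, a minimal sketch of how those master-only APIs look from 
> Python (the paths, Writable classes and SparkContext args below are just 
> placeholders, not from an actual job):
> 
>     from pyspark import SparkContext
> 
>     sc = SparkContext("local", "master-pyspark-demo")
> 
>     # Pickle i/o: round-trip an RDD through pickled files on disk.
>     rdd = sc.parallelize([("a", 1), ("b", 2)])
>     rdd.saveAsPickleFile("/tmp/demo_pickle")
>     restored = sc.pickleFile("/tmp/demo_pickle")
>     print(restored.collect())
> 
>     # SequenceFile reading: keys/values are deserialized via their
>     # Hadoop Writable types.
>     pairs = sc.sequenceFile(
>         "/tmp/demo_seq",
>         keyClass="org.apache.hadoop.io.Text",
>         valueClass="org.apache.hadoop.io.IntWritable",
>     )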
> 
> I downloaded the recent 1.0.1 release and was surprised to see that the 
> distribution did not include these changes from master.  (FWIW, I've tried 
> pulling master [ 9c249743ea ] and compiling from source, but I get a build 
> failure in TestSQLContext.scala.)
> 
> Is an updated pyspark scheduled for the next release?  (Also, am I wrong to 
> expect that HEAD on master should compile and run?)
> 
> Best Regards,
> -Paul Wais
> 
