Re: [VOTE] Release Apache Spark 1.0.0 (rc5)

2014-05-14 Thread witgo
SPARK-1817 will cause users to get incorrect results, and RDD.zip is commonly used, so this should be the highest priority. I think we should fix the bug, and we should also test the previous release.
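
For context, a minimal sketch of the RDD.zip usage pattern the bug affects; the RDD contents and names below are illustrative, not taken from the SPARK-1817 report:

    import org.apache.spark.{SparkConf, SparkContext}

    object ZipExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("zip-example").setMaster("local[2]"))
        // RDD.zip requires both RDDs to have the same number of partitions
        // and the same number of elements in each partition.
        val nums  = sc.parallelize(1 to 6, 2)
        val chars = sc.parallelize(Seq("a", "b", "c", "d", "e", "f"), 2)
        // Pairs the i-th element of `nums` with the i-th element of `chars`.
        val zipped = nums.zip(chars)
        zipped.collect().foreach(println)   // expected: (1,a), (2,b), ... (6,f)
        sc.stop()
      }
    }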

Re: [VOTE] Release Apache Spark 1.0.0 (rc5)

2014-05-14 Thread witgo
You need to set: spark.akka.frameSize 5 and spark.default.parallelism 1. (In reply to Madhu, who wrote: "I just built rc5 on Windows 7 and tried to re")
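
A sketch of how those settings can be applied; the values are simply those quoted above, and whether they suit a particular job is workload-dependent:

    import org.apache.spark.{SparkConf, SparkContext}

    // The same keys can also be placed in conf/spark-defaults.conf.
    val conf = new SparkConf()
      .setAppName("rc5-smoke-test")
      .set("spark.akka.frameSize", "5")        // driver/executor Akka message size limit, in MB
      .set("spark.default.parallelism", "1")   // default partition count for shuffle operations
    val sc = new SparkContext(conf)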

Re: [VOTE] Release Apache Spark 1.0.0 (rc5)

2014-05-14 Thread Sean Owen
On Tue, May 13, 2014 at 2:49 PM, Sean Owen wrote:
> On Tue, May 13, 2014 at 9:36 AM, Patrick Wendell wrote:
>> The release files, including signatures, digests, etc. can be found at:
>> http://people.apache.org/~pwendell/spark-1.0.0-rc5/
>
> Good news is that the sigs, MD5 and SHA are all correct

Re: [VOTE] Release Apache Spark 1.0.0 (rc5)

2014-05-14 Thread Patrick Wendell
Hey @witgo - those bugs are not severe enough to block the release, but it would be nice to get them fixed. At this point we are focused on severe bugs with an immediate fix, or regressions from previous versions of Spark. Anything that misses this release will get merged into the branch-1.0 branc

Re: [VOTE] Release Apache Spark 1.0.0 (rc5)

2014-05-14 Thread Madhu
I built rc5 using sbt/sbt assembly on Linux without any problems. There used to be an sbt.cmd for the Windows build; has that been deprecated? If so, I can document the Windows build steps that worked for me.

Requirements of objects stored in RDDs

2014-05-14 Thread Soren Macbeth
Hi, What are the requirements of objects that are stored in RDDs? I'm still struggling with an exception I've already posted about several times. My questions are: 1) What interfaces are objects stored in RDDs expected to implement, if any? 2) Are collections (be they Scala, Java or otherwise) h
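
Broadly, RDD elements need to be serializable by the configured serializer (Java serialization by default, optionally Kryo) once they are shuffled, cached in serialized form, or sent between nodes. A minimal sketch, with illustrative class and field names:

    import org.apache.spark.{SparkConf, SparkContext}

    // With the default JavaSerializer, elements must be java.io.Serializable;
    // Scala case classes satisfy this automatically.
    case class Record(id: Long, name: String)

    object RddElementExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("rdd-elements").setMaster("local[2]")
        // Kryo is an alternative that does not require java.io.Serializable:
        // conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        val sc = new SparkContext(conf)
        val rdd = sc.parallelize(Seq(Record(1L, "a"), Record(2L, "b")))
        println(rdd.map(_.name).collect().mkString(","))
        sc.stop()
      }
    }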

Re: Problem creating objects through reflection

2014-05-14 Thread Rohit Rai
Hi Piotr, The easiest solution to this for now is to write all your code (including the case class) inside an object, with the execution part in a method of that object. Then you can call the method from the spark shell to execute your code. Cheers, Rohit, Founder & CEO, Tuplejump, Inc.
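
A sketch of the pattern Rohit describes, suitable for pasting into the spark-shell (or loading with :load); the object, class, and method names are illustrative:

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._
    import org.apache.spark.rdd.RDD

    object MyJob {
      // Keeping the case class inside the object sidesteps the serialization
      // problems that case classes defined directly at the shell prompt can hit.
      case class Event(user: String, count: Int)

      def run(sc: SparkContext): RDD[(String, Int)] = {
        val events = sc.parallelize(Seq(Event("alice", 1), Event("bob", 2), Event("alice", 3)))
        events.map(e => (e.user, e.count)).reduceByKey(_ + _)
      }
    }

    // In the shell, `sc` is already defined:
    // MyJob.run(sc).collect().foreach(println)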