Re: Scala 2.10 Merge

2013-12-14 Thread Patrick Wendell
Alright I just merged this in - so Spark is officially Scala 2.10
from here forward.

For reference I cut a new branch called scala-2.9 with the commit
immediately prior to the merge:
https://git-wip-us.apache.org/repos/asf/incubator-spark/repo?p=incubator-spark.git;a=shortlog;h=refs/heads/scala-2.9

- Patrick

On Thu, Dec 12, 2013 at 8:26 PM, Patrick Wendell pwend...@gmail.com wrote:
 Hey Raymond,

 Let's move this discussion out of this thread and into the associated JIRA.
 I'll write up our current approach over there.

 https://spark-project.atlassian.net/browse/SPARK-995

 - Patrick


 On Thu, Dec 12, 2013 at 5:56 PM, Liu, Raymond raymond@intel.com wrote:

 Hi Patrick

 So what's the plan for supporting YARN 2.2 in 0.9? As far as I can
 see, if you want to support both 2.2 and 2.0, you need two versions of
 Akka anyway, due to the protobuf version incompatibility.

 Akka 2.3-M1 looks like it has some small API changes; we could
 probably isolate the code the way we did for the YARN API. I remember
 it was mentioned that using reflection for the different APIs is
 preferred. So the point of using reflection is that one release binary
 jar can support both versions of Hadoop/YARN at runtime, instead of
 building different binary jars at compile time?

 Then all Hadoop-related code would also be built as separate modules
 and loaded on demand? That sounds like a lot of work to me. You would
 still need a shim layer and separate code for each API version,
 depending on different Akka versions, etc. That sounds like even
 stricter requirements than our current approach on master, with a
 dynamic class loader added on top, and the problems we are facing now
 would still be there?

 Best Regards,
 Raymond Liu
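The runtime-reflection approach discussed above (one release jar that selects a version-specific API implementation at runtime) can be sketched on the JVM roughly as follows. This is a minimal illustration only, not Spark's actual YARN shim; the class and method names are hypothetical stand-ins.

```java
// Sketch of runtime API selection via reflection: one binary can talk to
// either of two incompatible client APIs without both being on the
// compile-time classpath. ClientV20/ClientV22 are hypothetical stand-ins.
public class ShimLoader {

    // Two stand-in "version-specific" implementations; in practice these
    // would live in separate, optionally-present modules.
    public static class ClientV20 {
        public String submit() { return "submitted via 2.0 API"; }
    }

    public static class ClientV22 {
        public String submit() { return "submitted via 2.2 API"; }
    }

    // Resolve the implementation class by name at runtime and invoke it
    // reflectively, so neither version is referenced at compile time.
    public static String submitVia(String className) throws Exception {
        Class<?> cls = Class.forName(className);
        Object client = cls.getDeclaredConstructor().newInstance();
        return (String) cls.getMethod("submit").invoke(client);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(submitVia("ShimLoader$ClientV20"));
        System.out.println(submitVia("ShimLoader$ClientV22"));
    }
}
```

The same pattern extends to the case where the two client classes ship in separate jars: `Class.forName` simply fails for the version that is absent from the runtime classpath, which is why only the selected implementation needs to be present.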

 -Original Message-
 From: Patrick Wendell [mailto:pwend...@gmail.com]
 Sent: Thursday, December 12, 2013 5:13 PM
 To: dev@spark.incubator.apache.org
 Subject: Re: Scala 2.10 Merge

 Also - the code is still there because of a recent merge that took in some
 newer changes... we'll be removing it for the final merge.


 On Thu, Dec 12, 2013 at 1:12 AM, Patrick Wendell pwend...@gmail.com
 wrote:

  Hey Raymond,
 
  This won't work because AFAIK akka 2.3-M1 is not binary compatible
  with akka 2.2.3 (right?). For all of the non-yarn 2.2 versions we need
  to still use the older protobuf library, so we'd need to support both.
 
  I'd also be concerned about having a reference to a non-released
  version of akka. Akka is the source of our hardest-to-find bugs and
  simultaneously trying to support 2.2.3 and 2.3-M1 is a bit daunting.
  Of course, if you are building off of master you can maintain a fork
  that uses this.
 
  - Patrick
 
 
  On Thu, Dec 12, 2013 at 12:42 AM, Liu, Raymond
  raymond@intel.comwrote:
 
  Hi Patrick
 
  What does dropping YARN 2.2 mean? The code seems to still be
  there. You mean that if you build against 2.2 it will break and won't
  work, since the home-made Akka build for Scala 2.10 isn't there? In
  that case, can we just use Akka 2.3-M1, which runs on protobuf 2.5, as
  a replacement?
 
  Best Regards,
  Raymond Liu
 
 
  -Original Message-
  From: Patrick Wendell [mailto:pwend...@gmail.com]
  Sent: Thursday, December 12, 2013 4:21 PM
  To: dev@spark.incubator.apache.org
  Subject: Scala 2.10 Merge
 
  Hi Developers,
 
  In the next few days we are planning to merge Scala 2.10 support into
  Spark. For those that haven't been following this, Prashant Sharma
  has been maintaining the scala-2.10 branch of Spark for several
  months. This branch is current with master and has been reviewed for
  merging:
 
  https://github.com/apache/incubator-spark/tree/scala-2.10
 
  Scala 2.10 support is one of the most requested features for Spark -
  it will be great to get this into Spark 0.9! Please note that *Scala
  2.10 is not binary compatible with Scala 2.9*. With that in mind, I
  wanted to give a few heads-up/requests to developers:
 
  If you are developing applications on top of Spark's master branch,
  those will need to migrate to Scala 2.10. You may want to download
  and test the current scala-2.10 branch in order to make sure you will
  be okay as Spark developments move forward. Of course, you can always
  stick with the current master commit and be fine (I'll cut a tag when
  we do the merge in order to delineate where the version changes).
  Please open new threads on the dev list to report and discuss any
  issues.
 
  This merge will temporarily drop support for YARN 2.2 on the master
  branch.
  This is because the workaround we used was only compiled for Scala 2.9.
  We are going to come up with a more robust solution to YARN 2.2
  support before releasing 0.9.
 
  Going forward, we will continue to make maintenance releases on
  branch-0.8 which will remain compatible with Scala 2.9.
 
  For those interested, the primary code changes in this merge are
  upgrading the akka version, changing the use of 

Re: Scala 2.10 Merge

2013-12-14 Thread Nick Pentreath
Whoohoo!

Great job everyone especially Prashant!

—
Sent from Mailbox for iPhone

On Sat, Dec 14, 2013 at 10:59 AM, Patrick Wendell pwend...@gmail.com
wrote:


Re: Scala 2.10 Merge

2013-12-14 Thread Sam Bessalah
Yes. Awesome.
Great job guys.

Sam Bessalah

 On Dec 14, 2013, at 9:59 AM, Patrick Wendell pwend...@gmail.com wrote:
 

Re: Scala 2.10 Merge

2013-12-14 Thread andy.petre...@gmail.com
That's very good news!
Congrats

Sent from my HTC

- Reply message -
From: Sam Bessalah samkil...@gmail.com
To: dev@spark.incubator.apache.org dev@spark.incubator.apache.org
Subject: Scala 2.10 Merge
Date: Sat, Dec 14, 2013 11:03



Re: [VOTE] Release Apache Spark 0.8.1-incubating (rc4)

2013-12-14 Thread Henry Saputra
Hi Patrick, as sebb mentioned, let's move the binaries out of the
voting directory under your people.apache.org directory.
ASF release votes are on source code, not binaries; technically we
provide the binaries only as a convenience.

Also, please add a link to the KEYS location in dist[1] so people can
verify the signatures.

Sorry for the late response to the VOTE thread, guys.

- Henry

[1] https://dist.apache.org/repos/dist/release/incubator/spark/KEYS

On Fri, Dec 13, 2013 at 6:37 PM, Patrick Wendell pwend...@gmail.com wrote:
 The vote is now closed. This vote passes with 5 PPMC +1's and no 0 or -1
 votes.

 +1 (5 Total)
 Matei Zaharia*
 Nick Pentreath*
 Patrick Wendell*
 Prashant Sharma*
 Tom Graves*

 0 (0 Total)

 -1 (0 Total)

 * = Binding Vote

 As per the incubator release guide [1] I'll be sending this to the
 general incubator list for a final vote from IPMC members.

 [1]
 http://incubator.apache.org/guides/releasemanagement.html#best-practice-incubator-release-vote


 On Thu, Dec 12, 2013 at 8:59 AM, Evan Chan e...@ooyala.com wrote:

 I'd be personally fine with a standard workflow of assemble-deps +
 packaging just the Spark files as separate packages, if it speeds up
 everyone's development time.


 On Wed, Dec 11, 2013 at 1:10 PM, Mark Hamstra m...@clearstorydata.com
 wrote:

  I don't know how to make sense of the numbers, but here's what I've got
  from a very small sample size.
 
  For both v0.8.0-incubating and v0.8.1-incubating, building separate
  assemblies is faster than `./sbt/sbt assembly` and the times for building
  separate assemblies for 0.8.0 and 0.8.1 are about the same.
 
   For v0.8.0-incubating, `./sbt/sbt assembly` takes about 2.5x as long
   as the sum of the separate assemblies.
   For v0.8.1-incubating, `./sbt/sbt assembly` takes almost 8x as long
   as the sum of the separate assemblies.
 
  Weird.
 
 
  On Wed, Dec 11, 2013 at 11:49 AM, Patrick Wendell pwend...@gmail.com
  wrote:
 
   I'll +1 myself also.
  
   For anyone who has the slow build problem: does this issue happen when
   building v0.8.0-incubating also? Trying to figure out whether it's
   related to something we added in 0.8.1 or if it's a long standing
   issue.
  
   - Patrick
  
    On Wed, Dec 11, 2013 at 10:39 AM, Matei Zaharia matei.zaha...@gmail.com wrote:
 Woah, weird, but definitely good to know.

 If you’re doing Spark development, there’s also a more convenient
 option added by Shivaram in the master branch. You can do sbt
 assemble-deps to package *just* the dependencies of each project in a
 special assembly JAR, and then use sbt compile to update the code.
 This will use the classes directly out of the
 target/scala-2.9.3/classes directories. You have to redo
 assemble-deps only if your external dependencies change.
   
Matei
   
On Dec 11, 2013, at 1:04 AM, Prashant Sharma scrapco...@gmail.com
   wrote:
   
 I hope this PR https://github.com/apache/incubator-spark/pull/252 can help.
Again this is not a blocker for the release from my side either.
   
   
On Wed, Dec 11, 2013 at 2:14 PM, Mark Hamstra 
  m...@clearstorydata.com
   wrote:
   
 Interesting, and confirmed: On my machine where `./sbt/sbt assembly`
 takes a long, long, long time to complete (a MBP, in my case),
 building three separate assemblies (`./sbt/sbt assembly/assembly`,
 `./sbt/sbt examples/assembly`, `./sbt/sbt tools/assembly`) takes
 much, much less time.
   
   
   
On Wed, Dec 11, 2013 at 12:02 AM, Prashant Sharma 
   scrapco...@gmail.com
wrote:
   
 Forgot to mention: after running sbt/sbt assembly/assembly, running
 sbt/sbt examples/assembly takes just 37s. Not to mention my hardware
 is not really great.
   
   
On Wed, Dec 11, 2013 at 1:28 PM, Prashant Sharma 
   scrapco...@gmail.com
wrote:
   
Hi Patrick and Matei,
   
 Was trying out this, and followed the quick start guide, which says to
 do sbt/sbt assembly; like a few others, I was also stuck for a few
 minutes on Linux. On the other hand, if I use sbt/sbt
 assembly/assembly, it is much faster.

 Should we change the documentation to reflect this? It would not be
 great for first-time users to get stuck there.
   
   
On Wed, Dec 11, 2013 at 9:54 AM, Matei Zaharia 
matei.zaha...@gmail.com
wrote:
   
+1
   
Built and tested it on Mac OS X.
   
Matei
   
   
On Dec 10, 2013, at 4:49 PM, Patrick Wendell 
 pwend...@gmail.com
wrote:
   
Please vote on releasing the following candidate as Apache
 Spark
(incubating) version 0.8.1.
   
 The tag to be voted on is v0.8.1-incubating (commit b87d31d):
 https://git-wip-us.apache.org/repos/asf/incubator-spark/repo?p=incubator-spark.git;a=commit;h=b87d31dd8eb4b4e47c0138e9242d0dd6922c8c4e
   
The release files, including signatures, digests, etc can be
  found
at:
http://people.apache.org/~pwendell/spark-0.8.1-incubating-rc4/
   

Re: [VOTE] Release Apache Spark 0.8.1-incubating (rc4)

2013-12-14 Thread Henry Saputra
Actually, we should be fine keeping the binaries there, as long as the
VOTE itself is on the source.

Let's check with sebb on the general@ list about his concern.

- Henry

On Sat, Dec 14, 2013 at 10:31 AM, Henry Saputra henry.sapu...@gmail.com wrote: