Re: [VOTE] Release Apache Spark 1.1.0 (RC4)

2014-09-03 Thread Mubarak Seyed
+1 (non-binding)

Tested locally on Mac OS X with local-cluster mode.


On Wed, Sep 3, 2014 at 12:23 PM, Cheng Lian lian.cs@gmail.com wrote:

 +1.

 Tested locally on OSX 10.9, built with Hadoop 2.4.1

 - Checked Datanucleus jar files
 - Tested Spark SQL Thrift server and CLI under local mode and standalone
 cluster mode against a MySQL-backed metastore



 On Wed, Sep 3, 2014 at 11:25 AM, Josh Rosen rosenvi...@gmail.com wrote:

  +1.  Tested on Windows and EC2.  Confirmed that the EC2 PVM-to-HVM switch
  fixed the SPARK-3358 regression.
 
 
  On September 3, 2014 at 10:33:45 AM, Marcelo Vanzin (van...@cloudera.com
 )
  wrote:
 
  +1 (non-binding)
 
  - checked checksums of a few packages
  - ran a few jobs against yarn client/cluster using the hadoop2.3 package
  - played with spark-shell in yarn-client mode
 
  On Wed, Sep 3, 2014 at 12:24 AM, Patrick Wendell pwend...@gmail.com
  wrote:
    Please vote on releasing the following candidate as Apache Spark version 1.1.0!
  
   The tag to be voted on is v1.1.0-rc4 (commit 2f9b2bd):
  
 
 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=2f9b2bd7844ee8393dc9c319f4fefedf95f5e460
  
   The release files, including signatures, digests, etc. can be found at:
   http://people.apache.org/~pwendell/spark-1.1.0-rc4/
  
   Release artifacts are signed with the following key:
   https://people.apache.org/keys/committer/pwendell.asc
  
   The staging repository for this release can be found at:
  
 https://repository.apache.org/content/repositories/orgapachespark-1031/
  
   The documentation corresponding to this release can be found at:
   http://people.apache.org/~pwendell/spark-1.1.0-rc4-docs/
  
   Please vote on releasing this package as Apache Spark 1.1.0!
  
    The vote is open until Saturday, September 06, at 08:30 UTC and passes if
    a majority of at least 3 +1 PMC votes are cast.
  
   [ ] +1 Release this package as Apache Spark 1.1.0
   [ ] -1 Do not release this package because ...
  
   To learn more about Apache Spark, please see
   http://spark.apache.org/
  
   == Regressions fixed since RC3 ==
   SPARK-3332 - Issue with tagging in EC2 scripts
   SPARK-3358 - Issue with regression for m3.XX instances
  
   == What justifies a -1 vote for this release? ==
   This vote is happening very late into the QA period compared with
   previous votes, so -1 votes should only occur for significant
   regressions from 1.0.2. Bugs already present in 1.0.X will not block
   this release.
  
   == What default changes should I be aware of? ==
   1. The default value of spark.io.compression.codec is now snappy
   -- Old behavior can be restored by switching to lzf
  
   2. PySpark now performs external spilling during aggregations.
   -- Old behavior can be restored by setting spark.shuffle.spill to
  false.
  
   3. PySpark uses a new heuristic for determining the parallelism of
   shuffle operations.
   -- Old behavior can be restored by setting
   spark.default.parallelism to the number of cores in the cluster.
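   
    A minimal sketch, if you want all three of the old behaviors back at once
    (run from the Spark directory; the value 8 below is only a placeholder for
    the total number of cores in your cluster):
   
        # sketch: restore the pre-1.1.0 defaults listed above
        {
          echo "spark.io.compression.codec   lzf"
          echo "spark.shuffle.spill          false"
          echo "spark.default.parallelism    8"
        } >> conf/spark-defaults.conf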
  
  
 
 
 
  --
  Marcelo
 
 
 



Re: OutOfMemoryError when running sbt/sbt test

2014-08-26 Thread Mubarak Seyed
What is your ulimit value?
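
(For what it's worth, "unable to create new native thread" usually points at the
per-user process/thread limit rather than heap size, so a check along these lines
may help -- a rough sketch, assuming a typical Linux or OS X shell:)

    ulimit -u    # max user processes/threads (the usual culprit here)
    ulimit -n    # max open file descriptors
    ulimit -a    # everything at once

    # if the process limit is low, raise it for this shell before building, e.g.
    ulimit -u 4096 && sbt/sbt test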


On Tue, Aug 26, 2014 at 5:49 PM, jay vyas jayunit100.apa...@gmail.com
wrote:

 Hi spark.

 I've been trying to build Spark, but I've been getting lots of OutOfMemoryError
 exceptions.

 https://gist.github.com/jayunit100/d424b6b825ce8517d68c

 For the most part, they are of the form:

 java.lang.OutOfMemoryError: unable to create new native thread

 I've attempted to hard-code the get_mem_opts function in the sbt-launch-lib.bash
 file to use various very high memory settings (e.g. -Xms5g) with a high
 MaxPermSize, etc., but to no avail.

 Any thoughts on this would be appreciated.

 I know of others having the same problem as well.

 Thanks!

 --
 jay vyas



Re: [VOTE] Release Apache Spark 1.0.2 (RC1)

2014-07-28 Thread Mubarak Seyed
+1 (non-binding)

Tested this on Mac OS X.


On Mon, Jul 28, 2014 at 6:52 PM, Andrew Or and...@databricks.com wrote:

 +1. Tested on standalone and YARN clusters.


 2014-07-28 14:59 GMT-07:00 Tathagata Das tathagata.das1...@gmail.com:

  Let me add my vote as well.
  Did some basic tests by running simple projects with various Spark
  modules. Tested checksums.
 
  +1
 
  On Sun, Jul 27, 2014 at 4:52 PM, Matei Zaharia matei.zaha...@gmail.com
  wrote:
   +1
  
   Tested this on Mac OS X.
  
   Matei
  
   On Jul 25, 2014, at 4:08 PM, Tathagata Das 
 tathagata.das1...@gmail.com
  wrote:
  
   Please vote on releasing the following candidate as Apache Spark
  version 1.0.2.
  
   This release fixes a number of bugs in Spark 1.0.1.
   Some of the notable ones are
    - SPARK-2452: Known issue in Spark 1.0.1 caused by the attempted fix for
    SPARK-1199. The fix was reverted for 1.0.2.
    - SPARK-2576: NoClassDefFoundError when executing a Spark SQL query on an
    HDFS CSV file.
   The full list is at http://s.apache.org/9NJ
  
   The tag to be voted on is v1.0.2-rc1 (commit 8fb6f00e):
  
 
 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=8fb6f00e195fb258f3f70f04756e07c259a2351f
  
    The release files, including signatures, digests, etc. can be found at:
   http://people.apache.org/~tdas/spark-1.0.2-rc1/
  
   Release artifacts are signed with the following key:
   https://people.apache.org/keys/committer/tdas.asc
  
   The staging repository for this release can be found at:
  
 https://repository.apache.org/content/repositories/orgapachespark-1024/
  
   The documentation corresponding to this release can be found at:
   http://people.apache.org/~tdas/spark-1.0.2-rc1-docs/
  
   Please vote on releasing this package as Apache Spark 1.0.2!
  
   The vote is open until Tuesday, July 29, at 23:00 UTC and passes if
   a majority of at least 3 +1 PMC votes are cast.
   [ ] +1 Release this package as Apache Spark 1.0.2
   [ ] -1 Do not release this package because ...
  
   To learn more about Apache Spark, please see
   http://spark.apache.org/