svn commit: r8272 - /release/spark/spark-1.2.0/

2015-03-12 Thread pwendell
Author: pwendell Date: Fri Mar 13 03:59:33 2015 New Revision: 8272 Log: Removing Spark 1.2.0 release. Removed: release/spark/spark-1.2.0/

svn commit: r1666347 - in /spark/site/docs/1.3.0: ./ api/ api/java/ api/java/org/ api/java/org/apache/ api/java/org/apache/spark/ api/java/org/apache/spark/annotation/ api/java/org/apache/spark/api/ a

2015-03-12 Thread pwendell
Author: pwendell Date: Fri Mar 13 02:30:55 2015 New Revision: 1666347 URL: http://svn.apache.org/r1666347 Log: Spark 1.3.0 docs [This commit notification would consist of 395 parts, which exceeds the limit of 50, so it was shortened to the summary.]

svn commit: r8271 - /dev/spark/spark-1.3.0-rc3/ /release/spark/spark-1.3.0/

2015-03-12 Thread pwendell
Author: pwendell Date: Fri Mar 13 01:58:36 2015 New Revision: 8271 Log: Adding Spark 1.3.0 Added: release/spark/spark-1.3.0/ - copied from r8270, dev/spark/spark-1.3.0-rc3/ Removed: dev/spark/spark-1.3.0-rc3/

svn commit: r8270 - /dev/spark/spark-1.3.0-rc3/

2015-03-12 Thread pwendell
Author: pwendell Date: Fri Mar 13 01:55:45 2015 New Revision: 8270 Log: Add spark-1.3.0-rc3 Added: dev/spark/spark-1.3.0-rc3/ dev/spark/spark-1.3.0-rc3/spark-1.3.0-bin-cdh4.tgz (with props) dev/spark/spark-1.3.0-rc3/spark-1.3.0-bin-cdh4.tgz.asc (with props) dev/spark/spark-1.3

spark git commit: HOTFIX: Changes to release script.

2015-03-12 Thread pwendell
Repository: spark Updated Branches: refs/heads/master 17c309c87 -> 3980ebdf1 HOTFIX: Changes to release script. This fixes a bug in the release script and also properly sets things up so that Zinc launches multiple processes. I had done something similar in 0c9a8e but it didn't fully work. P

spark git commit: [mllib] [python] Add LassoModel to __all__ in regression.py

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.3 850e69451 -> 23069bd02 [mllib] [python] Add LassoModel to __all__ in regression.py. LassoModel does not show up in the Python docs. This should be merged into branch-1.3 and master. Author: Joseph

spark git commit: [mllib] [python] Add LassoModel to __all__ in regression.py

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/master a4b27162f -> 17c309c87 [mllib] [python] Add LassoModel to __all__ in regression.py. LassoModel does not show up in the Python docs. This should be merged into branch-1.3 and master. Author: Joseph K.
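
For readers skimming the archive: Python API docs and `from module import *` only pick up names listed in a module's `__all__`, which is why a model class omitted from that list disappears from the generated docs. A minimal sketch of the mechanism (hypothetical module contents, not the actual regression.py):

```python
# Hypothetical module contents illustrating why __all__ matters for the docs.
__all__ = ['LinearRegressionModel', 'LassoModel']  # only these names are exported/documented

class LinearRegressionModel(object):
    """Listed in __all__, so it appears in star-imports and generated API docs."""

class LassoModel(LinearRegressionModel):
    """Before the fix, leaving this name out of __all__ hid it from the Python docs."""
```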

spark git commit: [SPARK-4588] ML Attributes

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/master fb4787c95 -> a4b27162f [SPARK-4588] ML Attributes This continues the work in #4460 from srowen. The design doc is published on the JIRA page with some minor changes. Short description of ML attributes: https://github.com/apache/spark/pu
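
The notification is cut off before the description; as a purely conceptual sketch (hypothetical Python classes, not the org.apache.spark.ml.attribute API added by this commit), ML attributes attach per-feature metadata, such as numeric ranges or nominal values, to the columns of a feature vector:

```python
# Conceptual illustration only: per-feature metadata a pipeline stage could consult.
class NumericAttribute(object):
    def __init__(self, name, min_value=None, max_value=None):
        self.name, self.min_value, self.max_value = name, min_value, max_value

class NominalAttribute(object):
    def __init__(self, name, values):
        self.name, self.values = name, list(values)

feature_attrs = [
    NumericAttribute("age", min_value=0.0),
    NominalAttribute("country", values=["US", "UK", "DE"]),
]
```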

spark git commit: [SPARK-6268][MLlib] KMeans parameter getter methods

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/master 8f1bc7989 -> fb4787c95 [SPARK-6268][MLlib] KMeans parameter getter methods jira: https://issues.apache.org/jira/browse/SPARK-6268 KMeans has many setters for parameters. It should have matching getters. Author: Yuhao Yang Closes #4974 f
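
The getters in question live on the Scala side; as a hedged, language-neutral illustration of the pattern the commit describes (every parameter setter paired with a matching getter), using a hypothetical class name rather than MLlib's KMeans:

```python
# Illustrative only: each setter gets a matching getter, and setters chain builder-style.
class KMeansParams(object):
    def __init__(self):
        self._k = 2
        self._max_iterations = 20

    def set_k(self, k):
        self._k = k
        return self  # returning self preserves builder-style chaining

    def get_k(self):
        return self._k

    def set_max_iterations(self, n):
        self._max_iterations = n
        return self

    def get_max_iterations(self):
        return self._max_iterations

params = KMeansParams().set_k(10).set_max_iterations(100)
assert params.get_k() == 10 and params.get_max_iterations() == 100
```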

spark git commit: [SPARK-6294] [PySpark] fix take of PythonRDD in JVM (branch 1.2)

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.2 c684e5f9f -> 9ebd6f12e [SPARK-6294] [PySpark] fix take of PythonRDD in JVM (branch 1.2) The Thread.interrupt() cannot terminate the thread in some cases, so we should not wait for the writerThread of PythonRDD. This PR also ignores so

spark git commit: [build] [hotfix] Fix make-distribution.sh for Scala 2.11.

2015-03-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 304366c46 -> 8f1bc7989 [build] [hotfix] Fix make-distribution.sh for Scala 2.11. Author: Marcelo Vanzin Closes #5002 from vanzin/mkdist-hotfix and squashes the following commits: ced65f7 [Marcelo Vanzin] [build] [hotfix] Fix make-distrib

spark git commit: [SPARK-6275][Documentation]Miss toDF() function in docs/sql-programming-guide.md

2015-03-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 4e47d54be -> 304366c46 [SPARK-6275][Documentation]Miss toDF() function in docs/sql-programming-guide.md Miss `toDF()` function in docs/sql-programming-guide.md Author: zzcclp Closes #4977 from zzcclp/SPARK-6275 and squashes the following
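
For context, a small PySpark-1.3-era sketch of the conversion step the guide examples had omitted (written from memory of that API, not taken from the commit; the Scala guide additionally needs import sqlContext.implicits._):

```python
from pyspark import SparkContext
from pyspark.sql import SQLContext, Row

sc = SparkContext("local", "toDF sketch")
sqlContext = SQLContext(sc)  # in 1.3, creating a SQLContext is what enables rdd.toDF()

people = sc.parallelize([Row(name="Alice", age=30), Row(name="Bob", age=25)])
df = people.toDF()           # the explicit conversion the guide had been missing
df.printSchema()
```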

spark git commit: [docs] [SPARK-6306] Readme points to dead link

2015-03-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 0cba802ad -> 4e47d54be [docs] [SPARK-6306] Readme points to dead link The link to "Specifying the Hadoop Version" currently points to http://spark.apache.org/docs/latest/building-with-maven.html#specifying-the-hadoop-version. The correct

spark git commit: [SPARK-5186][branch-1.2] Vector.hashCode is not efficient

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.2 d7c359b49 -> c684e5f9f [SPARK-5186][branch-1.2] Vector.hashCode is not efficient Backport hhbyyh's hashCode implementation to branch-1.2. The old implementation causes performance issues with PySpark, which calls hashCode (https://iss
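
The snippet is truncated before the details, but the efficiency problem is hashing every element of a potentially huge, mostly-zero vector. A rough illustration of the general idea (not the actual MLlib implementation): cap the work at a small number of leading nonzero entries.

```python
# Illustrative sketch only: bound hashCode work so it does not grow with vector size.
def vector_hash(entries, max_nnz=16):
    """entries: iterable of (index, value) pairs for a vector's nonzero elements."""
    result = 31
    seen = 0
    for idx, val in entries:
        if val != 0.0:
            result = 31 * result + idx        # mix in the position
            result = 31 * result + hash(val)  # and the value
            seen += 1
            if seen >= max_nnz:               # stop after a fixed number of nonzeros
                break
    return result

print(vector_hash([(0, 1.0), (3, 2.5), (7, -4.0)]))
```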

spark git commit: [SPARK-5814][MLLIB][GRAPHX] Remove JBLAS from runtime

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/master 712679a7b -> 0cba802ad [SPARK-5814][MLLIB][GRAPHX] Remove JBLAS from runtime The issue is discussed in https://issues.apache.org/jira/browse/SPARK-5669. Replacing all JBLAS usage by netlib-java gives us a simpler dependency tree and less

spark git commit: [SPARK-6294] fix hang when call take() in JVM on PythonRDD

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.3 d9e141cb7 -> 850e69451 [SPARK-6294] fix hang when call take() in JVM on PythonRDD The Thread.interrupt() cannot terminate the thread in some cases, so we should not wait for the writerThread of PythonRDD. This PR also ignores some exc

spark git commit: SPARK-6245 [SQL] jsonRDD() of empty RDD results in exception

2015-03-12 Thread srowen
Repository: spark Updated Branches: refs/heads/master 2d87a415f -> 55c4831d6 SPARK-6245 [SQL] jsonRDD() of empty RDD results in exception Avoid `UnsupportedOperationException` from JsonRDD.inferSchema on empty RDD. Not sure if this is supposed to be an error (but a better one), but it seems
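
For orientation, a minimal PySpark sketch of the call in question, written from memory of the 1.3-era API; before this fix, handing jsonRDD an empty RDD surfaced an UnsupportedOperationException from schema inference rather than a clearer error:

```python
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext("local", "jsonRDD sketch")
sqlContext = SQLContext(sc)

# Schema inference is fine once at least one JSON record is present.
records = sc.parallelize(['{"name": "Alice", "age": 30}'])
df = sqlContext.jsonRDD(records)
df.printSchema()
```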

spark git commit: [SPARK-6294] fix hang when call take() in JVM on PythonRDD

2015-03-12 Thread meng
Repository: spark Updated Branches: refs/heads/master 25b71d8c1 -> 712679a7b [SPARK-6294] fix hang when call take() in JVM on PythonRDD The Thread.interrupt() cannot terminate the thread in some cases, so we should not wait for the writerThread of PythonRDD. This PR also ignores some excepti
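
The underlying issue is general: a blocked writer thread may never honor Thread.interrupt(), so joining on it without a bound can hang the caller. A small, purely illustrative Python analogue of "do not wait indefinitely for a thread that may never finish" (not the actual PythonRDD code):

```python
import threading
import time

def writer():
    # Stand-in for a writer thread that may block indefinitely and ignore interruption.
    time.sleep(3600)

t = threading.Thread(target=writer)
t.daemon = True   # allow the process to exit without waiting for the writer
t.start()

# A bare t.join() could block forever, mirroring the reported hang;
# bounding the wait lets the caller make progress.
t.join(timeout=1.0)
print("writer still alive:", t.is_alive())
```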

spark git commit: [SPARK-6296] [SQL] Added equals to Column

2015-03-12 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.3 bdc4682af -> d9e141cb7 [SPARK-6296] [SQL] Added equals to Column Author: Volodymyr Lyubinets Closes #4988 from vlyubin/columncomp and squashes the following commits: 92d7c8f [Volodymyr Lyubinets] Added equals to Column (cherry picke

spark git commit: [SPARK-6296] [SQL] Added equals to Column

2015-03-12 Thread rxin
Repository: spark Updated Branches: refs/heads/master e921a665c -> 25b71d8c1 [SPARK-6296] [SQL] Added equals to Column Author: Volodymyr Lyubinets Closes #4988 from vlyubin/columncomp and squashes the following commits: 92d7c8f [Volodymyr Lyubinets] Added equals to Column Project: http://
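
For readers skimming the log: the point of giving Column an equals (and, typically, a matching hashCode) is value-based comparison of the wrapper objects themselves, e.g. in tests and collections. A plain-Python illustration of that idea with a hypothetical class (not pyspark.sql.Column, whose == operator is overloaded to build expressions):

```python
# Hypothetical wrapper type showing value-based equality; not the Spark API.
class ExprWrapper(object):
    def __init__(self, expr):
        self.expr = expr

    def __eq__(self, other):
        return isinstance(other, ExprWrapper) and self.expr == other.expr

    def __hash__(self):
        return hash(self.expr)

assert ExprWrapper("age + 1") == ExprWrapper("age + 1")  # equal by expression, not identity
assert len({ExprWrapper("a"), ExprWrapper("a")}) == 1    # well-behaved in sets and dicts
```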