spark git commit: [SPARK-6262][MLLIB]Implement missing methods for MultivariateStatisticalSummary

2015-04-05 Thread meng
Repository: spark
Updated Branches:
  refs/heads/master f15806a8f -> acffc4345


[SPARK-6262][MLLIB]Implement missing methods for MultivariateStatisticalSummary

Add below methods in pyspark for MultivariateStatisticalSummary
- normL1
- normL2
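
For context, normL1 is the column-wise sum of absolute values and normL2 the column-wise Euclidean norm. A minimal plain-Python sketch of the quantities these summary methods return (illustration only, not the Spark implementation):

```python
import math

def col_norms(rows):
    """Column-wise L1 and L2 norms over equal-length rows of numbers."""
    n_cols = len(rows[0])
    norm_l1 = [sum(abs(row[j]) for row in rows) for j in range(n_cols)]
    norm_l2 = [math.sqrt(sum(row[j] ** 2 for row in rows)) for j in range(n_cols)]
    return norm_l1, norm_l2

l1, l2 = col_norms([[1.0, -2.0], [3.0, 4.0]])
# l1 == [4.0, 6.0]
```

In PySpark these are computed distributedly by the JVM side; the wrapper methods below only forward the call and convert the result to an array.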

Author: lewuathe lewua...@me.com

Closes #5359 from Lewuathe/SPARK-6262 and squashes the following commits:

cbe439e [lewuathe] Implement missing methods for MultivariateStatisticalSummary


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/acffc434
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/acffc434
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/acffc434

Branch: refs/heads/master
Commit: acffc43455d7b3e4000be4ff0175b8ea19cd280b
Parents: f15806a
Author: lewuathe lewua...@me.com
Authored: Sun Apr 5 16:13:31 2015 -0700
Committer: Xiangrui Meng m...@databricks.com
Committed: Sun Apr 5 16:13:31 2015 -0700

--
 python/pyspark/mllib/stat/_statistics.py | 6 ++
 python/pyspark/mllib/tests.py| 6 ++
 2 files changed, 12 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/acffc434/python/pyspark/mllib/stat/_statistics.py
--
diff --git a/python/pyspark/mllib/stat/_statistics.py 
b/python/pyspark/mllib/stat/_statistics.py
index 218ac14..1d83e9d 100644
--- a/python/pyspark/mllib/stat/_statistics.py
+++ b/python/pyspark/mllib/stat/_statistics.py
@@ -49,6 +49,12 @@ class MultivariateStatisticalSummary(JavaModelWrapper):
     def min(self):
         return self.call("min").toArray()
 
+    def normL1(self):
+        return self.call("normL1").toArray()
+
+    def normL2(self):
+        return self.call("normL2").toArray()
+
 
 class Statistics(object):
 

http://git-wip-us.apache.org/repos/asf/spark/blob/acffc434/python/pyspark/mllib/tests.py
--
diff --git a/python/pyspark/mllib/tests.py b/python/pyspark/mllib/tests.py
index dd3b66c..47dad7d 100644
--- a/python/pyspark/mllib/tests.py
+++ b/python/pyspark/mllib/tests.py
@@ -357,6 +357,12 @@ class StatTests(PySparkTestCase):
         summary = Statistics.colStats(data)
         self.assertEqual(10, summary.count())
 
+    def test_col_norms(self):
+        data = RandomRDDs.normalVectorRDD(self.sc, 1000, 10, 10)
+        summary = Statistics.colStats(data)
+        self.assertEqual(10, len(summary.normL1()))
+        self.assertEqual(10, len(summary.normL2()))
+
 
 class VectorUDTTests(PySparkTestCase):
 


-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org



spark git commit: [SPARK-6602][Core] Update MapOutputTrackerMasterActor to MapOutputTrackerMasterEndpoint

2015-04-05 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/master acffc4345 -> 0b5d028a9


[SPARK-6602][Core] Update MapOutputTrackerMasterActor to 
MapOutputTrackerMasterEndpoint

This is the second PR for [SPARK-6602]. It updated MapOutputTrackerMasterActor 
and its unit tests.

cc rxin

Author: zsxwing zsxw...@gmail.com

Closes #5371 from zsxwing/rpc-rewrite-part2 and squashes the following commits:

fcf3816 [zsxwing] Fix the code style
4013a22 [zsxwing] Add doc for uncaught exceptions in RpcEnv
93c6c20 [zsxwing] Add an example of UnserializableException and add 
ErrorMonitor to monitor errors from Akka
134fe7b [zsxwing] Update MapOutputTrackerMasterActor to 
MapOutputTrackerMasterEndpoint


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0b5d028a
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0b5d028a
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0b5d028a

Branch: refs/heads/master
Commit: 0b5d028a93b7d5adb148fbf3a576257bb3a6d8cb
Parents: acffc43
Author: zsxwing zsxw...@gmail.com
Authored: Sun Apr 5 21:57:15 2015 -0700
Committer: Reynold Xin r...@databricks.com
Committed: Sun Apr 5 21:57:15 2015 -0700

--
 .../org/apache/spark/MapOutputTracker.scala |  61 +++---
 .../main/scala/org/apache/spark/SparkEnv.scala  |  18 +-
 .../scala/org/apache/spark/rpc/RpcEnv.scala |   4 +-
 .../org/apache/spark/rpc/akka/AkkaRpcEnv.scala  |  19 +-
 .../apache/spark/MapOutputTrackerSuite.scala| 100 +-
 .../org/apache/spark/rpc/RpcEnvSuite.scala  |  33 +++-
 .../org/apache/spark/util/AkkaUtilsSuite.scala  | 198 ---
 7 files changed, 221 insertions(+), 212 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/0b5d028a/core/src/main/scala/org/apache/spark/MapOutputTracker.scala
--
diff --git a/core/src/main/scala/org/apache/spark/MapOutputTracker.scala 
b/core/src/main/scala/org/apache/spark/MapOutputTracker.scala
index 5718951..d65c94e 100644
--- a/core/src/main/scala/org/apache/spark/MapOutputTracker.scala
+++ b/core/src/main/scala/org/apache/spark/MapOutputTracker.scala
@@ -21,13 +21,11 @@ import java.io._
 import java.util.concurrent.ConcurrentHashMap
 import java.util.zip.{GZIPInputStream, GZIPOutputStream}
 
-import scala.collection.mutable.{HashSet, HashMap, Map}
-import scala.concurrent.Await
+import scala.collection.mutable.{HashSet, Map}
 import scala.collection.JavaConversions._
+import scala.reflect.ClassTag
 
-import akka.actor._
-import akka.pattern.ask
-
+import org.apache.spark.rpc.{RpcEndpointRef, RpcEnv, RpcCallContext, RpcEndpoint}
 import org.apache.spark.scheduler.MapStatus
 import org.apache.spark.shuffle.MetadataFetchFailedException
 import org.apache.spark.storage.BlockManagerId
@@ -38,14 +36,15 @@ private[spark] case class GetMapOutputStatuses(shuffleId: Int)
   extends MapOutputTrackerMessage
 private[spark] case object StopMapOutputTracker extends MapOutputTrackerMessage
 
-/** Actor class for MapOutputTrackerMaster */
-private[spark] class MapOutputTrackerMasterActor(tracker: MapOutputTrackerMaster, conf: SparkConf)
-  extends Actor with ActorLogReceive with Logging {
+/** RpcEndpoint class for MapOutputTrackerMaster */
+private[spark] class MapOutputTrackerMasterEndpoint(
+    override val rpcEnv: RpcEnv, tracker: MapOutputTrackerMaster, conf: SparkConf)
+  extends RpcEndpoint with Logging {
   val maxAkkaFrameSize = AkkaUtils.maxFrameSizeBytes(conf)
 
-  override def receiveWithLogging: PartialFunction[Any, Unit] = {
+  override def receiveAndReply(context: RpcCallContext): PartialFunction[Any, Unit] = {
     case GetMapOutputStatuses(shuffleId: Int) =>
-      val hostPort = sender.path.address.hostPort
+      val hostPort = context.sender.address.hostPort
       logInfo("Asked to send map output locations for shuffle " + shuffleId +
         " to " + hostPort)
       val mapOutputStatuses = tracker.getSerializedMapOutputStatuses(shuffleId)
       val serializedSize = mapOutputStatuses.size
@@ -53,19 +52,19 @@ private[spark] class MapOutputTrackerMasterActor(tracker: MapOutputTrackerMaster
         val msg = s"Map output statuses were $serializedSize bytes which " +
           s"exceeds spark.akka.frameSize ($maxAkkaFrameSize bytes)."
 
-        /* For SPARK-1244 we'll opt for just logging an error and then throwing an exception.
-         * Note that on exception the actor will just restart. A bigger refactoring (SPARK-1239)
-         * will ultimately remove this entire code path. */
+        /* For SPARK-1244 we'll opt for just logging an error and then sending it to the sender.
+         * A bigger refactoring (SPARK-1239) will ultimately remove this entire code path. */
         val exception = new SparkException(msg)
         logError(msg, exception)
-throw 
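
The guard in the hunk above serializes the statuses first and refuses to reply when the payload exceeds spark.akka.frameSize, reporting a descriptive error instead of sending an oversized message. The shape of that check, sketched in hypothetical Python (the patch itself is Scala):

```python
def check_frame_size(payload: bytes, max_frame_bytes: int) -> bytes:
    """Return the payload if it fits within one frame; otherwise raise
    with a message naming both sizes, mirroring the guard above."""
    serialized_size = len(payload)
    if serialized_size > max_frame_bytes:
        raise RuntimeError(
            f"Map output statuses were {serialized_size} bytes which "
            f"exceeds spark.akka.frameSize ({max_frame_bytes} bytes)."
        )
    return payload
```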

spark git commit: [SPARK-6209] Clean up connections in ExecutorClassLoader after failing to load classes (branch-1.2)

2015-04-05 Thread joshrosen
Repository: spark
Updated Branches:
  refs/heads/branch-1.2 7b7db5942 -> f7fe87f4b


[SPARK-6209] Clean up connections in ExecutorClassLoader after failing to load 
classes (branch-1.2)

ExecutorClassLoader does not ensure proper cleanup of network connections that 
it opens. If it fails to load a class, it may leak partially-consumed 
InputStreams that are connected to the REPL's HTTP class server, causing that 
server to exhaust its thread pool, which can cause the entire job to hang.  See 
[SPARK-6209](https://issues.apache.org/jira/browse/SPARK-6209) for more 
details, including a bug reproduction.

This patch fixes this issue by ensuring proper cleanup of these resources.  It 
also adds logging for unexpected error cases.

(See #4944 for the corresponding PR for 1.3/1.4).
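The leak pattern is a stream opened to the class server but never closed when loading fails. A minimal sketch of the fix's shape, closing the stream on every path, in hypothetical Python (the patch itself is Scala):

```python
def load_class_bytes(open_stream, parse):
    """Read and parse class bytes, guaranteeing the underlying stream is
    closed even when parsing fails (the cleanup SPARK-6209 adds)."""
    stream = open_stream()
    try:
        return parse(stream.read())
    finally:
        stream.close()  # releases the connection on success and failure alike
```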

Author: Josh Rosen joshro...@databricks.com

Closes #5174 from JoshRosen/executorclassloaderleak-branch-1.2 and squashes the 
following commits:

16e38fe [Josh Rosen] [SPARK-6209] Clean up connections in ExecutorClassLoader 
after failing to load classes (master branch PR)


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f7fe87f4
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f7fe87f4
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f7fe87f4

Branch: refs/heads/branch-1.2
Commit: f7fe87f4b71f469f3c67a00883697ff09cc8a96a
Parents: 7b7db59
Author: Josh Rosen joshro...@databricks.com
Authored: Sun Apr 5 13:59:48 2015 -0700
Committer: Josh Rosen joshro...@databricks.com
Committed: Sun Apr 5 13:59:48 2015 -0700

--
 repl/pom.xml|  5 ++
 .../apache/spark/repl/ExecutorClassLoader.scala | 92 
 .../spark/repl/ExecutorClassLoaderSuite.scala   | 70 ++-
 3 files changed, 148 insertions(+), 19 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/f7fe87f4/repl/pom.xml
--
diff --git a/repl/pom.xml b/repl/pom.xml
index 7eb92d5..b91ae8b 100644
--- a/repl/pom.xml
+++ b/repl/pom.xml
@@ -96,6 +96,11 @@
       <artifactId>scalacheck_${scala.binary.version}</artifactId>
       <scope>test</scope>
     </dependency>
+    <dependency>
+      <groupId>org.mockito</groupId>
+      <artifactId>mockito-all</artifactId>
+      <scope>test</scope>
+    </dependency>
   </dependencies>
   <build>
 
     <outputDirectory>target/scala-${scala.binary.version}/classes</outputDirectory>

http://git-wip-us.apache.org/repos/asf/spark/blob/f7fe87f4/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala
--
diff --git 
a/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala 
b/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala
index 5ee3250..439fd14 100644
--- a/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala
+++ b/repl/src/main/scala/org/apache/spark/repl/ExecutorClassLoader.scala
@@ -17,13 +17,14 @@
 
 package org.apache.spark.repl
 
-import java.io.{ByteArrayOutputStream, InputStream}
-import java.net.{URI, URL, URLEncoder}
-import java.util.concurrent.{Executors, ExecutorService}
+import java.io.{IOException, ByteArrayOutputStream, InputStream}
+import java.net.{HttpURLConnection, URI, URL, URLEncoder}
+
+import scala.util.control.NonFatal
 
 import org.apache.hadoop.fs.{FileSystem, Path}
 
-import org.apache.spark.{SparkConf, SparkEnv}
+import org.apache.spark.{Logging, SparkConf, SparkEnv}
 import org.apache.spark.deploy.SparkHadoopUtil
 import org.apache.spark.util.Utils
 import org.apache.spark.util.ParentClassLoader
@@ -37,12 +38,15 @@ import 
com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.Opcodes._
  * Allows the user to specify if user class path should be first
  */
 class ExecutorClassLoader(conf: SparkConf, classUri: String, parent: 
ClassLoader,
-userClassPathFirst: Boolean) extends ClassLoader {
+userClassPathFirst: Boolean) extends ClassLoader with Logging {
   val uri = new URI(classUri)
   val directory = uri.getPath
 
   val parentLoader = new ParentClassLoader(parent)
 
+  // Allows HTTP connect and read timeouts to be controlled for testing / 
debugging purposes
+  private[repl] var httpUrlConnectionTimeoutMillis: Int = -1
+
   // Hadoop FileSystem object for our URI, if it isn't using HTTP
   var fileSystem: FileSystem = {
    if (uri.getScheme() == "http") {
@@ -71,27 +75,81 @@ class ExecutorClassLoader(conf: SparkConf, classUri: 
String, parent: ClassLoader
 }
   }
 
+  private def getClassFileInputStreamFromHttpServer(pathInDirectory: String): InputStream = {
+    val url = if (SparkEnv.get.securityManager.isAuthenticationEnabled()) {
+      val uri = new URI(classUri + "/" + urlEncode(pathInDirectory))
+      val newuri = Utils.constructURIForAuthentication(uri, 

[1/2] spark git commit: [HOTFIX] Updating CHANGES.txt for Spark 1.2.2

2015-04-05 Thread pwendell
Repository: spark
Updated Branches:
  refs/heads/branch-1.2 f4a9c41b1 -> 86d1715cd


[HOTFIX] Updating CHANGES.txt for Spark 1.2.2


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/eac95250
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/eac95250
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/eac95250

Branch: refs/heads/branch-1.2
Commit: eac952503a1c4b188c1839aca29fada92f605147
Parents: f4a9c41
Author: Patrick Wendell patr...@databricks.com
Authored: Sun Apr 5 08:15:33 2015 -0400
Committer: Patrick Wendell patr...@databricks.com
Committed: Sun Apr 5 08:15:33 2015 -0400

--
 CHANGES.txt | 395 +++
 1 file changed, 395 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/eac95250/CHANGES.txt
--
diff --git a/CHANGES.txt b/CHANGES.txt
new file mode 100644
index 000..ebb268f
--- /dev/null
+++ b/CHANGES.txt
@@ -0,0 +1,395 @@
+Spark Change Log
+
+
+Release 1.2.2
+
  [CORE] The description of jobHistory config should be spark.history.fs.logDirectory
+  KaiXinXiaoLei huleil...@huawei.com
+  2015-04-02 20:24:31 -0700
+  Commit: f4a9c41, github.com/apache/spark/pull/5332
+
+  [SPARK-5195][sql]Update HiveMetastoreCatalog.scala(override the 
MetastoreRelation's sameresult method only compare databasename and table name)
+  seayi 405078...@qq.com, Michael Armbrust mich...@databricks.com
+  2015-02-02 16:06:52 -0800
+  Commit: 2991dd0, github.com/apache/spark/pull/3898
+
+  [SPARK-6578] [core] Fix thread-safety issue in outbound path of network 
library.
+  Reynold Xin r...@databricks.com, Marcelo Vanzin van...@cloudera.com
+  2015-04-02 14:51:00 -0700
+  Commit: d82e732, github.com/apache/spark/pull/5336
+
+  SPARK-6414: Spark driver failed with NPE on job cancelation
+  Hung Lin hung@gmail.com
+  2015-04-02 14:01:43 -0700
+  Commit: 8fa09a4, github.com/apache/spark/pull/5124
+
+  [SPARK-6667] [PySpark] remove setReuseAddress
+  Davies Liu dav...@databricks.com
+  2015-04-02 12:18:33 -0700
+  Commit: a73055f, github.com/apache/spark/pull/5324
+
+  SPARK-6480 [CORE] histogram() bucket function is wrong in some simple edge 
cases
+  Sean Owen so...@cloudera.com
+  2015-03-26 15:00:23 +
+  Commit: 758ebf7, github.com/apache/spark/pull/5148
+
+  [SPARK-3266] Use intermediate abstract classes to fix type erasure issues in 
Java APIs
+  Josh Rosen joshro...@databricks.com
+  2015-03-17 09:18:57 -0700
+  Commit: 61c059a, github.com/apache/spark/pull/5050
+
+  [SPARK-5559] [Streaming] [Test] Remove oppotunity we met flakiness when 
running FlumeStreamSuite
+  Kousuke Saruta saru...@oss.nttdata.co.jp
+  2015-03-24 16:13:25 +
+  Commit: 8ef6995, github.com/apache/spark/pull/4337
+
+  [SPARK-5775] BugFix: GenericRow cannot be cast to SpecificMutableRow when 
nested data and partitioned table
+  Anselme Vignon anselme.vig...@flaminem.com, Cheng Lian 
l...@databricks.com
+  2015-03-23 12:00:50 -0700
+  Commit: e080cc3, github.com/apache/spark/pull/4697
+
+  [SPARK-6132][HOTFIX] ContextCleaner InterruptedException should be quiet
+  Andrew Or and...@databricks.com
+  2015-03-03 20:49:45 -0800
+  Commit: abdcec6, github.com/apache/spark/pull/4882
+
+  [SPARK-6132] ContextCleaner race condition across SparkContexts
+  Andrew Or and...@databricks.com
+  2015-03-03 13:44:05 -0800
+  Commit: 06d883c, github.com/apache/spark/pull/4869
+
+  [SPARK-6313] Add config option to disable file locks/fetchFile cache to ...
+  nemccarthy nat...@nemccarthy.me
+  2015-03-17 09:33:11 -0700
+  Commit: a2a94a1, github.com/apache/spark/pull/5036
+
+  [SPARK-6294] [PySpark] fix take of PythonRDD in JVM (branch 1.2)
+  Davies Liu dav...@databricks.com
+  2015-03-12 15:19:17 -0700
+  Commit: 9ebd6f1, github.com/apache/spark/pull/5003
+
+  [SPARK-5186][branch-1.2] Vector.hashCode is not efficient
+  Yuhao Yang hhb...@gmail.com
+  2015-03-12 01:40:40 -0700
+  Commit: c684e5f, github.com/apache/spark/pull/4985
+
+  [SPARK-6194] [SPARK-677] [PySpark] fix memory leak in collect()
+  Davies Liu dav...@databricks.com
+  2015-03-09 16:24:06 -0700
+  Commit: d7c359b, github.com/apache/spark/pull/4923
+
+  [EXAMPLES] fix typo.
+  Makoto Fukuhara fuku...@gmail.com
+  2015-02-23 09:24:33 +
+  Commit: e753f9c, github.com/apache/spark/pull/4724
+
+  SPARK-1911 [DOCS] Backport. Warn users if their assembly jars are not built 
with Java 6
+  Sean Owen so...@cloudera.com
+  2015-03-04 11:42:50 +
+  Commit: 77a8c06, github.com/apache/spark/pull/4888
+
+  [SPARK-6133] Make sc.stop() idempotent
+  Andrew Or and...@databricks.com
+  2015-03-03 15:09:57 -0800
+  Commit: a91c1c5, github.com/apache/spark/pull/4871
+
+  Revert [SPARK-5423][Core] Cleanup resources in 

Git Push Summary

2015-04-05 Thread pwendell
Repository: spark
Updated Tags:  refs/tags/v1.2.2-rc1 [created] 7531b50e4




[2/2] spark git commit: Preparing development version 1.2.3-SNAPSHOT

2015-04-05 Thread pwendell
Preparing development version 1.2.3-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7b7db594
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7b7db594
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7b7db594

Branch: refs/heads/branch-1.2
Commit: 7b7db5942481e3878901c22010dd1cb77ccdda0e
Parents: 7531b50
Author: Patrick Wendell patr...@databricks.com
Authored: Sun Apr 5 12:17:48 2015 +
Committer: Patrick Wendell patr...@databricks.com
Committed: Sun Apr 5 12:17:48 2015 +

--
 assembly/pom.xml  | 2 +-
 bagel/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 examples/pom.xml  | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka/pom.xml| 2 +-
 external/mqtt/pom.xml | 2 +-
 external/twitter/pom.xml  | 2 +-
 external/zeromq/pom.xml   | 2 +-
 extras/java8-tests/pom.xml| 2 +-
 extras/kinesis-asl/pom.xml| 2 +-
 extras/spark-ganglia-lgpl/pom.xml | 2 +-
 graphx/pom.xml| 2 +-
 mllib/pom.xml | 2 +-
 network/common/pom.xml| 2 +-
 network/shuffle/pom.xml   | 2 +-
 network/yarn/pom.xml  | 2 +-
 pom.xml   | 2 +-
 repl/pom.xml  | 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 yarn/alpha/pom.xml| 2 +-
 yarn/pom.xml  | 2 +-
 yarn/stable/pom.xml   | 2 +-
 29 files changed, 29 insertions(+), 29 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 50accc4..a6faa1f 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/bagel/pom.xml
--
diff --git a/bagel/pom.xml b/bagel/pom.xml
index 7ae8346..4701716 100644
--- a/bagel/pom.xml
+++ b/bagel/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index 5e1dc75..1fd8d86 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/examples/pom.xml
--
diff --git a/examples/pom.xml b/examples/pom.xml
index 8b1a3f2..c5de0ea 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/external/flume-sink/pom.xml
--
diff --git a/external/flume-sink/pom.xml b/external/flume-sink/pom.xml
index 9e9ab2f..682e84d 100644
--- a/external/flume-sink/pom.xml
+++ b/external/flume-sink/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7b7db594/external/flume/pom.xml
--
diff --git a/external/flume/pom.xml b/external/flume/pom.xml
index 6959bcd..96fee9d 100644
--- a/external/flume/pom.xml
+++ b/external/flume/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2</version>
+    <version>1.2.3-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   

[1/2] spark git commit: Preparing Spark release v1.2.2-rc1

2015-04-05 Thread pwendell
Repository: spark
Updated Branches:
  refs/heads/branch-1.2 86d1715cd -> 7b7db5942


Preparing Spark release v1.2.2-rc1


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7531b50e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7531b50e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7531b50e

Branch: refs/heads/branch-1.2
Commit: 7531b50e406ee2e3301b009ceea7c684272b2e27
Parents: 86d1715
Author: Patrick Wendell patr...@databricks.com
Authored: Sun Apr 5 12:17:47 2015 +
Committer: Patrick Wendell patr...@databricks.com
Committed: Sun Apr 5 12:17:47 2015 +

--
 assembly/pom.xml  | 2 +-
 bagel/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 examples/pom.xml  | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka/pom.xml| 2 +-
 external/mqtt/pom.xml | 2 +-
 external/twitter/pom.xml  | 2 +-
 external/zeromq/pom.xml   | 2 +-
 extras/java8-tests/pom.xml| 2 +-
 extras/kinesis-asl/pom.xml| 2 +-
 extras/spark-ganglia-lgpl/pom.xml | 2 +-
 graphx/pom.xml| 2 +-
 mllib/pom.xml | 2 +-
 network/common/pom.xml| 2 +-
 network/shuffle/pom.xml   | 2 +-
 network/yarn/pom.xml  | 2 +-
 pom.xml   | 2 +-
 repl/pom.xml  | 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 yarn/alpha/pom.xml| 2 +-
 yarn/pom.xml  | 2 +-
 yarn/stable/pom.xml   | 2 +-
 29 files changed, 29 insertions(+), 29 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 6889a6c..50accc4 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+    <version>1.2.2</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/bagel/pom.xml
--
diff --git a/bagel/pom.xml b/bagel/pom.xml
index f785cf6..7ae8346 100644
--- a/bagel/pom.xml
+++ b/bagel/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+    <version>1.2.2</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index 9e202f3..5e1dc75 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+    <version>1.2.2</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/examples/pom.xml
--
diff --git a/examples/pom.xml b/examples/pom.xml
index df6975f..8b1a3f2 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+    <version>1.2.2</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/external/flume-sink/pom.xml
--
diff --git a/external/flume-sink/pom.xml b/external/flume-sink/pom.xml
index 0002bf2..9e9ab2f 100644
--- a/external/flume-sink/pom.xml
+++ b/external/flume-sink/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+    <version>1.2.2</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7531b50e/external/flume/pom.xml
--
diff --git a/external/flume/pom.xml b/external/flume/pom.xml
index e783d39..6959bcd 100644
--- a/external/flume/pom.xml
+++ b/external/flume/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent</artifactId>
-    <version>1.2.2-SNAPSHOT</version>
+

[2/2] spark git commit: [HOTFIX] Bumping versions for Spark 1.2.2

2015-04-05 Thread pwendell
[HOTFIX] Bumping versions for Spark 1.2.2


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/86d1715c
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/86d1715c
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/86d1715c

Branch: refs/heads/branch-1.2
Commit: 86d1715cd44186a43b8676d48e995d82b215581e
Parents: eac9525
Author: Patrick Wendell patr...@databricks.com
Authored: Sun Apr 5 08:17:49 2015 -0400
Committer: Patrick Wendell patr...@databricks.com
Committed: Sun Apr 5 08:17:49 2015 -0400

--
 core/src/main/scala/org/apache/spark/package.scala | 2 +-
 docs/_config.yml   | 4 ++--
 ec2/spark_ec2.py   | 1 +
 3 files changed, 4 insertions(+), 3 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/86d1715c/core/src/main/scala/org/apache/spark/package.scala
--
diff --git a/core/src/main/scala/org/apache/spark/package.scala 
b/core/src/main/scala/org/apache/spark/package.scala
index 9bb5f5e..b33214f 100644
--- a/core/src/main/scala/org/apache/spark/package.scala
+++ b/core/src/main/scala/org/apache/spark/package.scala
@@ -44,5 +44,5 @@ package org.apache
 
 package object spark {
   // For package docs only
-  val SPARK_VERSION = "1.2.1"
+  val SPARK_VERSION = "1.2.2"
 }

http://git-wip-us.apache.org/repos/asf/spark/blob/86d1715c/docs/_config.yml
--
diff --git a/docs/_config.yml b/docs/_config.yml
index 80d0efe..8ace356 100644
--- a/docs/_config.yml
+++ b/docs/_config.yml
@@ -14,8 +14,8 @@ include:
 
 # These allow the documentation to be updated with newer releases
 # of Spark, Scala, and Mesos.
-SPARK_VERSION: 1.2.1
-SPARK_VERSION_SHORT: 1.2.1
+SPARK_VERSION: 1.2.2
+SPARK_VERSION_SHORT: 1.2.2
 SCALA_BINARY_VERSION: 2.10
 SCALA_VERSION: 2.10.4
 MESOS_VERSION: 0.18.1

http://git-wip-us.apache.org/repos/asf/spark/blob/86d1715c/ec2/spark_ec2.py
--
diff --git a/ec2/spark_ec2.py b/ec2/spark_ec2.py
index 43311c2..4fea3d0 100755
--- a/ec2/spark_ec2.py
+++ b/ec2/spark_ec2.py
@@ -218,6 +218,7 @@ def get_spark_shark_version(opts):
         "1.1.1": "1.1.1",
         "1.2.0": "1.2.0",
         "1.2.1": "1.2.1",
+        "1.2.2": "1.2.2",
     }
     version = opts.spark_version.replace("v", "")
     if version not in spark_shark_map:

