spark git commit: [SQL] Use path.makeQualified in newParquet.

2015-04-04 Thread lian
Repository: spark
Updated Branches:
  refs/heads/master 9b40c17ab -> da25c86d6


[SQL] Use path.makeQualified in newParquet.

Author: Yin Huai yh...@databricks.com

Closes #5353 from yhuai/wrongFS and squashes the following commits:

849603b [Yin Huai] Not use deprecated method.
6d6ae34 [Yin Huai] Use path.makeQualified.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/da25c86d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/da25c86d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/da25c86d

Branch: refs/heads/master
Commit: da25c86d64ff9ce80f88186ba083f6c21dd9a568
Parents: 9b40c17
Author: Yin Huai yh...@databricks.com
Authored: Sat Apr 4 23:26:10 2015 +0800
Committer: Cheng Lian l...@databricks.com
Committed: Sat Apr 4 23:26:10 2015 +0800

--
 .../src/main/scala/org/apache/spark/sql/parquet/newParquet.scala  | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/da25c86d/sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala
--
diff --git 
a/sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala 
b/sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala
index 583bac4..0dce362 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/parquet/newParquet.scala
@@ -268,7 +268,8 @@ private[sql] case class ParquetRelation2(
     // containing Parquet files (e.g. partitioned Parquet table).
     val baseStatuses = paths.distinct.map { p =>
       val fs = FileSystem.get(URI.create(p), sparkContext.hadoopConfiguration)
-      val qualified = fs.makeQualified(new Path(p))
+      val path = new Path(p)
+      val qualified = path.makeQualified(fs.getUri, fs.getWorkingDirectory)
 
       if (!fs.exists(qualified) && maybeSchema.isDefined) {
         fs.mkdirs(qualified)
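
For context, a minimal standalone sketch of what the new call does, assuming
only hadoop-common on the classpath and a hypothetical path string:
Path.makeQualified(URI, Path) fills in a missing scheme/authority from the
given file system's URI and resolves relative paths against its working
directory, so the path cannot later be resolved against the wrong file system.

    import java.net.URI

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    object MakeQualifiedDemo {
      def main(args: Array[String]): Unit = {
        val p = "/tmp/parquet-table" // hypothetical user-supplied path, no scheme
        // Resolve the owning file system the same way the patched code does.
        val fs = FileSystem.get(URI.create(p), new Configuration())
        // Attach the scheme/authority and resolve against the working directory,
        // e.g. "file:/tmp/parquet-table" on a local file system.
        val qualified = new Path(p).makeQualified(fs.getUri, fs.getWorkingDirectory)
        println(qualified)
      }
    }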





spark git commit: [SPARK-6607][SQL] Check invalid characters for Parquet schema and show error messages

2015-04-04 Thread lian
Repository: spark
Updated Branches:
  refs/heads/master da25c86d6 -> 7bca62f79


[SPARK-6607][SQL] Check invalid characters for Parquet schema and show error 
messages

'(' and ')' are special characters used in Parquet schemas for type annotation.
When we run an aggregation query, we obtain attribute names such as MAX(a).

If we directly store the generated DataFrame as a Parquet file, reading and
parsing the stored schema string back later fails.

Several methods could solve this. This PR uses the simplest one: handling
attribute names before generating the Parquet schema based on these attributes.

Another possible approach would be to modify all aggregation expression names
from func(column) to func[column].
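
To make the failure mode concrete, here is a hedged sketch against the Spark
1.3-era DataFrame API (assuming a SQLContext named sqlContext with its
implicits imported, and hypothetical temp paths):

    import sqlContext.implicits._

    val df = Seq((1, "a"), (2, "b")).toDF("int", "str")
    val agg = df.groupBy("str").max("int")  // aggregate column is named "MAX(int)"

    // Writing "MAX(int)" into a Parquet schema fails because of '(' and ')';
    // after this change the save raises a RuntimeException suggesting an alias:
    // agg.saveAsParquetFile("/tmp/bad")    // hypothetical path

    // Aliasing the generated name away makes the save succeed:
    val renamed = agg.toDF("str", "max_int")
    renamed.saveAsParquetFile("/tmp/ok")    // hypothetical path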

Author: Liang-Chi Hsieh vii...@gmail.com

Closes #5263 from viirya/parquet_aggregation_name and squashes the following 
commits:

2d70542 [Liang-Chi Hsieh] Address comment.
463dff4 [Liang-Chi Hsieh] Instead of replacing special chars, showing error 
message to user to suggest using Alias.
1de001d [Liang-Chi Hsieh] Replace special characters '(' and ')' of Parquet 
schema.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7bca62f7
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7bca62f7
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7bca62f7

Branch: refs/heads/master
Commit: 7bca62f79056e592cf07b49d8b8d04c59dea25fc
Parents: da25c86
Author: Liang-Chi Hsieh vii...@gmail.com
Authored: Sun Apr 5 00:20:43 2015 +0800
Committer: Cheng Lian l...@databricks.com
Committed: Sun Apr 5 00:20:43 2015 +0800

--
 .../org/apache/spark/sql/parquet/ParquetTypes.scala | 14 ++
 .../org/apache/spark/sql/hive/parquetSuites.scala   | 16 
 2 files changed, 30 insertions(+)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/7bca62f7/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala
--
diff --git 
a/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala 
b/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala
index da668f0..60e1bec 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTypes.scala
@@ -390,6 +390,7 @@ private[parquet] object ParquetTypesConverter extends Logging {
 
   def convertFromAttributes(attributes: Seq[Attribute],
                             toThriftSchemaNames: Boolean = false): MessageType = {
+    checkSpecialCharacters(attributes)
     val fields = attributes.map(
       attribute =>
         fromDataType(attribute.dataType, attribute.name, attribute.nullable,
@@ -404,7 +405,20 @@ private[parquet] object ParquetTypesConverter extends Logging {
     }
   }
 
+  private def checkSpecialCharacters(schema: Seq[Attribute]) = {
+    // ,;{}()\n\t= and space character are special characters in Parquet schema
+    schema.map(_.name).foreach { name =>
+      if (name.matches(".*[ ,;{}()\n\t=].*")) {
+        sys.error(
+          s"""Attribute name "$name" contains invalid character(s) among " ,;{}()\\n\\t=".
+              |Please use alias to rename it.
+           """.stripMargin.split("\n").mkString(" "))
+      }
+    }
+  }
+
   def convertToString(schema: Seq[Attribute]): String = {
+    checkSpecialCharacters(schema)
     StructType.fromAttributes(schema).json
   }
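
As a quick sanity check of the guard above, the character class rejects
exactly the generated aggregate names described in the commit message; a tiny
standalone sketch:

    object SpecialCharCheck extends App {
      // Same character class used by checkSpecialCharacters above.
      private val invalid = ".*[ ,;{}()\n\t=].*"

      assert("MAX(a)".matches(invalid))  // generated aggregate name: rejected
      assert("my col".matches(invalid))  // contains a space: rejected
      assert(!"max_a".matches(invalid))  // plain identifier: accepted
    }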
 

http://git-wip-us.apache.org/repos/asf/spark/blob/7bca62f7/sql/hive/src/test/scala/org/apache/spark/sql/hive/parquetSuites.scala
--
diff --git 
a/sql/hive/src/test/scala/org/apache/spark/sql/hive/parquetSuites.scala 
b/sql/hive/src/test/scala/org/apache/spark/sql/hive/parquetSuites.scala
index 1319c81..5f71e1b 100644
--- a/sql/hive/src/test/scala/org/apache/spark/sql/hive/parquetSuites.scala
+++ b/sql/hive/src/test/scala/org/apache/spark/sql/hive/parquetSuites.scala
@@ -688,6 +688,22 @@ class ParquetDataSourceOnSourceSuite extends ParquetSourceSuiteBase {
 
     sql("DROP TABLE alwaysNullable")
   }
+
+  test("Aggregation attribute names can't contain special chars \" ,;{}()\\n\\t=\"") {
+    val tempDir = Utils.createTempDir()
+    val filePath = new File(tempDir, "testParquet").getCanonicalPath
+    val filePath2 = new File(tempDir, "testParquet2").getCanonicalPath
+
+    val df = Seq(1,2,3).map(i => (i, i.toString)).toDF("int", "str")
+    val df2 = df.as('x).join(df.as('y), $"x.str" === $"y.str").groupBy("y.str").max("y.int")
+    intercept[RuntimeException](df2.saveAsParquetFile(filePath))
+
+    val df3 = df2.toDF("str", "max_int")
+    df3.saveAsParquetFile(filePath2)
+    val df4 = parquetFile(filePath2)
+    checkAnswer(df4, Row("1", 1) :: Row("2", 2) :: Row("3", 3) :: Nil)
+    assert(df4.columns === Array("str", "max_int"))
+  }

[2/2] spark git commit: [SPARK-6602][Core] Replace direct use of Akka with Spark RPC interface - part 1

2015-04-04 Thread rxin
[SPARK-6602][Core] Replace direct use of Akka with Spark RPC interface - part 1

This PR replaced the following `Actor`s with `RpcEndpoint`s:

1. HeartbeatReceiver
1. ExecutorActor
1. BlockManagerMasterActor
1. BlockManagerSlaveActor
1. CoarseGrainedExecutorBackend and subclasses
1. CoarseGrainedSchedulerBackend.DriverActor

This is the first PR. I will split the work of SPARK-6602 into several PRs for
code review.

Author: zsxwing zsxw...@gmail.com

Closes #5268 from zsxwing/rpc-rewrite and squashes the following commits:

287e9f8 [zsxwing] Fix the code style
26c56b7 [zsxwing] Merge branch 'master' into rpc-rewrite
9cc825a [zsxwing] Rmove setupThreadSafeEndpoint and add ThreadSafeRpcEndpoint
30a9036 [zsxwing] Make self return null after stopping RpcEndpointRef; fix docs 
and error messages
705245d [zsxwing] Fix some bugs after rebasing the changes on the master
003cf80 [zsxwing] Update CoarseGrainedExecutorBackend and 
CoarseGrainedSchedulerBackend to use RpcEndpoint
7d0e6dc [zsxwing] Update BlockManagerSlaveActor to use RpcEndpoint
f5d6543 [zsxwing] Update BlockManagerMaster to use RpcEndpoint
30e3f9f [zsxwing] Update ExecutorActor to use RpcEndpoint
478b443 [zsxwing] Update HeartbeatReceiver to use RpcEndpoint
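
For orientation, a minimal sketch of the new endpoint shape (the RPC classes
are Spark-internal API introduced by this work; the endpoint itself is
hypothetical). Fire-and-forget messages go through receive; ask-style
messages that used to end in `sender ! reply` go through receiveAndReply:

    import org.apache.spark.rpc.{RpcCallContext, RpcEnv, ThreadSafeRpcEndpoint}

    case class Ping(id: Int)

    // ThreadSafeRpcEndpoint guarantees messages are processed one at a time,
    // much like an Akka actor's mailbox did.
    class PingEndpoint(override val rpcEnv: RpcEnv) extends ThreadSafeRpcEndpoint {
      override def receive: PartialFunction[Any, Unit] = {
        case Ping(id) => // side effect only; previously `actor ! Ping(id)`
      }

      override def receiveAndReply(context: RpcCallContext): PartialFunction[Any, Unit] = {
        case Ping(id) => context.reply(id) // previously `sender ! id`
      }
    }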


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f15806a8
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f15806a8
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f15806a8

Branch: refs/heads/master
Commit: f15806a8f8ca34288ddb2d74b9ff1972c8374b59
Parents: 7bca62f
Author: zsxwing zsxw...@gmail.com
Authored: Sat Apr 4 11:52:05 2015 -0700
Committer: Reynold Xin r...@databricks.com
Committed: Sat Apr 4 11:52:05 2015 -0700

--
 .../org/apache/spark/HeartbeatReceiver.scala|  66 ++-
 .../scala/org/apache/spark/SparkContext.scala   |  23 +-
 .../main/scala/org/apache/spark/SparkEnv.scala  |  13 +-
 .../executor/CoarseGrainedExecutorBackend.scala |  79 +--
 .../org/apache/spark/executor/Executor.scala|  18 +-
 .../apache/spark/executor/ExecutorActor.scala   |  41 --
 .../spark/executor/ExecutorEndpoint.scala   |  43 ++
 .../scala/org/apache/spark/rpc/RpcEnv.scala |  39 +-
 .../org/apache/spark/rpc/akka/AkkaRpcEnv.scala  |  10 +-
 .../apache/spark/scheduler/DAGScheduler.scala   |  11 +-
 .../cluster/CoarseGrainedClusterMessage.scala   |   6 +-
 .../cluster/CoarseGrainedSchedulerBackend.scala | 148 +++---
 .../spark/scheduler/cluster/ExecutorData.scala  |   8 +-
 .../cluster/SimrSchedulerBackend.scala  |  13 +-
 .../cluster/SparkDeploySchedulerBackend.scala   |  14 +-
 .../cluster/YarnSchedulerBackend.scala  |  93 ++--
 .../mesos/CoarseMesosSchedulerBackend.scala |   4 +-
 .../spark/scheduler/local/LocalBackend.scala|  48 +-
 .../org/apache/spark/storage/BlockManager.scala |  22 +-
 .../spark/storage/BlockManagerMaster.scala  |  72 ++-
 .../spark/storage/BlockManagerMasterActor.scala | 512 ---
 .../storage/BlockManagerMasterEndpoint.scala| 509 ++
 .../spark/storage/BlockManagerMessages.scala|   7 +-
 .../spark/storage/BlockManagerSlaveActor.scala  |  88 
 .../storage/BlockManagerSlaveEndpoint.scala |  94 
 .../scala/org/apache/spark/util/Utils.scala |  10 +
 .../apache/spark/HeartbeatReceiverSuite.scala   |  81 +++
 .../org/apache/spark/rpc/RpcEnvSuite.scala  |  14 +-
 .../storage/BlockManagerReplicationSuite.scala  |  28 +-
 .../spark/storage/BlockManagerSuite.scala   |  37 +-
 .../streaming/ReceivedBlockHandlerSuite.scala   |  25 +-
 .../spark/deploy/yarn/ApplicationMaster.scala   |  86 ++--
 .../spark/deploy/yarn/YarnAllocator.scala   |   2 +-
 33 files changed, 1169 insertions(+), 1095 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/f15806a8/core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala
--
diff --git a/core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala 
b/core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala
index 9f8ad03..5871b8c 100644
--- a/core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala
+++ b/core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala
@@ -17,15 +17,15 @@
 
 package org.apache.spark
 
-import scala.concurrent.duration._
-import scala.collection.mutable
+import java.util.concurrent.{ScheduledFuture, TimeUnit, Executors}
 
-import akka.actor.{Actor, Cancellable}
+import scala.collection.mutable
 
 import org.apache.spark.executor.TaskMetrics
+import org.apache.spark.rpc.{ThreadSafeRpcEndpoint, RpcEnv, RpcCallContext}
 import org.apache.spark.storage.BlockManagerId
 import org.apache.spark.scheduler.{SlaveLost, TaskScheduler}
-import org.apache.spark.util.ActorLogReceive
+import org.apache.spark.util.Utils
 
 /**
  * A heartbeat 

[1/2] spark git commit: [SPARK-6602][Core] Replace direct use of Akka with Spark RPC interface - part 1

2015-04-04 Thread rxin
Repository: spark
Updated Branches:
  refs/heads/master 7bca62f79 -> f15806a8f


http://git-wip-us.apache.org/repos/asf/spark/blob/f15806a8/core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala
--
diff --git 
a/core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala 
b/core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala
deleted file mode 100644
index 5b53280..000
--- a/core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala
+++ /dev/null
@@ -1,512 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements.  See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License.  You may obtain a copy of the License at
- *
- *http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-package org.apache.spark.storage
-
-import java.util.{HashMap => JHashMap}
-
-import scala.collection.mutable
-import scala.collection.JavaConversions._
-import scala.concurrent.Future
-import scala.concurrent.duration._
-
-import akka.actor.{Actor, ActorRef}
-import akka.pattern.ask
-
-import org.apache.spark.{Logging, SparkConf, SparkException}
-import org.apache.spark.annotation.DeveloperApi
-import org.apache.spark.scheduler._
-import org.apache.spark.storage.BlockManagerMessages._
-import org.apache.spark.util.{ActorLogReceive, AkkaUtils, Utils}
-
-/**
- * BlockManagerMasterActor is an actor on the master node to track statuses of
- * all slaves' block managers.
- */
-private[spark]
-class BlockManagerMasterActor(val isLocal: Boolean, conf: SparkConf, listenerBus: LiveListenerBus)
-  extends Actor with ActorLogReceive with Logging {
-
-  // Mapping from block manager id to the block manager's information.
-  private val blockManagerInfo = new mutable.HashMap[BlockManagerId, BlockManagerInfo]
-
-  // Mapping from executor ID to block manager ID.
-  private val blockManagerIdByExecutor = new mutable.HashMap[String, BlockManagerId]
-
-  // Mapping from block id to the set of block managers that have the block.
-  private val blockLocations = new JHashMap[BlockId, mutable.HashSet[BlockManagerId]]
-
-  private val akkaTimeout = AkkaUtils.askTimeout(conf)
-
-  override def receiveWithLogging: PartialFunction[Any, Unit] = {
-    case RegisterBlockManager(blockManagerId, maxMemSize, slaveActor) =>
-      register(blockManagerId, maxMemSize, slaveActor)
-      sender ! true
-
-    case UpdateBlockInfo(
-      blockManagerId, blockId, storageLevel, deserializedSize, size, tachyonSize) =>
-      sender ! updateBlockInfo(
-        blockManagerId, blockId, storageLevel, deserializedSize, size, tachyonSize)
-
-    case GetLocations(blockId) =>
-      sender ! getLocations(blockId)
-
-    case GetLocationsMultipleBlockIds(blockIds) =>
-      sender ! getLocationsMultipleBlockIds(blockIds)
-
-    case GetPeers(blockManagerId) =>
-      sender ! getPeers(blockManagerId)
-
-    case GetActorSystemHostPortForExecutor(executorId) =>
-      sender ! getActorSystemHostPortForExecutor(executorId)
-
-    case GetMemoryStatus =>
-      sender ! memoryStatus
-
-    case GetStorageStatus =>
-      sender ! storageStatus
-
-    case GetBlockStatus(blockId, askSlaves) =>
-      sender ! blockStatus(blockId, askSlaves)
-
-    case GetMatchingBlockIds(filter, askSlaves) =>
-      sender ! getMatchingBlockIds(filter, askSlaves)
-
-    case RemoveRdd(rddId) =>
-      sender ! removeRdd(rddId)
-
-    case RemoveShuffle(shuffleId) =>
-      sender ! removeShuffle(shuffleId)
-
-    case RemoveBroadcast(broadcastId, removeFromDriver) =>
-      sender ! removeBroadcast(broadcastId, removeFromDriver)
-
-    case RemoveBlock(blockId) =>
-      removeBlockFromWorkers(blockId)
-      sender ! true
-
-    case RemoveExecutor(execId) =>
-      removeExecutor(execId)
-      sender ! true
-
-    case StopBlockManagerMaster =>
-      sender ! true
-      context.stop(self)
-
-    case BlockManagerHeartbeat(blockManagerId) =>
-      sender ! heartbeatReceived(blockManagerId)
-
-    case other =>
-      logWarning("Got unknown message: " + other)
-  }
-
-  private def removeRdd(rddId: Int): Future[Seq[Int]] = {
-    // First remove the metadata for the given RDD, and then asynchronously remove the blocks
-    // from the slaves.
-
-    // Find all blocks for the given RDD, remove the block from both blockLocations and
-    // the blockManagerInfo that is 
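
For contrast with the deleted receiveWithLogging above, a hedged sketch of how
the same reply pattern reads in the replacement endpoint style (the message
types below are stand-ins for the real ones in BlockManagerMessages; this is
not the actual BlockManagerMasterEndpoint body):

    import org.apache.spark.rpc.{RpcCallContext, RpcEnv, ThreadSafeRpcEndpoint}

    case object GetMemoryStatus
    case class RemoveExecutor(execId: String)

    class MasterEndpointSketch(override val rpcEnv: RpcEnv) extends ThreadSafeRpcEndpoint {
      private var executors = Set.empty[String]

      // Replies go through the call context instead of Akka's implicit `sender`.
      override def receiveAndReply(context: RpcCallContext): PartialFunction[Any, Unit] = {
        case GetMemoryStatus =>
          context.reply(Map.empty[String, (Long, Long)]) // was: sender ! memoryStatus
        case RemoveExecutor(execId) =>
          executors -= execId
          context.reply(true)                            // was: sender ! true
      }
    }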

spark git commit: Version info and CHANGES.txt for 1.3.1

2015-04-04 Thread pwendell
Repository: spark
Updated Branches:
  refs/heads/branch-1.3 eb57d4f88 -> 5db4ff2f3


Version info and CHANGES.txt for 1.3.1


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5db4ff2f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5db4ff2f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5db4ff2f

Branch: refs/heads/branch-1.3
Commit: 5db4ff2f3d4ed87ec7fc31ec609165cbb3831504
Parents: eb57d4f
Author: Patrick Wendell patr...@databricks.com
Authored: Sat Apr 4 16:19:14 2015 -0400
Committer: Patrick Wendell patr...@databricks.com
Committed: Sat Apr 4 16:19:14 2015 -0400

--
 CHANGES.txt | 728 +++
 .../main/scala/org/apache/spark/package.scala   |   2 +-
 dev/create-release/generate-changelist.py   |   4 +-
 docs/_config.yml|   4 +-
 ec2/spark_ec2.py|   3 +-
 5 files changed, 735 insertions(+), 6 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/5db4ff2f/CHANGES.txt
--
diff --git a/CHANGES.txt b/CHANGES.txt
index d3713de..7da0244 100644
--- a/CHANGES.txt
+++ b/CHANGES.txt
@@ -1,6 +1,734 @@
 Spark Change Log
 
 
+Release 1.3.1
+
+  [SQL] Use path.makeQualified in newParquet.
+  Yin Huai yh...@databricks.com
+  2015-04-04 23:26:10 +0800
+  Commit: eb57d4f, github.com/apache/spark/pull/5353
+
+  [SPARK-6700] disable flaky test
+  Davies Liu dav...@databricks.com
+  2015-04-03 15:22:21 -0700
+  Commit: 3366af6, github.com/apache/spark/pull/5356
+
+  [SPARK-6688] [core] Always use resolved URIs in EventLoggingListener.
+  Marcelo Vanzin van...@cloudera.com
+  2015-04-03 11:54:31 -0700
+  Commit: f17a2fe, github.com/apache/spark/pull/5340
+
+  [SPARK-6575][SQL] Converted Parquet Metastore tables no longer cache metadata
+  Yin Huai yh...@databricks.com
+  2015-04-03 14:40:36 +0800
+  Commit: 0c1b78b, github.com/apache/spark/pull/5339
+
+  [SPARK-6621][Core] Fix the bug that calling EventLoop.stop in 
EventLoop.onReceive/onError/onStart doesn't call onStop
+  zsxwing zsxw...@gmail.com
+  2015-04-02 22:54:30 -0700
+  Commit: ac705aa, github.com/apache/spark/pull/5280
+
+  [SPARK-6345][STREAMING][MLLIB] Fix for training with prediction
+  freeman the.freeman@gmail.com
+  2015-04-02 21:37:44 -0700
+  Commit: d21f779, github.com/apache/spark/pull/5037
+
+  [CORE] The description of jobHistory config should be spark.history.fs.logDirectory
+  KaiXinXiaoLei huleil...@huawei.com
+  2015-04-02 20:24:31 -0700
+  Commit: 17ab6b0, github.com/apache/spark/pull/5332
+
+  [SPARK-6575][SQL] Converted Parquet Metastore tables no longer cache metadata
+  Yin Huai yh...@databricks.com
+  2015-04-02 20:23:08 -0700
+  Commit: 0c1c0fb, github.com/apache/spark/pull/5339
+
+  [SPARK-6650] [core] Stop ExecutorAllocationManager when context stops.
+  Marcelo Vanzin van...@cloudera.com
+  2015-04-02 19:48:55 -0700
+  Commit: 0ef46b2, github.com/apache/spark/pull/5311
+
+  [SPARK-6686][SQL] Use resolved output instead of names for toDF rename
+  Michael Armbrust mich...@databricks.com
+  2015-04-02 18:30:55 -0700
+  Commit: 2927af1, github.com/apache/spark/pull/5337
+
+  [SPARK-6672][SQL] convert row to catalyst in createDataFrame(RDD[Row], ...)
+  Xiangrui Meng m...@databricks.com
+  2015-04-02 17:57:01 +0800
+  Commit: c2694bb, github.com/apache/spark/pull/5329
+
+  [SPARK-6618][SPARK-6669][SQL] Lock Hive metastore client correctly.
+  Yin Huai yh...@databricks.com, Michael Armbrust mich...@databricks.com
+  2015-04-02 16:46:50 -0700
+  Commit: e6ee95c, github.com/apache/spark/pull/5333
+
+  [Minor] [SQL] Follow-up of PR #5210
+  Cheng Lian l...@databricks.com
+  2015-04-02 16:15:34 -0700
+  Commit: 4f1fe3f, github.com/apache/spark/pull/5219
+
+  [SPARK-6655][SQL] We need to read the schema of a data source table stored 
in spark.sql.sources.schema property
+  Yin Huai yh...@databricks.com
+  2015-04-02 16:02:31 -0700
+  Commit: aecec07, github.com/apache/spark/pull/5313
+
+  [SQL] Throw UnsupportedOperationException instead of NotImplementedError
+  Michael Armbrust mich...@databricks.com
+  2015-04-02 16:01:03 -0700
+  Commit: 78ba245, github.com/apache/spark/pull/5315
+
+  SPARK-6414: Spark driver failed with NPE on job cancelation
+  Hung Lin hung@gmail.com
+  2015-04-02 14:01:43 -0700
+  Commit: 58e2b3f, github.com/apache/spark/pull/5124
+
+  [SPARK-6079] Use index to speed up StatusTracker.getJobIdsForGroup()
+  Josh Rosen joshro...@databricks.com
+  2015-03-25 17:40:00 -0700
+  Commit: a6664dc, github.com/apache/spark/pull/4830
+
+  [SPARK-6667] [PySpark] remove setReuseAddress
+  Davies Liu dav...@databricks.com
+  2015-04-02 12:18:33 -0700
+  Commit: ee2bd70, 

[1/2] spark git commit: Preparing Spark release v1.3.1-rc1

2015-04-04 Thread pwendell
Repository: spark
Updated Branches:
  refs/heads/branch-1.3 5db4ff2f3 -> 728c1f927


Preparing Spark release v1.3.1-rc1


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0dcb5d9f
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0dcb5d9f
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0dcb5d9f

Branch: refs/heads/branch-1.3
Commit: 0dcb5d9f31b713ed90bcec63ebc4e530cbb69851
Parents: 5db4ff2
Author: Patrick Wendell patr...@databricks.com
Authored: Sat Apr 4 20:25:25 2015 +
Committer: Patrick Wendell patr...@databricks.com
Committed: Sat Apr 4 20:25:25 2015 +

--
 assembly/pom.xml  | 2 +-
 bagel/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 examples/pom.xml  | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-assembly/pom.xml   | 2 +-
 external/kafka/pom.xml| 2 +-
 external/mqtt/pom.xml | 2 +-
 external/twitter/pom.xml  | 2 +-
 external/zeromq/pom.xml   | 2 +-
 extras/java8-tests/pom.xml| 2 +-
 extras/kinesis-asl/pom.xml| 2 +-
 extras/spark-ganglia-lgpl/pom.xml | 2 +-
 graphx/pom.xml| 2 +-
 mllib/pom.xml | 2 +-
 network/common/pom.xml| 2 +-
 network/shuffle/pom.xml   | 2 +-
 network/yarn/pom.xml  | 2 +-
 pom.xml   | 2 +-
 repl/pom.xml  | 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 yarn/pom.xml  | 2 +-
 28 files changed, 28 insertions(+), 28 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 114dde7..67bebfc 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/bagel/pom.xml
--
diff --git a/bagel/pom.xml b/bagel/pom.xml
index dea41f8..c7750a2 100644
--- a/bagel/pom.xml
+++ b/bagel/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index 928159d..745b8be 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/examples/pom.xml
--
diff --git a/examples/pom.xml b/examples/pom.xml
index 73ab234..9f03cbd 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/external/flume-sink/pom.xml
--
diff --git a/external/flume-sink/pom.xml b/external/flume-sink/pom.xml
index 1a5aaf5..46adbe2 100644
--- a/external/flume-sink/pom.xml
+++ b/external/flume-sink/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/0dcb5d9f/external/flume/pom.xml
--
diff --git a/external/flume/pom.xml b/external/flume/pom.xml
index d5539d9..5fc589f 100644
--- a/external/flume/pom.xml
+++ b/external/flume/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1-SNAPSHOT</version>
+    <version>1.3.1</version>

[2/2] spark git commit: Preparing development version 1.3.2-SNAPSHOT

2015-04-04 Thread pwendell
Preparing development version 1.3.2-SNAPSHOT


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/728c1f92
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/728c1f92
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/728c1f92

Branch: refs/heads/branch-1.3
Commit: 728c1f927822eb6b12f04dc47109feb6fbe02ec2
Parents: 0dcb5d9
Author: Patrick Wendell patr...@databricks.com
Authored: Sat Apr 4 20:25:26 2015 +
Committer: Patrick Wendell patr...@databricks.com
Committed: Sat Apr 4 20:25:26 2015 +

--
 assembly/pom.xml  | 2 +-
 bagel/pom.xml | 2 +-
 core/pom.xml  | 2 +-
 examples/pom.xml  | 2 +-
 external/flume-sink/pom.xml   | 2 +-
 external/flume/pom.xml| 2 +-
 external/kafka-assembly/pom.xml   | 2 +-
 external/kafka/pom.xml| 2 +-
 external/mqtt/pom.xml | 2 +-
 external/twitter/pom.xml  | 2 +-
 external/zeromq/pom.xml   | 2 +-
 extras/java8-tests/pom.xml| 2 +-
 extras/kinesis-asl/pom.xml| 2 +-
 extras/spark-ganglia-lgpl/pom.xml | 2 +-
 graphx/pom.xml| 2 +-
 mllib/pom.xml | 2 +-
 network/common/pom.xml| 2 +-
 network/shuffle/pom.xml   | 2 +-
 network/yarn/pom.xml  | 2 +-
 pom.xml   | 2 +-
 repl/pom.xml  | 2 +-
 sql/catalyst/pom.xml  | 2 +-
 sql/core/pom.xml  | 2 +-
 sql/hive-thriftserver/pom.xml | 2 +-
 sql/hive/pom.xml  | 2 +-
 streaming/pom.xml | 2 +-
 tools/pom.xml | 2 +-
 yarn/pom.xml  | 2 +-
 28 files changed, 28 insertions(+), 28 deletions(-)
--


http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/assembly/pom.xml
--
diff --git a/assembly/pom.xml b/assembly/pom.xml
index 67bebfc..0952cd2 100644
--- a/assembly/pom.xml
+++ b/assembly/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/bagel/pom.xml
--
diff --git a/bagel/pom.xml b/bagel/pom.xml
index c7750a2..602cc7b 100644
--- a/bagel/pom.xml
+++ b/bagel/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/core/pom.xml
--
diff --git a/core/pom.xml b/core/pom.xml
index 745b8be..d95847d 100644
--- a/core/pom.xml
+++ b/core/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/examples/pom.xml
--
diff --git a/examples/pom.xml b/examples/pom.xml
index 9f03cbd..e1a3ecc 100644
--- a/examples/pom.xml
+++ b/examples/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/external/flume-sink/pom.xml
--
diff --git a/external/flume-sink/pom.xml b/external/flume-sink/pom.xml
index 46adbe2..f46a2a0 100644
--- a/external/flume-sink/pom.xml
+++ b/external/flume-sink/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 

http://git-wip-us.apache.org/repos/asf/spark/blob/728c1f92/external/flume/pom.xml
--
diff --git a/external/flume/pom.xml b/external/flume/pom.xml
index 5fc589f..02331e8 100644
--- a/external/flume/pom.xml
+++ b/external/flume/pom.xml
@@ -21,7 +21,7 @@
   <parent>
     <groupId>org.apache.spark</groupId>
     <artifactId>spark-parent_2.10</artifactId>
-    <version>1.3.1</version>
+    <version>1.3.2-SNAPSHOT</version>
     <relativePath>../../pom.xml</relativePath>
   </parent>
 


Git Push Summary

2015-04-04 Thread pwendell
Repository: spark
Updated Tags:  refs/tags/v1.3.1-rc1 [created] 0dcb5d9f3

-
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org