Repository: spark
Updated Branches:
refs/heads/branch-1.0 4708eff67 -> 39ac62d6c
[SQL] SPARK-1732 - Support for null primitive values.
I also removed a println that I bumped into.
Author: Michael Armbrust
Closes #658 from marmbrus/nullPrimitives and squashes the following commits:
a3ec4f3
Repository: spark
Updated Branches:
refs/heads/master a2262cdb7 -> 3c64750bd
[SQL] SPARK-1732 - Support for null primitive values.
I also removed a println that I bumped into.
Author: Michael Armbrust
Closes #658 from marmbrus/nullPrimitives and squashes the following commits:
a3ec4f3 [Mic
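The SPARK-1732 entries above concern null support for primitive SQL values. As a background illustration only (this is not Spark's implementation, and the class name `NullableIntColumn` is hypothetical): a JVM primitive `int` has no null value, so a nullable primitive column needs either per-value boxing or an explicit null mask alongside the primitive array, as sketched here.

```java
// Hypothetical sketch, not Spark's internals: a nullable primitive column
// represented as a primitive array plus a parallel null mask.
public class NullableIntColumn {
    private final int[] values;
    private final boolean[] isNull;

    public NullableIntColumn(int[] values, boolean[] isNull) {
        this.values = values;
        this.isNull = isNull;
    }

    // Returns null for SQL NULL, otherwise the boxed value.
    public Integer get(int row) {
        return isNull[row] ? null : values[row];
    }

    public static void main(String[] args) {
        NullableIntColumn col = new NullableIntColumn(
            new int[] {7, 0, 42}, new boolean[] {false, true, false});
        System.out.println(col.get(0)); // 7
        System.out.println(col.get(1)); // null
        System.out.println(col.get(2)); // 42
    }
}
```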
Repository: spark
Updated Branches:
refs/heads/master 6d721c5f7 -> a2262cdb7
[SPARK-1735] Add the missing special profiles to make-distribution.sh
73b0cbcc241cca3d318ff74340e80b02f884acbd introduced a few special profiles that
are not covered by `make-distribution.sh`. This affects hadoop
Repository: spark
Updated Branches:
refs/heads/branch-1.0 2853e56f6 -> 4708eff67
[SPARK-1735] Add the missing special profiles to make-distribution.sh
73b0cbcc241cca3d318ff74340e80b02f884acbd introduced a few special profiles that
are not covered by `make-distribution.sh`. This affects ha
Repository: spark
Updated Branches:
refs/heads/branch-1.0 32c960a01 -> 2853e56f6
[SPARK-1678][SPARK-1679] In-memory compression bug fix; compression is now
configurable, disabled by default
In-memory compression is now configurable in `SparkConf` by the
`spark.sql.inMemoryCompression.enable
Repository: spark
Updated Branches:
refs/heads/master 98750a74d -> 6d721c5f7
[SPARK-1678][SPARK-1679] In-memory compression bug fix; compression is now
configurable, disabled by default
In-memory compression is now configurable in `SparkConf` by the
`spark.sql.inMemoryCompression.enabled` p
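Per the entry above, in-memory compression is off by default and toggled through the `spark.sql.inMemoryCompression.enabled` property. A minimal configuration fragment (shown in `spark-defaults.conf` key/value style; the commit message itself only mentions setting it via `SparkConf`):

```
spark.sql.inMemoryCompression.enabled   true
```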
Repository: spark
Updated Branches:
refs/heads/branch-1.0 a5f765cab -> 32c960a01
http://git-wip-us.apache.org/repos/asf/spark/blob/32c960a0/mllib/src/test/scala/org/apache/spark/mllib/tree/DecisionTreeSuite.scala
--
diff --git
Repository: spark
Updated Branches:
refs/heads/master ea10b3126 -> 98750a74d
http://git-wip-us.apache.org/repos/asf/spark/blob/98750a74/mllib/src/test/scala/org/apache/spark/mllib/tree/DecisionTreeSuite.scala
--
diff --git
a/m
Repository: spark
Updated Branches:
refs/heads/master 8e724dcba -> ea10b3126
Expose SparkListeners and relevant classes as DeveloperApi
Hopefully this can go into 1.0, as a few people on the user list have asked for
this.
Author: Andrew Or
Closes #648 from andrewor14/expose-listeners and s
Repository: spark
Updated Branches:
refs/heads/branch-1.0 01e3ff01d -> a5f765cab
Expose SparkListeners and relevant classes as DeveloperApi
Hopefully this can go into 1.0, as a few people on the user list have asked for
this.
Author: Andrew Or
Closes #648 from andrewor14/expose-listeners a
Repository: spark
Updated Branches:
refs/heads/branch-1.0 4d0dd5079 -> 01e3ff01d
SPARK-1728. JavaRDDLike.mapPartitionsWithIndex requires ClassTag
Author: Sandy Ryza
Closes #657 from sryza/sandy-spark-1728 and squashes the following commits:
4751443 [Sandy Ryza] SPARK-1728. JavaRDDLike.mapPa
Repository: spark
Updated Branches:
refs/heads/master cf0a8f020 -> 8e724dcba
SPARK-1728. JavaRDDLike.mapPartitionsWithIndex requires ClassTag
Author: Sandy Ryza
Closes #657 from sryza/sandy-spark-1728 and squashes the following commits:
4751443 [Sandy Ryza] SPARK-1728. JavaRDDLike.mapPartit
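SPARK-1728 above is about `JavaRDDLike.mapPartitionsWithIndex` demanding a Scala `ClassTag` from Java callers. As background for why such a runtime class token exists at all (a generic JVM illustration, not Spark code; `newArray` is a hypothetical helper): generics are erased on the JVM, so constructing a `T[]` requires an explicit `Class<T>` at runtime, which is the role `ClassTag` plays in Scala and why requiring one from the Java API was awkward.

```java
import java.lang.reflect.Array;

// Generic-erasure illustration: you cannot write `new T[n]` in Java, so a
// runtime class token is needed to build a typed array.
public class ErasureDemo {
    @SuppressWarnings("unchecked")
    static <T> T[] newArray(Class<T> clazz, int length) {
        return (T[]) Array.newInstance(clazz, length);
    }

    public static void main(String[] args) {
        String[] xs = newArray(String.class, 3);
        System.out.println(xs.length);
        System.out.println(xs.getClass().getComponentType() == String.class);
    }
}
```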
Repository: spark
Updated Branches:
refs/heads/branch-1.0 1fac4ecbd -> 4d0dd5079
[SPARK-1681] Include datanucleus jars in Spark Hive distribution
This copies the datanucleus jars over from `lib_managed` into `dist/lib`, if
any. The `CLASSPATH` must also be updated to reflect this change.
Aut
Repository: spark
Updated Branches:
refs/heads/master a975a19f2 -> cf0a8f020
[SPARK-1681] Include datanucleus jars in Spark Hive distribution
This copies the datanucleus jars over from `lib_managed` into `dist/lib`, if
any. The `CLASSPATH` must also be updated to reflect this change.
Author:
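The SPARK-1681 entries describe copying the datanucleus jars from `lib_managed` into `dist/lib`. A standalone sketch of that copy step (illustrative only; the real change lives in the distribution script, and `CopyDatanucleusJars` is a hypothetical name — the demo runs against a throwaway temp directory):

```java
import java.io.IOException;
import java.nio.file.*;

// Sketch: copy any datanucleus-*.jar files from a source directory into
// a dist/lib directory, creating it if needed.
public class CopyDatanucleusJars {
    static void copyDatanucleusJars(Path src, Path dst) throws IOException {
        Files.createDirectories(dst);
        try (DirectoryStream<Path> jars =
                 Files.newDirectoryStream(src, "datanucleus-*.jar")) {
            for (Path jar : jars) {
                Files.copy(jar, dst.resolve(jar.getFileName()),
                           StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Demo against a temporary directory layout
        Path root = Files.createTempDirectory("dist-demo");
        Path src = Files.createDirectories(root.resolve("lib_managed/jars"));
        Files.createFile(src.resolve("datanucleus-core-3.2.2.jar"));
        Files.createFile(src.resolve("unrelated.jar"));
        copyDatanucleusJars(src, root.resolve("dist/lib"));
        System.out.println(
            Files.exists(root.resolve("dist/lib/datanucleus-core-3.2.2.jar")));
        System.out.println(
            Files.exists(root.resolve("dist/lib/unrelated.jar")));
    }
}
```

Only jars matching the `datanucleus-*.jar` glob are copied; everything else in `lib_managed` is left out of the distribution.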
Repository: spark
Updated Branches:
refs/heads/branch-1.0 80f4360e7 -> 1fac4ecbd
[SPARK-1504], [SPARK-1505], [SPARK-1558] Updated Spark Streaming guide
- SPARK-1558: Updated the custom receiver guide to match the new API
- SPARK-1504: Added deployment and monitoring subsection to streaming
Repository: spark
Updated Branches:
refs/heads/master 3292e2a71 -> a975a19f2
[SPARK-1504], [SPARK-1505], [SPARK-1558] Updated Spark Streaming guide
- SPARK-1558: Updated the custom receiver guide to match the new API
- SPARK-1504: Added deployment and monitoring subsection to streaming
- S
Repository: spark
Updated Branches:
refs/heads/branch-1.0 5d72283ba -> 80f4360e7
SPARK-1721: Reset the thread classLoader in the Mesos Executor
This happens because Mesos launches the executor with a different environment,
so the Spark jar is missing from the thread's class loader and it cannot load classes.
Thi
Repository: spark
Updated Branches:
refs/heads/master 73b0cbcc2 -> 3292e2a71
SPARK-1721: Reset the thread classLoader in the Mesos Executor
This happens because Mesos launches the executor with a different environment,
so the Spark jar is missing from the thread's class loader and it cannot load classes.
This fi
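The SPARK-1721 fix described above resets the executor thread's context class loader. The JDK mechanism involved can be shown in plain Java (a generic illustration, not the Spark patch itself; the anonymous empty `ClassLoader` stands in for the unhelpful loader a Mesos-launched thread might carry):

```java
// Sketch of resetting a thread's context class loader so reflective lookups
// can see the application's classes again.
public class ContextClassLoaderReset {
    public static void main(String[] args) {
        Thread t = Thread.currentThread();
        ClassLoader original = t.getContextClassLoader();
        // Simulate an environment where the context class loader cannot see
        // the application's classes (parent null, loads nothing itself)
        t.setContextClassLoader(new ClassLoader(null) {});
        // The reset: point the context class loader back at the loader that
        // actually loaded this class
        t.setContextClassLoader(ContextClassLoaderReset.class.getClassLoader());
        System.out.println(t.getContextClassLoader()
                == ContextClassLoaderReset.class.getClassLoader());
        t.setContextClassLoader(original); // restore for cleanliness
    }
}
```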
Repository: spark
Updated Branches:
refs/heads/branch-1.0 6be72265a -> 5d72283ba
SPARK-1556. jets3t dep doesn't update properly with newer Hadoop versions
See related discussion at https://github.com/apache/spark/pull/468
This PR may still overstep what you have in mind, but let me put it on
Repository: spark
Updated Branches:
refs/heads/master f2eb070ac -> 73b0cbcc2
SPARK-1556. jets3t dep doesn't update properly with newer Hadoop versions
See related discussion at https://github.com/apache/spark/pull/468
This PR may still overstep what you have in mind, but let me put it on the
Repository: spark
Updated Branches:
refs/heads/master bb2bb0cf6 -> f2eb070ac
Updated doc for spark.closure.serializer to indicate that only the Java serializer works.
See discussion from
http://apache-spark-developers-list.1001551.n3.nabble.com/bug-using-kryo-as-closure-serializer-td6473.html
Author:
Repository: spark
Updated Branches:
refs/heads/branch-1.0 b5c62c887 -> 6be72265a
Updated doc for spark.closure.serializer to indicate that only the Java serializer works.
See discussion from
http://apache-spark-developers-list.1001551.n3.nabble.com/bug-using-kryo-as-closure-serializer-td6473.html
Auth
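Per the doc update above, the closure serializer must stay on the Java serializer; Kryo is not supported for closures. The corresponding setting, shown as a `spark-defaults.conf`-style fragment (this is the default, so it normally does not need to be set at all):

```
spark.closure.serializer   org.apache.spark.serializer.JavaSerializer
```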