Repository: spark
Updated Branches:
refs/heads/branch-2.0 7dc3fb6ae -> 42f2ee6c5
[SPARK-11395][SPARKR] Support over and window specification in SparkR.
This PR:
1. Implements the WindowSpec S4 class.
2. Implements Window.partitionBy() and Window.orderBy() as utility functions to
create a WindowSpec.
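Conceptually, `over` with a window specification partitions the rows, orders each partition, and computes a value per row within its partition. A minimal plain-Python sketch of that behavior (the `rank_over` helper is illustrative, not part of SparkR):

```python
from itertools import groupby
from operator import itemgetter

def rank_over(rows, partition_key, order_key):
    """Row-number-style rank within each partition, ordered by order_key.

    Mimics rank().over(Window.partitionBy(...).orderBy(...)) on tiny data.
    """
    ordered = sorted(rows, key=lambda r: (r[partition_key], r[order_key]))
    out = []
    for _, group in groupby(ordered, key=itemgetter(partition_key)):
        for rank, row in enumerate(group, start=1):
            out.append({**row, "rank": rank})
    return out

sales = [
    {"dept": "a", "amount": 30},
    {"dept": "a", "amount": 10},
    {"dept": "b", "amount": 20},
]
ranked = rank_over(sales, "dept", "amount")
```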
Repository: spark
Updated Branches:
refs/heads/branch-2.0 a1887f213 -> 7dc3fb6ae
[HOTFIX] Fix MLUtils compile
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7dc3fb6a
Repository: spark
Updated Branches:
refs/heads/master bbb777343 -> 7f5922aa4
[HOTFIX] Fix MLUtils compile
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7f5922aa
Repository: spark
Updated Branches:
refs/heads/branch-2.0 1064a3303 -> a1887f213
[SPARK-15152][DOC][MINOR] Scaladoc and Code style Improvements
## What changes were proposed in this pull request?
Minor doc and code style fixes
## How was this patch tested?
local build
Author: Jacek
Repository: spark
Updated Branches:
refs/heads/master 02c07e899 -> bbb777343
[SPARK-15152][DOC][MINOR] Scaladoc and Code style Improvements
## What changes were proposed in this pull request?
Minor doc and code style fixes
## How was this patch tested?
local build
Author: Jacek Laskowski
Repository: spark
Updated Branches:
refs/heads/branch-2.0 80a4bfa4d -> 1064a3303
[SPARK-14893][SQL] Re-enable HiveSparkSubmitSuite SPARK-8489 test after
HiveContext is removed
## What changes were proposed in this pull request?
Enable the test that was disabled when HiveContext was removed.
Repository: spark
Updated Branches:
refs/heads/master 08db49126 -> 02c07e899
[SPARK-14893][SQL] Re-enable HiveSparkSubmitSuite SPARK-8489 test after
HiveContext is removed
## What changes were proposed in this pull request?
Enable the test that was disabled when HiveContext was removed.
Repository: spark
Updated Branches:
refs/heads/branch-2.0 19a14e841 -> 80a4bfa4d
[SPARK-9926] Parallelize partition logic in UnionRDD.
This patch has the new logic from #8512 that uses a parallel collection to
compute partitions in UnionRDD. The rest of #8512 added an alternative code
path
Repository: spark
Updated Branches:
refs/heads/master 5c47db065 -> 08db49126
[SPARK-9926] Parallelize partition logic in UnionRDD.
This patch has the new logic from #8512 that uses a parallel collection to
compute partitions in UnionRDD. The rest of #8512 added an alternative code
path for
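The core idea of the patch is to compute each child RDD's partition array concurrently rather than sequentially. A hedged Python sketch of the same idea using a thread pool (the data shapes here are illustrative, not Spark's actual types):

```python
from concurrent.futures import ThreadPoolExecutor

def union_partitions(rdds):
    """Compute each child RDD's partition list in parallel, then concatenate
    them in order, analogous to UnionRDD.getPartitions with a parallel
    collection."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        per_rdd = list(pool.map(lambda rdd: rdd["partitions"], rdds))
    return [p for parts in per_rdd for p in parts]

rdds = [{"partitions": [0, 1]}, {"partitions": [2]}, {"partitions": [3, 4]}]
```

`pool.map` preserves input order, so the flattened result matches the sequential version while the per-RDD work overlaps.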
Repository: spark
Updated Branches:
refs/heads/branch-2.0 8b4ab590c -> 19a14e841
[SPARK-15158][CORE] downgrade shouldRollover message to debug level
## What changes were proposed in this pull request?
Set the log level to debug when checking shouldRollover.
## How was this patch tested?
It's tested
Repository: spark
Updated Branches:
refs/heads/master 2c170dd3d -> 5c47db065
[SPARK-15158][CORE] downgrade shouldRollover message to debug level
## What changes were proposed in this pull request?
Set the log level to debug when checking shouldRollover.
## How was this patch tested?
It's tested
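The change itself is just a log-level downgrade for a frequently executed check. A sketch of the pattern with Python's `logging` module (the rollover policy shown is hypothetical):

```python
import logging

logger = logging.getLogger("rolling")

def should_rollover(current_size, max_size):
    # This check runs on every write, so log it at DEBUG rather than INFO
    # to avoid flooding the logs.
    logger.debug("shouldRollover: size=%d max=%d", current_size, max_size)
    return current_size >= max_size
```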
[SPARK-15134][EXAMPLE] Indent SparkSession builder patterns and update
binary_classification_metrics_example.py
## What changes were proposed in this pull request?
This issue addresses the comments in SPARK-15031 and also fixes java-linter
errors.
- Use multiline format in SparkSession builder
[SPARK-15134][EXAMPLE] Indent SparkSession builder patterns and update
binary_classification_metrics_example.py
## What changes were proposed in this pull request?
This issue addresses the comments in SPARK-15031 and also fixes java-linter
errors.
- Use multiline format in SparkSession builder
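The multiline builder format relies on each setter returning the builder itself so calls can chain across indented lines. A small self-contained Python sketch of that pattern (`SessionBuilder` is illustrative, not PySpark's builder):

```python
class SessionBuilder:
    """Fluent builder: each setter returns self, so calls chain across
    multiple indented lines, which is the multiline format in question."""

    def __init__(self):
        self.options = {}

    def app_name(self, name):
        self.options["appName"] = name
        return self

    def config(self, key, value):
        self.options[key] = value
        return self

    def get_or_create(self):
        return dict(self.options)

session = (SessionBuilder()
    .app_name("example")
    .config("spark.some.option", "value")
    .get_or_create())
```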
Repository: spark
Updated Branches:
refs/heads/master bb9991dec -> 2c170dd3d
http://git-wip-us.apache.org/repos/asf/spark/blob/2c170dd3/examples/src/main/python/ml/vector_indexer_example.py
Repository: spark
Updated Branches:
refs/heads/branch-2.0 e78b31b72 -> 8b4ab590c
http://git-wip-us.apache.org/repos/asf/spark/blob/8b4ab590/examples/src/main/python/ml/vector_indexer_example.py
Repository: spark
Updated Branches:
refs/heads/branch-2.0 59fa480b6 -> e78b31b72
[SPARK-15135][SQL] Make sure SparkSession thread safe
## What changes were proposed in this pull request?
Went through SparkSession and its members and fixed non-thread-safe classes
used by SparkSession
Repository: spark
Updated Branches:
refs/heads/master ed6f3f8a5 -> bb9991dec
[SPARK-15135][SQL] Make sure SparkSession thread safe
## What changes were proposed in this pull request?
Went through SparkSession and its members and fixed non-thread-safe classes
used by SparkSession
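A typical fix for such classes is double-checked lazy initialization guarded by a lock, so concurrent callers observe exactly one initialization. A plain-Python sketch of the pattern (not Spark's actual code):

```python
import threading

class LazySession:
    """Double-checked lazy initialization guarded by a lock, so concurrent
    callers observe exactly one initialization."""

    def __init__(self, factory):
        self._factory = factory
        self._lock = threading.Lock()
        self._value = None

    def get(self):
        if self._value is None:      # fast path, no lock once initialized
            with self._lock:         # slow path, serialize initializers
                if self._value is None:
                    self._value = self._factory()
        return self._value

calls = []
shared = LazySession(lambda: calls.append(1) or "state")
```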
Repository: spark
Updated Branches:
refs/heads/branch-2.0 fe268ee1e -> 59fa480b6
[SPARK-15072][SQL][REPL][EXAMPLES] Remove SparkSession.withHiveSupport
## What changes were proposed in this pull request?
Remove the `withHiveSupport` method of `SparkSession`; use
`enableHiveSupport` instead
Repository: spark
Updated Branches:
refs/heads/master 8cba57a75 -> ed6f3f8a5
[SPARK-15072][SQL][REPL][EXAMPLES] Remove SparkSession.withHiveSupport
## What changes were proposed in this pull request?
Remove the `withHiveSupport` method of `SparkSession`; use
`enableHiveSupport` instead
Repository: spark
Updated Branches:
refs/heads/branch-2.0 b063d9b71 -> fe268ee1e
[SPARK-14124][SQL][FOLLOWUP] Implement Database-related DDL Commands
## What changes were proposed in this pull request?
First, a few test cases failed on Mac OS X because the property value of
Repository: spark
Updated Branches:
refs/heads/master 63db2bd28 -> 8cba57a75
[SPARK-14124][SQL][FOLLOWUP] Implement Database-related DDL Commands
## What changes were proposed in this pull request?
First, a few test cases failed on Mac OS X because the property value of
`java.io.tmpdir`
Repository: spark
Updated Branches:
refs/heads/branch-2.0 c2b100e50 -> b063d9b71
[MINOR][BUILD] Adds spark-warehouse/ to .gitignore
## What changes were proposed in this pull request?
Adds spark-warehouse/ to `.gitignore`.
## How was this patch tested?
N/A
Author: Cheng Lian
Repository: spark
Updated Branches:
refs/heads/master 6fcc9 -> 63db2bd28
[MINOR][BUILD] Adds spark-warehouse/ to .gitignore
## What changes were proposed in this pull request?
Adds spark-warehouse/ to `.gitignore`.
## How was this patch tested?
N/A
Author: Cheng Lian
Repository: spark
Updated Branches:
refs/heads/branch-1.6 2db19a3af -> bf3c0608f
[SPARK-14915][CORE] Don't re-queue a task if another attempt has already
succeeded
Don't re-queue a task if another attempt has already succeeded. This currently
happens when a speculative task is denied from
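The fix boils down to a guard in the scheduler's re-queue path: a failed attempt is re-queued only if no sibling attempt of the same task has already succeeded. A minimal sketch (function and state names are hypothetical):

```python
def maybe_requeue(pending, succeeded, task_id):
    """Re-queue a failed task attempt only if no other attempt of the same
    task has already succeeded."""
    if task_id in succeeded:
        return pending  # another attempt already finished; drop this one
    return pending + [task_id]
```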
Repository: spark
Updated Branches:
refs/heads/branch-2.0 4ec5d9345 -> c2b100e50
[SPARK-15110] [SPARKR] Implement repartitionByColumn for SparkR DataFrames
## What changes were proposed in this pull request?
Implement repartitionByColumn on DataFrame.
This will allow us to run R functions on
Repository: spark
Updated Branches:
refs/heads/master ac12b35d3 -> 6fcc9
[SPARK-15110] [SPARKR] Implement repartitionByColumn for SparkR DataFrames
## What changes were proposed in this pull request?
Implement repartitionByColumn on DataFrame.
This will allow us to run R functions on
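Repartitioning by a column amounts to routing each row to a partition chosen by hashing that column's value, so rows sharing a value are colocated. A plain-Python sketch of the idea (not SparkR's implementation):

```python
def repartition_by_column(rows, column, num_partitions):
    """Route each row to the partition chosen by hashing its column value,
    so rows sharing a value land in the same partition."""
    partitions = [[] for _ in range(num_partitions)]
    for row in rows:
        partitions[hash(row[column]) % num_partitions].append(row)
    return partitions

rows = [{"k": 1}, {"k": 2}, {"k": 1}]
parts = repartition_by_column(rows, "k", 2)
```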
Repository: spark
Updated Branches:
refs/heads/branch-2.0 346811141 -> 4ec5d9345
[SPARK-15148][SQL] Upgrade Univocity library from 2.0.2 to 2.1.0
## What changes were proposed in this pull request?
https://issues.apache.org/jira/browse/SPARK-15148
Mainly it improves the performance roughly
Repository: spark
Updated Branches:
refs/heads/master 77361a433 -> 55cc1c991
[SPARK-14139][SQL] RowEncoder should preserve schema nullability
## What changes were proposed in this pull request?
The problem is: In `RowEncoder`, we use `Invoke` to get the field of an
external row, which loses
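Preserving schema nullability means the encoder must carry each field's nullable flag through the round trip instead of discarding it. A simplified Python sketch of an encoder that enforces the flag (the schema shape here is illustrative):

```python
def encode_row(row, schema):
    """Serialize a row against (name, nullable) field specs; reject None in
    a non-nullable field instead of silently dropping the nullability flag."""
    encoded = []
    for name, nullable in schema:
        value = row[name]
        if value is None and not nullable:
            raise ValueError(f"field {name} is not nullable")
        encoded.append(value)
    return encoded

schema = [("id", False), ("name", True)]
encoded = encode_row({"id": 1, "name": None}, schema)
```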
Repository: spark
Updated Branches:
refs/heads/branch-2.0 666eb0118 -> 80b49be91
[SPARK-14915][CORE] Don't re-queue a task if another attempt has already
succeeded
## What changes were proposed in this pull request?
Don't re-queue a task if another attempt has already succeeded. This
Repository: spark
Updated Branches:
refs/heads/master 104430223 -> 77361a433
[SPARK-14915][CORE] Don't re-queue a task if another attempt has already
succeeded
## What changes were proposed in this pull request?
Don't re-queue a task if another attempt has already succeeded. This currently
Repository: spark
Updated Branches:
refs/heads/master 4c0d827cf -> 104430223
[SPARK-14589][SQL] Enhance DB2 JDBC Dialect docker tests
## What changes were proposed in this pull request?
Enhance the DB2 JDBC Dialect docker tests, as they seemed to have had some
issues on the previous merge
Repository: spark
Updated Branches:
refs/heads/branch-2.0 0c4e42bea -> 743f07d74
[SPARK-15106][PYSPARK][ML] Add PySpark package doc for ML component & remove
"BETA"
## What changes were proposed in this pull request?
Copy the package documentation from Scala/Java to Python for ML package
Repository: spark
Updated Branches:
refs/heads/master b7fdc23cc -> 4c0d827cf
[SPARK-15106][PYSPARK][ML] Add PySpark package doc for ML component & remove
"BETA"
## What changes were proposed in this pull request?
Copy the package documentation from Scala/Java to Python for ML package and
Repository: spark
Updated Branches:
refs/heads/branch-2.0 433bc34b1 -> 0c4e42bea
[SPARK-12154] Upgrade to Jersey 2
## What changes were proposed in this pull request?
Replace com.sun.jersey with org.glassfish.jersey. Changes to the Spark Web UI
code were required to compile. The changes
Repository: spark
Updated Branches:
refs/heads/master 592fc4556 -> b7fdc23cc
[SPARK-12154] Upgrade to Jersey 2
## What changes were proposed in this pull request?
Replace com.sun.jersey with org.glassfish.jersey. Changes to the Spark Web UI
code were required to compile. The changes were
Repository: spark
Updated Branches:
refs/heads/branch-2.0 e28d21d3f -> 433bc34b1
[SPARK-15123] upgrade org.json4s to 3.2.11 version
## What changes were proposed in this pull request?
We hit this issue when using Snowplow in our Spark applications: Snowplow
requires json4s version 3.2.11
Repository: spark
Updated Branches:
refs/heads/master 1a5c6fcef -> 592fc4556
[SPARK-15123] upgrade org.json4s to 3.2.11 version
## What changes were proposed in this pull request?
We hit this issue when using Snowplow in our Spark applications: Snowplow
requires json4s version 3.2.11 while