http://git-wip-us.apache.org/repos/asf/spark/blob/04e44b37/examples/src/main/python/sort.py
--
diff --git a/examples/src/main/python/sort.py b/examples/src/main/python/sort.py
index bb686f1..f6b0ecb 100755
---
Author: pwendell
Date: Fri Apr 17 05:52:53 2015
New Revision: 1674217
URL: http://svn.apache.org/r1674217
Log:
Adding docs for 1.2.2 and 1.3.1
[This commit notification would consist of 1529 parts,
which exceeds the limit of 50, so it was shortened to the summary.]
Repository: spark
Updated Branches:
refs/heads/master 55f553a97 -> 04e44b37c
http://git-wip-us.apache.org/repos/asf/spark/blob/04e44b37/python/pyspark/tests.py
--
diff --git a/python/pyspark/tests.py b/python/pyspark/tests.py
Repository: spark
Updated Branches:
refs/heads/master 4527761bc -> f6a9a57a7
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
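The intent of the fix can be sketched in Python (function names here are hypothetical; the real logic lives in the Bourne-shell script sbin/spark-daemon.sh):

```python
import subprocess

def args_match_daemon(ps_output: str, daemon_class: str) -> bool:
    """Decide whether `ps -o args=` output belongs to the expected daemon.

    Comparing the full argument string, not just the (truncated) command
    name, avoids mistaking an unrelated process that recycled the PID
    for a Spark daemon with a long java command line.
    """
    return daemon_class in ps_output.strip()

def daemon_alive(pid: int, daemon_class: str) -> bool:
    """Hypothetical wrapper around `ps -p PID -o args=`."""
    try:
        out = subprocess.check_output(
            ["ps", "-p", str(pid), "-o", "args="], text=True
        )
    except subprocess.CalledProcessError:
        return False  # no process with that PID: daemon is not running
    return args_match_daemon(out, daemon_class)
```

The key design point is matching against the full command line, since a long java invocation makes the short command name alone ambiguous.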
Repository: spark
Updated Branches:
refs/heads/master 8220d5265 -> f7a25644e
SPARK-6846 [WEBUI] Stage kill URL easy to accidentally trigger and possibility
for security issue
kill endpoints now only accept a POST (kill stage, master kill app, master kill
driver); kill link now POSTs
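A minimal sketch of the method guard, assuming a hypothetical handler shape (the real endpoints are Spark UI servlets written in Scala):

```python
def handle_kill_request(method: str, params: dict) -> int:
    """Return an HTTP status code for a kill request (illustrative shape).

    Destructive endpoints accept only POST, so a crawler, link
    prefetcher, or accidentally followed GET link cannot kill a stage;
    the UI's kill link submits a POST form instead of a plain href.
    """
    if method != "POST":
        return 405  # Method Not Allowed: GETs must be side-effect free
    if "id" not in params:
        return 400  # Bad Request: nothing to kill
    # ... perform the kill here ...
    return 200
```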
Author:
Repository: spark
Updated Branches:
refs/heads/branch-1.3 6d3c4d8b0 -> 47fb78c62
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
Repository: spark
Updated Branches:
refs/heads/branch-1.2 9677b4435 -> e1e7fc017
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
Repository: spark
Updated Branches:
refs/heads/master f6a9a57a7 -> dc48ba9f9
[SPARK-6604][PySpark] Specify IP of Python server socket
The driver currently starts a server socket bound to a wildcard IP; using
127.0.0.1 is more reasonable, as it is only used by the local Python process.
/cc davies
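The change amounts to binding the listening socket to loopback rather than the wildcard address; a minimal Python sketch:

```python
import socket

# Bind the driver-side serving socket to loopback rather than the
# wildcard address: only the local Python process connects to it, so
# there is no reason to expose it on every network interface.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # port 0: let the OS pick a free port
server.listen(1)
host, port = server.getsockname()
server.close()
```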
Author:
Repository: spark
Updated Branches:
refs/heads/master c84d91692 -> 50ab8a654
[SPARK-2669] [yarn] Distribute client configuration to AM.
Currently, when Spark launches the Yarn AM, the process will use
the local Hadoop configuration on the node where the AM launches,
if one is present. A more
[SPARK-6113] [ml] Stabilize DecisionTree API
This is a PR for cleaning up and finalizing the DecisionTree API. PRs for
ensembles will follow once this is merged.
### Goal
Here is the description copied from the JIRA (for both trees and ensembles):
**Issue**: The APIs for DecisionTree and
Repository: spark
Updated Branches:
refs/heads/master 50ab8a654 -> a83571acc
http://git-wip-us.apache.org/repos/asf/spark/blob/a83571ac/mllib/src/main/scala/org/apache/spark/mllib/tree/model/treeEnsembleModels.scala
--
diff
Repository: spark
Updated Branches:
refs/heads/master 59e206deb -> d305e686b
SPARK-6988 : Fix documentation regarding DataFrames using the Java API
This patch includes :
* adding how to use map after an SQL query using javaRDD
* fixing the first few Java examples that were written in Scala
Repository: spark
Updated Branches:
refs/heads/master a83571acc -> 59e206deb
[SPARK-6807] [SparkR] Merge recent SparkR-pkg changes
This PR pulls in recent changes in SparkR-pkg, including
cartesian, intersection, sampleByKey, subtract, subtractByKey, except, and some
API for StructType and
Repository: spark
Updated Branches:
refs/heads/master d305e686b -> a452c5921
Minor fix to SPARK-6958: Improve Python docstring for DataFrame.sort.
As a follow up PR to #5544.
cc davies
Author: Reynold Xin r...@databricks.com
Closes #5558 from rxin/sort-doc-improvement and squashes the
Repository: spark
Updated Branches:
refs/heads/branch-1.3 47fb78c62 -> 6b528dc13
SPARK-6988 : Fix documentation regarding DataFrames using the Java API
This patch includes :
* adding how to use map after an SQL query using javaRDD
* fixing the first few Java examples that were written in
Added: spark/site/news/spark-1-2-2-released.html
URL:
http://svn.apache.org/viewvc/spark/site/news/spark-1-2-2-released.html?rev=1674414&view=auto
==
--- spark/site/news/spark-1-2-2-released.html (added)
+++
Modified: spark/site/releases/spark-release-1-2-0.html
URL:
http://svn.apache.org/viewvc/spark/site/releases/spark-release-1-2-0.html?rev=1674414&r1=1674413&r2=1674414&view=diff
==
---
Author: pwendell
Date: Fri Apr 17 22:45:03 2015
New Revision: 1674414
URL: http://svn.apache.org/r1674414
Log:
Adding 1.2.2 and 1.3.1 releases
Added:
spark/site/news/spark-1-2-2-released.html
spark/site/releases/spark-release-1-2-2.html
spark/site/releases/spark-release-1-3-1.html
Repository: spark
Updated Branches:
refs/heads/master dc48ba9f9 -> c84d91692
[SPARK-6957] [SPARK-6958] [SQL] improve API compatibility to pandas
```
select(['cola', 'colb'])
groupby(['colA', 'colB'])
groupby([df.colA, df.colB])
df.sort('A', ascending=True)
df.sort(['A', 'B'], ascending=True)
```
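The pandas-style multi-key `ascending` semantics can be illustrated with plain Python over lists of dicts (illustrative only, not Spark's implementation; `sort_rows` is a made-up helper):

```python
def sort_rows(rows, keys, ascending=True):
    """Sort a list of dict rows pandas-style.

    `keys` may be a single column name or a list of names; `ascending`
    may be a single bool or a list of bools, one per key.
    """
    if isinstance(keys, str):
        keys = [keys]
    if isinstance(ascending, bool):
        ascending = [ascending] * len(keys)
    if len(ascending) != len(keys):
        raise ValueError("length of ascending must match number of keys")
    # Python's sort is stable: applying keys right-to-left makes the
    # leftmost key dominate the final ordering.
    for key, asc in reversed(list(zip(keys, ascending))):
        rows = sorted(rows, key=lambda r: r[key], reverse=not asc)
    return rows
```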
Repository: spark
Updated Branches:
refs/heads/master 6fbeb82e1 -> 199133733
[SPARK-5933] [core] Move config deprecation warnings to SparkConf.
I didn't find many deprecated configs after a grep-based search,
but the ones I could find were moved to the centralized location
in SparkConf.
While
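The centralization idea, as a hedged sketch (the old-key/new-key pair shown is hypothetical; the real mapping lives in SparkConf's deprecation table):

```python
import warnings

# Hypothetical old-key -> new-key translation table.
_DEPRECATED = {
    "spark.yarn.user.classpath.first": "spark.driver.userClassPathFirst",
}

class Conf:
    """Toy config object with one central place for deprecation handling."""

    def __init__(self):
        self._settings = {}

    def set(self, key, value):
        # Warn and translate deprecated keys here, once, instead of
        # scattering ad-hoc checks across the codebase.
        if key in _DEPRECATED:
            new_key = _DEPRECATED[key]
            warnings.warn(f"{key} is deprecated; use {new_key} instead")
            key = new_key
        self._settings[key] = value
        return self

    def get(self, key, default=None):
        return self._settings.get(key, default)
```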
Repository: spark
Updated Branches:
refs/heads/master 199133733 -> d850b4bd3
[SPARK-6975][Yarn] Fix argument validation error
`numExecutors` checking fails when dynamic allocation is enabled with the
default configuration. Details can be seen in
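A sketch of the corrected validation, with illustrative names:

```python
def validate_num_executors(num_executors, dynamic_allocation: bool):
    """Validate the executor-count argument (illustrative shape).

    With dynamic allocation enabled the executor count is managed at
    runtime, so a missing or zero --num-executors must not be rejected.
    """
    if dynamic_allocation:
        return  # executor count is negotiated dynamically; skip the check
    if num_executors is None or num_executors <= 0:
        raise ValueError("Number of executors must be a positive number")
```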
Repository: spark
Updated Branches:
refs/heads/master c5ed51013 -> 6fbeb82e1
[SPARK-6350][Mesos] Make mesosExecutorCores configurable in mesos
fine-grained mode
- Defined executorCores from spark.mesos.executor.cores
- Changed the amount of mesosExecutor's cores to executorCores.
- Added new
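Reading the new knob might look like this (a sketch, assuming a default of 1 core when the property is unset):

```python
DEFAULT_EXECUTOR_CORES = 1.0  # assumed fallback when the knob is unset

def mesos_executor_cores(conf: dict) -> float:
    """Return the CPU share each fine-grained Mesos executor reserves.

    Previously a fixed amount; now read from spark.mesos.executor.cores
    so operators can shrink the per-executor reservation.
    """
    return float(conf.get("spark.mesos.executor.cores",
                          DEFAULT_EXECUTOR_CORES))
```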
Repository: spark
Updated Branches:
refs/heads/master a452c5921 -> c5ed51013
[SPARK-6703][Core] Provide a way to discover existing SparkContext's
I've added a static getOrCreate method to the static SparkContext object that
allows one to either retrieve a previously created SparkContext or to
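The getOrCreate pattern itself, sketched with a stand-in class (not the real SparkContext):

```python
class SparkContextLike:
    """Minimal stand-in demonstrating the getOrCreate pattern."""

    _active = None  # the single currently-active context, if any

    def __init__(self, app_name: str):
        self.app_name = app_name

    @classmethod
    def get_or_create(cls, app_name: str = "default"):
        # Return the already-running context if one exists; otherwise
        # create it. This lets libraries share one context instead of
        # failing when a second one is constructed.
        if cls._active is None:
            cls._active = cls(app_name)
        return cls._active
```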