Repository: spark
Updated Branches:
refs/heads/branch-1.4 b0460f414 - 869a52d9c
[SPARK-7403] [WEBUI] Link URL in objects on Timeline View is wrong in case of
running on YARN
When we use Spark on YARN and access AllJobPage via ResourceManager's proxy, the
link URL in objects which represent
Repository: spark
Updated Branches:
refs/heads/master 12b95abc7 - 7d0f17208
[STREAMING] [DOCS] Fix wrong url about API docs of StreamingListener
A small fix for a wrong url in the API document.
(org.apache.spark.streaming.scheduler.StreamingListener)
Author: dobashim
Repository: spark
Updated Branches:
refs/heads/branch-1.4 869a52d9c - 5dbc7bbe3
[STREAMING] [DOCS] Fix wrong url about API docs of StreamingListener
A small fix for a wrong url in the API document.
(org.apache.spark.streaming.scheduler.StreamingListener)
Author: dobashim
Repository: spark
Updated Branches:
refs/heads/branch-1.4 063451068 - 93af96a2f
[SPARK-6653] [YARN] New config to specify port for sparkYarnAM actor system
Author: shekhar.bansal shekhar.ban...@guavus.com
Closes #5719 from zuxqoj/master and squashes the following commits:
5574ff7
Repository: spark
Updated Branches:
refs/heads/master 4d29867ed - fc8feaa8e
[SPARK-6653] [YARN] New config to specify port for sparkYarnAM actor system
Author: shekhar.bansal shekhar.ban...@guavus.com
Closes #5719 from zuxqoj/master and squashes the following commits:
5574ff7
Repository: spark
Updated Branches:
refs/heads/branch-1.4 82be68f10 - 62308097b
[SPARK-7490] [CORE] [Minor] MapOutputTracker.deserializeMapStatuses: close
input streams
GZIPInputStream allocates native memory that is not freed until close() or
when the finalizer runs. It is best to close()
Repository: spark
Updated Branches:
refs/heads/master 4b3bb0e43 - 25889d8d9
[SPARK-7490] [CORE] [Minor] MapOutputTracker.deserializeMapStatuses: close
input streams
GZIPInputStream allocates native memory that is not freed until close() or
when the finalizer runs. It is best to close() these
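The pattern the commit describes is easy to sketch in plain Java (an illustrative example, not Spark's actual MapOutputTracker code): close the GZIPInputStream eagerly, e.g. with try-with-resources, so the native zlib memory is released immediately rather than whenever the finalizer happens to run.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipClose {
    // Decompress bytes, releasing the native zlib buffers as soon as we
    // are done instead of waiting for the finalizer to run.
    static byte[] decompress(byte[] compressed) throws IOException {
        try (GZIPInputStream in =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        } // in.close() runs here, freeing the native memory immediately
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write("map statuses".getBytes("UTF-8"));
        }
        System.out.println(new String(decompress(bos.toByteArray()), "UTF-8"));
    }
}
```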
Repository: spark
Updated Branches:
refs/heads/master 8e935b0a2 - 5438f49cc
[SPARK-2018] [CORE] Upgrade LZF library to fix endian serialization p…
…roblem
Pick up newer version of dependency with fix for SPARK-2018. The update
involved patching the ning/compress LZF library to handle
Repository: spark
Updated Branches:
refs/heads/branch-1.3 b152c6cc2 - 92fe5b649
[SPARK-2018] [CORE] Upgrade LZF library to fix endian serialization p…
…roblem
Pick up newer version of dependency with fix for SPARK-2018. The update
involved patching the ning/compress LZF library to
Repository: spark
Updated Branches:
refs/heads/master c1080b6fd - 7fb715de6
[SPARK-7249] Updated Hadoop dependencies due to inconsistency in the versions
Updated Hadoop dependencies due to inconsistency in the versions. Now the
global properties are the ones used by the hadoop-2.2 profile,
Repository: spark
Updated Branches:
refs/heads/branch-1.4 856619d48 - 8bde352bd
[BUILD] update jblas dependency version to 1.2.4
jblas 1.2.4 includes native library support for PPC64LE.
Author: Matthew Brandyberry mbra...@us.ibm.com
Closes #6199 from mtbrandy/jblas-1.2.4 and squashes the
Repository: spark
Updated Branches:
refs/heads/master ce6391296 - 1b4e710e5
[BUILD] update jblas dependency version to 1.2.4
jblas 1.2.4 includes native library support for PPC64LE.
Author: Matthew Brandyberry mbra...@us.ibm.com
Closes #6199 from mtbrandy/jblas-1.2.4 and squashes the
Repository: spark
Updated Branches:
refs/heads/branch-1.4 898be6248 - 0feb3ded2
[SPARK-7669] Builds against Hadoop 2.6+ get inconsistent curator depend…
This adds a new profile, `hadoop-2.6`, copying over the hadoop-2.4 properties,
updating ZK to 3.4.6 and making the curator version a
Repository: spark
Updated Branches:
refs/heads/master e676fc0c6 - 3cd9ad240
[MINOR] Enhance SizeEstimator to detect IBM compressed refs and s390 …
…arch.
- zSeries 64-bit Java reports its architecture as s390x, so enhance the 64-bit
check to accommodate that value.
- SizeEstimator
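The first bullet amounts to a one-line predicate; a hypothetical Java sketch of the idea (Spark's real SizeEstimator is Scala and checks more architecture values than this):

```java
public class ArchCheck {
    // Illustrative version of the 64-bit check the commit describes:
    // zSeries JVMs report "s390x" as the architecture, so treat it as
    // 64-bit alongside the usual names containing "64".
    static boolean is64Bit(String arch) {
        return arch.contains("64") || arch.equals("s390x");
    }

    public static void main(String[] args) {
        System.out.println(is64Bit("s390x")); // true
        System.out.println(is64Bit("amd64")); // true
    }
}
```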
Repository: spark
Updated Branches:
refs/heads/branch-1.4 bcb2c5d16 - a17a0ee77
[SPARK-7503] [YARN] Resources in .sparkStaging directory can't be cleaned up on
error
When we run applications on YARN in cluster mode, uploaded resources in the
.sparkStaging directory can't be cleaned up in case
Repository: spark
Updated Branches:
refs/heads/master d41ae4344 - 1fd33815f
[SPARK-4556] [BUILD] binary distribution assembly can't run in local mode
Add note on building a runnable distribution with make-distribution.sh
Author: Sean Owen so...@cloudera.com
Closes #6186 from srowen/SPARK
Repository: spark
Updated Branches:
refs/heads/branch-1.4 7e3f9fea6 - 1fc35607d
[SPARK-4556] [BUILD] binary distribution assembly can't run in local mode
Add note on building a runnable distribution with make-distribution.sh
Author: Sean Owen so...@cloudera.com
Closes #6186 from srowen
Repository: spark
Updated Branches:
refs/heads/branch-1.4 1fc35607d - e7607e5cb
[SPARK-7672] [CORE] Use int conversion in translating kryoserializer.buffer.mb
to kryoserializer.buffer
In translating spark.kryoserializer.buffer.mb to spark.kryoserializer.buffer,
use of toDouble will lead to
Repository: spark
Updated Branches:
refs/heads/master 1fd33815f - 0ac8b01a0
[SPARK-7672] [CORE] Use int conversion in translating kryoserializer.buffer.mb
to kryoserializer.buffer
In translating spark.kryoserializer.buffer.mb to spark.kryoserializer.buffer,
use of toDouble will lead to
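The gist of the fix can be sketched in Java (hypothetical names; the real translation lives in Spark's Scala config code): converting the legacy whole-megabyte value with integer arithmetic keeps the derived size string parseable, whereas a double-based conversion would render something like `65536.0k`, which a size parser rejects.

```java
public class KryoBufferMb {
    // Translate a legacy "spark.kryoserializer.buffer.mb" value (whole
    // megabytes, e.g. "64") into the kilobyte form expected by
    // "spark.kryoserializer.buffer". Integer conversion yields "65536k";
    // going through a double would yield "65536.0k" instead.
    static String translateMbToKb(String mb) {
        int kb = Integer.parseInt(mb.trim()) * 1024;
        return kb + "k";
    }

    public static void main(String[] args) {
        System.out.println(translateMbToKb("64")); // 65536k
    }
}
```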
Repository: spark
Updated Branches:
refs/heads/branch-1.4 9da55b570 - 7e3f9fea6
[SPARK-7671] Fix wrong URLs in MLlib Data Types Documentation
There is a mistake in the URL of Matrices in the MLlib Data Types documentation
(Local matrix scala section), the URL points to
Repository: spark
Updated Branches:
refs/heads/branch-1.4 a1d896b85 - 0748263a2
Fixing a few basic typos in the Programming Guide.
Just a few minor fixes in the guide, so a new JIRA issue was not created per
the guidelines.
Author: Mike Dusenberry dusenberr...@gmail.com
Closes #6240 from
Repository: spark
Updated Branches:
refs/heads/master 6008ec14e - 61f164d3f
Fixing a few basic typos in the Programming Guide.
Just a few minor fixes in the guide, so a new JIRA issue was not created per
the guidelines.
Author: Mike Dusenberry dusenberr...@gmail.com
Closes #6240 from
Repository: spark
Updated Branches:
refs/heads/branch-1.4 31f5d53e9 - 6834d1af4
[SPARK-7723] Fix string interpolation in pipeline examples
https://issues.apache.org/jira/browse/SPARK-7723
Author: Saleem Ansari tux...@gmail.com
Closes #6258 from tuxdna/master and squashes the following
Repository: spark
Updated Branches:
refs/heads/master 27fa88b9b - df34793ad
[SPARK-7723] Fix string interpolation in pipeline examples
https://issues.apache.org/jira/browse/SPARK-7723
Author: Saleem Ansari tux...@gmail.com
Closes #6258 from tuxdna/master and squashes the following commits:
stage target again
in Maven; output results in target not project root; update to scalastyle 0.7.0
Author: Sean Owen so...@cloudera.com
Closes #5471 from srowen/SPARK-6861 and squashes the following commits:
acac637 [Sean Owen] Oops, add back execution but leave it at the default verify
phase
Repository: spark
Updated Branches:
refs/heads/master 585638e81 - d5f1b9650
[SPARK-2312] Logging Unhandled messages
The previous solution has changed based on
https://github.com/apache/spark/pull/2048 discussions.
Author: Isaias Barroso isaias.barr...@gmail.com
Closes #2055 from
this exception is thrown.
CC mateiz aarondav as those who may have last touched this code.
Author: Sean Owen so...@cloudera.com
Closes #5492 from srowen/SPARK-4783 and squashes the following commits:
60dc682 [Sean Owen] Avoid System.exit(1) in TaskSchedulerImpl and convert to
SparkException
Repository: spark
Updated Branches:
refs/heads/master 6179a9483 - de4fa6b6d
[SPARK-4194] [core] Make SparkContext initialization exception-safe.
SparkContext has a very long constructor, where multiple things are
initialized, multiple threads are spawned, and multiple opportunities
for
Repository: spark
Updated Branches:
refs/heads/master 57cd1e86d - 837055059
[Streaming][minor] Remove additional quote and unneeded imports
Author: jerryshao saisai.s...@intel.com
Closes #5540 from jerryshao/minor-fix and squashes the following commits:
ebaa646 [jerryshao] Minor fix
Repository: spark
Updated Branches:
refs/heads/master 4527761bc - f6a9a57a7
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
: Sean Owen so...@cloudera.com
Closes #5528 from srowen/SPARK-6846 and squashes the following commits:
137ac9f [Sean Owen] Oops, fix scalastyle line length problem
7c5f961 [Sean Owen] Add Imran's test of kill link
59f447d [Sean Owen] kill endpoints now only accept a POST (kill stage, master
kill
Repository: spark
Updated Branches:
refs/heads/branch-1.3 6d3c4d8b0 - 47fb78c62
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
Repository: spark
Updated Branches:
refs/heads/branch-1.2 9677b4435 - e1e7fc017
[SPARK-6952] Handle long args when detecting PID reuse
sbin/spark-daemon.sh used
ps -p $TARGET_PID -o args=
to figure out whether the process running with the expected PID is actually a
Spark
daemon. When
Repository: spark
Updated Branches:
refs/heads/master f6a9a57a7 - dc48ba9f9
[SPARK-6604][PySpark] Specify ip of python server socket
The driver now starts a server socket using a wildcard ip; using 127.0.0.1 is
more reasonable, as it is only used by the local Python process.
/cc davies
Author:
Repository: spark
Updated Branches:
refs/heads/master 68d1faa3c - 950645d59
[SPARK-6868][YARN] Fix broken container log link on executor page when
HTTPS_ONLY.
Correct the http scheme in the YARN container log link in the Spark UI when
YARN is configured to be HTTPS_ONLY.
Uses the
Repository: spark
Updated Branches:
refs/heads/master cadd7d72c - 14ce3ea2c
[SPARK-6860][Streaming][WebUI] Fix the possible inconsistency of StreamingPage
Because `StreamingPage.render` doesn't hold the `listener` lock when generating
the content, the different parts of content may have some
Repository: spark
Updated Branches:
refs/heads/branch-1.3 8d4176132 - 8e5caa227
[SPARK-6868][YARN] Fix broken container log link on executor page when
HTTPS_ONLY.
Correct the http scheme in the YARN container log link in the Spark UI when
YARN is configured to be HTTPS_ONLY.
Uses
Repository: spark
Updated Branches:
refs/heads/master 240ea03fa - 202ebf06e
[SPARK-6870][Yarn] Catch InterruptedException when the yarn application state
monitor thread has been interrupted
On PR #5305 we interrupt the monitor thread but forget to catch the
InterruptedException, then in the log
Repository: spark
Updated Branches:
refs/heads/master 14ce3ea2c - 9d117cee0
[SPARK-6440][CORE]Handle IPv6 addresses properly when constructing URI
Author: nyaapa nya...@gmail.com
Closes #5424 from nyaapa/master and squashes the following commits:
6b717aa [nyaapa] [SPARK-6440][CORE] Remove
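A small Java illustration of why IPv6 literals need care when constructing URIs (a sketch, not Spark's actual code): naive concatenation like `"spark://" + host + ":" + port` produces the ambiguous authority `spark://::1:7077`, while `java.net.URI`'s multi-argument constructor adds the required square brackets around a literal IPv6 host.

```java
import java.net.URI;
import java.net.URISyntaxException;

public class Ipv6Uri {
    // Build a URI from host/port parts. The multi-argument constructor
    // brackets a literal IPv6 host, so "::1" becomes "[::1]" in the
    // resulting authority instead of an unparseable run of colons.
    static URI driverUri(String host, int port) throws URISyntaxException {
        return new URI("spark", null, host, port, "/", null, null);
    }

    public static void main(String[] args) throws URISyntaxException {
        System.out.println(driverUri("::1", 7077)); // spark://[::1]:7077/
    }
}
```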
Repository: spark
Updated Branches:
refs/heads/master 9d117cee0 - 240ea03fa
[SPARK-6671] Add status command for spark daemons
SPARK-6671
Currently, using the spark-daemon.sh script we can start and stop the spark
daemons. But we cannot get the status of the daemons. It will be nice to include
Repository: spark
Updated Branches:
refs/heads/master 95a07591b - 694aef0d7
[hotfix] [build] Make sure JAVA_HOME is set for tests.
This is needed at least for YARN integration tests, since `$JAVA_HOME` is used
to launch the executors.
Author: Marcelo Vanzin van...@cloudera.com
Closes #5441
Repository: spark
Updated Branches:
refs/heads/master e9445b187 - ddc17431a
[SPARK-6843][core]Add volatile for the state
Fix potential visibility problem for the state of Executor
The field of state is shared and modified by multiple threads. i.e:
```scala
Within ExecutorRunner.scala
(1)
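The visibility issue is the classic one; a minimal Java analogue (a hypothetical class, since the real field lives in Scala's ExecutorRunner): without `volatile`, a reader thread may never observe another thread's write to the shared field.

```java
public class WorkerState {
    // The state field is read by one thread and written by others.
    // Declaring it volatile guarantees that a write by one thread is
    // visible to subsequent reads in other threads.
    private volatile String state = "LAUNCHING";

    void markRunning() { state = "RUNNING"; }                 // writer thread
    boolean isRunning() { return "RUNNING".equals(state); }   // reader thread

    public static void main(String[] args) {
        WorkerState s = new WorkerState();
        s.markRunning();
        System.out.println(s.isRunning()); // true
    }
}
```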
Repository: spark
Updated Branches:
refs/heads/master 6ac8eea2f - 04bcd67cf
[MINOR] a typo: coalesce
Author: Daoyuan Wang daoyuan.w...@intel.com
Closes #5482 from adrian-wang/typo and squashes the following commits:
e65ef6f [Daoyuan Wang] typo
Project:
Repository: spark
Updated Branches:
refs/heads/master 6be918942 - 29aabdd6c
[HOTFIX] [SPARK-6896] [SQL] fix compile error in hive-thriftserver
SPARK-6440 #5424 import guava but did not promote guava dependency to compile
level.
[INFO] compiler plugin:
Repository: spark
Updated Branches:
refs/heads/branch-1.3 0e5ca9e09 - 2954a1e21
[SPARK-6860][Streaming][WebUI] Fix the possible inconsistency of StreamingPage
Because `StreamingPage.render` doesn't hold the `listener` lock when generating
the content, the different parts of content may have
Repository: spark
Updated Branches:
refs/heads/branch-1.3 2954a1e21 - ec0e817ee
SPARK-4924 addendum. Minor assembly directory fix in load-spark-env-sh
Set the current dir path in $FWDIR and the same in $ASSEMBLY_DIR1 and
$ASSEMBLY_DIR2; otherwise $SPARK_HOME is not visible from spark-env.sh -- no
Repository: spark
Updated Branches:
refs/heads/branch-1.2 5845a6236 - 964f54478
SPARK-4924 addendum. Minor assembly directory fix in load-spark-env-sh
Set the current dir path in $FWDIR and the same in $ASSEMBLY_DIR1 and
$ASSEMBLY_DIR2; otherwise $SPARK_HOME is not visible from spark-env.sh -- no
Repository: spark
Updated Branches:
refs/heads/master 0b5d028a9 - 49f38824a
[SPARK-6673] spark-shell.cmd can't start in Windows even when spark was built
added equivalent script to load-spark-env.sh
Author: Masayoshi TSUZUKI tsudu...@oss.nttdata.co.jp
Closes #5328 from
srowen/SPARK-6569 and squashes the following commits:
8a5b992 [Sean Owen] Reduce is the same as ending offset message to INFO level
per JIRA discussion
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/9fe41252
Tree: http://git
Repository: spark
Updated Branches:
refs/heads/master 2f482d706 - 86403f552
[SPARK-5242]: Add --private-ips flag to EC2 script
The `spark_ec2.py` script currently references the `ip_address` and
`public_dns_name` attributes of an instance. On private networks, these fields
aren't set, so we
Repository: spark
Updated Branches:
refs/heads/branch-1.3 4453c591a - ec3e76f1e
[SPARK-6343] Doc driver-worker network reqs
Attempt at making the driver-worker networking requirement more explicit and
up-front in the documentation (see
https://issues.apache.org/jira/browse/SPARK-6343).
Repository: spark
Updated Branches:
refs/heads/master 53f6bb1df - 470d7453a
[minor] [examples] Avoid packaging duplicate classes.
Add exclusions and explicit dependencies so that the examples
assembly does not duplicate classes already packaged in the main
assembly.
Also avoid relocating the
Repository: spark
Updated Branches:
refs/heads/master 2fe0a1aae - b9c51c049
[SPARK-6343] Doc driver-worker network reqs
Attempt at making the driver-worker networking requirement more explicit and
up-front in the documentation (see
https://issues.apache.org/jira/browse/SPARK-6343).
Update
Repository: spark
Updated Branches:
refs/heads/branch-1.4 2cce6bfea - 8567d29ef
[SPARK-7704] Updating Programming Guides per SPARK-4397
The change per SPARK-4397 makes implicit objects in SparkContext be found by
the compiler automatically, so we don't need to import the
Repository: spark
Updated Branches:
refs/heads/master e50546059 - 2777ed394
[DOC][Minor]Specify the common sources available for collecting
I was wondering what other common sources were available until I searched the
source code. It may be better to make this clear.
Author: Yijie Shen
Repository: spark
Updated Branches:
refs/heads/master 3a5c4da47 - da20c8ca3
[MINOR] [BUILD] Change link to jenkins builds on github.
Link to the tail of the console log, instead of the full log. That's
bound to have the info the user is looking for, and at the same time
loads way more quickly
Repository: spark
Updated Branches:
refs/heads/branch-1.4 90cf68638 - 9b3e4c187
[MINOR] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change
Repository: spark
Updated Branches:
refs/heads/master b16b5434f - 019dc9f55
[STREAMING] Update streaming-kafka-integration.md
Fixed the broken links (Examples) in the documentation.
Author: Akhil Das ak...@darktech.ca
Closes # from akhld/patch-2 and squashes the following commits:
Repository: spark
Updated Branches:
refs/heads/branch-1.4 9b3e4c187 - 0ef2e9d35
[STREAMING] Update streaming-kafka-integration.md
Fixed the broken links (Examples) in the documentation.
Author: Akhil Das ak...@darktech.ca
Closes # from akhld/patch-2 and squashes the following commits:
Repository: spark
Updated Branches:
refs/heads/branch-1.3 5b96b6933 - 5185ea9b4
[MINOR] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change
Repository: spark
Updated Branches:
refs/heads/master da20c8ca3 - b16b5434f
[MINOR] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change overrides
Repository: spark
Updated Branches:
refs/heads/master 019dc9f55 - 700312e12
[SPARK-6324] [CORE] Centralize handling of script usage messages.
Reorganize code so that the launcher library handles most of the work
of printing usage messages, instead of having an awkward protocol between
the
Repository: spark
Updated Branches:
refs/heads/branch-1.4 69197c3e3 - 99c2a5734
[SPARK-8126] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change
Repository: spark
Updated Branches:
refs/heads/master e84815dc3 - b127ff8a0
[SPARK-2808] [STREAMING] [KAFKA] cleanup tests from
see if requiring producer acks eliminates the need for waitUntilLeaderOffset
calls in tests
Author: cody koeninger c...@koeninger.org
Closes #5921 from
from srowen/SPARK-7733 and squashes the following commits:
59bda4e [Sean Owen] Update build to use Java 7, and remove some comments and
special-case support for Java 6
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/e84815dc
Repository: spark
Updated Branches:
refs/heads/master 8c321d66d - ca8dafcc9
[SPARK-7042] [BUILD] use the standard akka artifacts with hadoop-2.x
Both akka 2.3.x and hadoop-2.x use protobuf 2.5 so only hadoop-1 build needs
custom 2.3.4-spark akka version that shades protobuf-2.5
This change
Repository: spark
Updated Branches:
refs/heads/master ca8dafcc9 - 835f1380d
[DOC] [TYPO] Fix typo in standalone deploy scripts description
Author: Yijie Shen henry.yijies...@gmail.com
Closes #6691 from yijieshen/patch-2 and squashes the following commits:
b40a4b0 [Yijie Shen] [DOC][TYPO]
Repository: spark
Updated Branches:
refs/heads/master 7658eb28a - 0902a1194
[SPARK-8101] [CORE] Upgrade netty to avoid memory leak accord to netty #3837
issues
Update to Netty 4.0.28-Final
Author: Sean Owen so...@cloudera.com
Closes #6701 from srowen/SPARK-8101 and squashes the following
Repository: spark
Updated Branches:
refs/heads/branch-1.3 582f437e9 - 9480aa31e
[SPARK-8126] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change
Repository: spark
Updated Branches:
refs/heads/master a71be0a36 - a8077e5cf
[SPARK-6973] remove skipped stage ID from completed set on the allJobsPage
Though totalStages = allStages - skippedStages is understandable, considering
the problem [SPARK-6973], I think totalStages = allStages is
Repository: spark
Updated Branches:
refs/heads/master 10fc2f6f5 - eacd4a929
[SPARK-7705] [YARN] Cleanup of .sparkStaging directory fails if application is
killed
As I have tested, if we cancel or kill the app then the final status may be
undefined, killed or succeeded, so clean up staging
Repository: spark
Updated Branches:
refs/heads/branch-1.4 58bfdd621 - a3afc2cba
[SPARK-7705] [YARN] Cleanup of .sparkStaging directory fails if application is
killed
As I have tested, if we cancel or kill the app then the final status may be
undefined, killed or succeeded, so clean up
Repository: spark
Updated Branches:
refs/heads/master 03ef6be9c - a1d9e5cc6
[SPARK-8126] [BUILD] Use custom temp directory during build.
Even with all the efforts to cleanup the temp directories created by
unit tests, Spark leaves a lot of garbage in /tmp after a test run.
This change
Repository: spark
Updated Branches:
refs/heads/master e3e9c7038 - 149d1b28e
[SMALL FIX] Return null if catch EOFException
Return null when an EOFException is caught, just like the function
asKeyValueIterator in this class
Author: Mingfei mingfei@intel.com
Closes #6703 from shimingfei/returnNull and
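The convention being matched is simple to sketch (a hypothetical Java mirror of the Scala deserialize-stream code, with illustrative names): swallow the EOFException and return null to signal end of stream, the same contract asKeyValueIterator follows.

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class ReadNext {
    // Treat EOF as "no more records" by returning null, rather than
    // letting the EOFException escape to the caller.
    static Integer readNextKey(DataInputStream in) throws IOException {
        try {
            return in.readInt();
        } catch (EOFException eof) {
            return null; // end of stream
        }
    }

    public static void main(String[] args) throws IOException {
        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(new byte[0]));
        System.out.println(readNextKey(in)); // null
    }
}
```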
Repository: spark
Updated Branches:
refs/heads/master a1d9e5cc6 - e3e9c7038
[SPARK-8140] [MLLIB] Remove empty model check in StreamingLinearAlgorithm
1. Prevent creating a map of the data to find numFeatures
2. If the model is empty, then initialize it with a zero vector of numFeatures
Author: MechCoder
Repository: spark
Updated Branches:
refs/heads/master d38cf217e - 28dbde387
[SPARK-7983] [MLLIB] Add require for one-based indices in loadLibSVMFile
jira: https://issues.apache.org/jira/browse/SPARK-7983
Customers frequently use zero-based indices in their LIBSVM files. No warnings
or
Author: srowen
Date: Wed Jun 3 17:14:40 2015
New Revision: 1683391
URL: http://svn.apache.org/r1683391
Log:
Fix two Java example typos
Modified:
spark/examples.md
spark/site/examples.html
Modified: spark/examples.md
URL:
http://svn.apache.org/viewvc/spark/examples.md?rev=1683391r1
Repository: spark
Updated Branches:
refs/heads/master 9982d453c - 10ba18808
Fix maxTaskFailures comment
If maxTaskFailures is 1, the task set is aborted after 1 task failure. Other
documentation and the code supports this reading, I think it's just this
comment that was off. It's easy to
Repository: spark
Updated Branches:
refs/heads/branch-1.4 84da65319 - daf9451a4
Fix maxTaskFailures comment
If maxTaskFailures is 1, the task set is aborted after 1 task failure. Other
documentation and the code supports this reading, I think it's just this
comment that was off. It's easy
Repository: spark
Updated Branches:
refs/heads/master e6fb6cedf - 6c1723abe
[SPARK-8140] [MLLIB] Remove construct to get weights in StreamingLinearAlgorithm
Author: MechCoder manojkumarsivaraj...@gmail.com
Closes #6720 from MechCoder/empty_model_check and squashes the following
commits:
Repository: spark
Updated Branches:
refs/heads/master 6c1723abe - 490d5a72e
[SPARK-8274] [DOCUMENTATION-MLLIB] Fix wrong URLs in MLlib Frequent Pattern
Mining Documentation
There is a mistake in the URLs of the Scala section of FP-Growth in the MLlib
Frequent Pattern Mining documentation.
Repository: spark
Updated Branches:
refs/heads/branch-1.4 0a9383dec - a7b7a194a
[SPARK-8274] [DOCUMENTATION-MLLIB] Fix wrong URLs in MLlib Frequent Pattern
Mining Documentation
There is a mistake in the URLs of the Scala section of FP-Growth in the MLlib
Frequent Pattern Mining
Repository: spark
Updated Branches:
refs/heads/master 54557f353 - 93360dc3c
[SPARK-7913] [CORE] Make AppendOnlyMap use the same growth strategy of
OpenHashSet and consistent exception message
This is a follow up PR for #6456 to make AppendOnlyMap consistent with
OpenHashSet.
/cc srowen
Repository: spark
Updated Branches:
refs/heads/master ebd363aec - 47af7c1eb
[SPARK-8389] [STREAMING] [KAFKA] Example of getting offset ranges out o…
…f the existing java direct stream api
Author: cody koeninger c...@koeninger.org
Closes #6846 from koeninger/SPARK-8389 and squashes the
Repository: spark
Updated Branches:
refs/heads/master 93360dc3c - ebd363aec
[SPARK-7265] Improving documentation for Spark SQL Hive support
Please review this pull request.
Author: Jihong MA linlin200...@gmail.com
Closes #5933 from JihongMA/SPARK-7265 and squashes the following commits:
Repository: spark
Updated Branches:
refs/heads/master fdf63f124 - 54557f353
[SPARK-8387] [FOLLOWUP] [WEBUI] Update driver log URL to show only 4096 bytes
This is to follow up #6834 , update the driver log URL as well for consistency.
Author: Carson Wang carson.w...@intel.com
Closes #6878
Repository: spark
Updated Branches:
refs/heads/master b5a6663da - d48e78934
[SPARK-3629] [YARN] [DOCS]: Improvement of the Running Spark on YARN document
As per the description in the JIRA, I moved the contents of the page and added
some additional content.
Author: Neelesh Srinivas Salian
Repository: spark
Updated Branches:
refs/heads/master 9d1181776 - b5a6663da
[SPARK-8639] [DOCS] Fixed Minor Typos in Documentation
Ticket: [SPARK-8639](https://issues.apache.org/jira/browse/SPARK-8639)
fixed minor typos in docs/README.md and docs/api.md
Author: Rosstin astera...@gmail.com
Repository: spark
Updated Branches:
refs/heads/branch-1.4 2579948bf - 68907d272
[SPARK-8639] [DOCS] Fixed Minor Typos in Documentation
Ticket: [SPARK-8639](https://issues.apache.org/jira/browse/SPARK-8639)
fixed minor typos in docs/README.md and docs/api.md
Author: Rosstin
Repository: spark
Updated Branches:
refs/heads/branch-1.4 68907d272 - a2dbb4807
[SPARK-3629] [YARN] [DOCS]: Improvement of the Running Spark on YARN document
As per the description in the JIRA, I moved the contents of the page and added
some additional content.
Author: Neelesh Srinivas
Repository: spark
Updated Branches:
refs/heads/branch-1.4 2846a357f - 59fc3f197
[SPARK-8200] [MLLIB] Check for empty RDDs in StreamingLinearAlgorithm
Test cases for both StreamingLinearRegression and StreamingLogisticRegression,
and code fix.
Edit:
This contribution is my original work and
Repository: spark
Updated Branches:
refs/heads/master 96a7c888d - b928f5438
[SPARK-8200] [MLLIB] Check for empty RDDs in StreamingLinearAlgorithm
Test cases for both StreamingLinearRegression and StreamingLogisticRegression,
and code fix.
Edit:
This contribution is my original work and I
Repository: spark
Updated Branches:
refs/heads/master 4bd10fd50 - cebf24118
[SPARK-8126] [BUILD] Make sure temp dir exists when running tests.
If you ran clean at the top-level sbt project, the temp dir would
go away, so running test without restarting sbt would fail. This
fixes that by
Repository: spark
Updated Branches:
refs/heads/branch-1.4 4da068650 - b9e5d3cad
[SPARK-8126] [BUILD] Make sure temp dir exists when running tests.
If you ran clean at the top-level sbt project, the temp dir would
go away, so running test without restarting sbt would fail. This
fixes that by
Repository: spark
Updated Branches:
refs/heads/branch-1.3 9480aa31e - 5f1a8e74b
[SPARK-8126] [BUILD] Make sure temp dir exists when running tests.
If you ran clean at the top-level sbt project, the temp dir would
go away, so running test without restarting sbt would fail. This
fixes that by
Repository: spark
Updated Branches:
refs/heads/master 29c5025a7 - dc455b883
[SPARK-DOCS] [SPARK-SQL] Update sql-programming-guide.md
Typo in thriftserver section
Author: Moussa Taifi mouta...@gmail.com
Closes #6847 from moutai/patch-1 and squashes the following commits:
1bd29df [Moussa
Repository: spark
Updated Branches:
refs/heads/master 658814c89 - 29c5025a7
[SPARK-8387] [WEBUI] Only show 4096 bytes content for executor log instead of
show all
Author: hushan[胡珊] hus...@xiaomi.com
Closes #6834 from suyanNone/small-display and squashes the following commits:
744212f
Repository: spark
Updated Branches:
refs/heads/branch-1.4 f287f7ea1 - 1378bdc4a
[SPARK-DOCS] [SPARK-SQL] Update sql-programming-guide.md
Typo in thriftserver section
Author: Moussa Taifi mouta...@gmail.com
Closes #6847 from moutai/patch-1 and squashes the following commits:
1bd29df [Moussa
Repository: spark
Updated Branches:
refs/heads/master dc455b883 - 4bd10fd50
[SQL] [DOC] improved a comment
[SQL][DOC] I found it a bit confusing when I came across it for the first time
in the docs
Author: Radek Ostrowski dest.haw...@gmail.com
Author: radek radek@radeks-MacBook-Pro-2.local
Repository: spark
Updated Branches:
refs/heads/branch-1.4 1378bdc4a - 4da068650
[SQL] [DOC] improved a comment
[SQL][DOC] I found it a bit confusing when I came across it for the first time
in the docs
Author: Radek Ostrowski dest.haw...@gmail.com
Author: radek