Github user kbzod commented on the pull request:
https://github.com/apache/spark/pull/1375#issuecomment-55500191
I can try; a lot of refactoring has been done since my initial PR.
On Fri, Sep 5, 2014 at 8:52 PM, andrewor14 notificati...@github.com wrote:
@kbzod
Github user kbzod commented on the pull request:
https://github.com/apache/spark/pull/1465#issuecomment-53260871
@JoshRosen You are right, the `local[N, maxFailures]` mechanism already
works, but the filer of
[SPARK-2083](https://issues.apache.org/jira/browse/SPARK-2083) stated
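For context, the `local[N, maxFailures]` master string referred to above encodes both the thread count and the task-failure limit. A standalone sketch (not Spark's actual parsing code) of how such a string can be split into its two components:

```scala
// Sketch only: a regex in the spirit of SparkContext's master-URL
// parsing, extracting (threads, maxFailures) from "local[N, M]".
val LocalWithRetries = """local\[([0-9]+)\s*,\s*([0-9]+)\]""".r

def parseLocalMaster(master: String): Option[(Int, Int)] = master match {
  case LocalWithRetries(threads, maxFailures) =>
    Some((threads.toInt, maxFailures.toInt))
  case _ => None
}

// "local[4, 2]": 4 worker threads, tasks may fail up to 2 times.
println(parseLocalMaster("local[4, 2]"))
println(parseLocalMaster("local[4]"))
```

The point of the PR discussion is that this master-string form already exists; the open question was whether a configuration property should control the same limit.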
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245692
--- Diff: docs/configuration.md ---
@@ -599,6 +599,15 @@ Apart from these, the following properties are also available, and may be useful
td
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245710
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1477,7 +1478,8 @@ object SparkContext extends Logging {
def localCpuCount
Github user kbzod commented on a diff in the pull request:
https://github.com/apache/spark/pull/1465#discussion_r15245702
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -1463,12 +1463,13 @@ object SparkContext extends Logging {
// Regular
GitHub user kbzod opened a pull request:
https://github.com/apache/spark/pull/1465
SPARK-2083 Add support for spark.local.maxFailures configuration property
The logic in `SparkContext` for creating a new task scheduler now looks for
a `spark.local.maxFailures` property to specify
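A minimal sketch of the kind of property lookup the PR describes, using a plain `Map` in place of Spark's configuration object; the property name comes from the PR title, while the default of 1 allowed attempt is an assumption for illustration:

```scala
// Hypothetical sketch, not the PR's actual code: read the
// spark.local.maxFailures property, falling back to a default of 1.
def localMaxFailures(conf: Map[String, String]): Int =
  conf.get("spark.local.maxFailures").map(_.toInt).getOrElse(1)

println(localMaxFailures(Map("spark.local.maxFailures" -> "4")))
println(localMaxFailures(Map.empty))
```

Making the limit a configuration property would let local-mode users set it without embedding it in the master string.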
GitHub user kbzod opened a pull request:
https://github.com/apache/spark/pull/1375
[WIP] SPARK-2450: Add YARN executor log links to UI executors page
This adds a new column called "Logs" to the Executors page of the Spark UI,
visible only when running under YARN. After getting