[GitHub] spark pull request #19717: [SPARK-22646] [Submission] Spark on Kubernetes - ...

2017-12-08 Thread jason-dai
Github user jason-dai commented on a diff in the pull request: https://github.com/apache/spark/pull/19717#discussion_r155787931 --- Diff: resource-managers/kubernetes/docker/src/main/dockerfiles/spark-base/Dockerfile --- @@ -0,0 +1,47 @@ +# +# Licensed to the Apache

[GitHub] spark pull request #19717: [SPARK-22646] [Submission] Spark on Kubernetes - ...

2017-12-07 Thread jason-dai
Github user jason-dai commented on a diff in the pull request: https://github.com/apache/spark/pull/19717#discussion_r155710219 --- Diff: resource-managers/kubernetes/docker/src/main/dockerfiles/spark-base/Dockerfile --- @@ -0,0 +1,47 @@ +# +# Licensed to the Apache

[GitHub] spark pull request: [SPARK-2365] Add IndexedRDD, an efficient upda...

2015-03-13 Thread jason-dai
Github user jason-dai commented on the pull request: https://github.com/apache/spark/pull/1297#issuecomment-79776523 @jegonzal I wonder if you can share more details on your stack overflow issue. We were considering a general fix (e.g., as I outlined in https://issues.apache.org/jira
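The stack-overflow issue referenced above is the classic long-lineage problem in iterative Spark jobs: each iteration adds a step to the RDD's dependency chain, and traversing a deep enough chain can throw a StackOverflowError. A minimal hedged sketch (assuming a standard SparkContext `sc` with a checkpoint directory configured; the iteration counts are illustrative, not from the thread):

```scala
// Sketch only: each map() call extends the lineage by one step; a
// periodic checkpoint() truncates the chain so the DAG stays shallow.
sc.setCheckpointDir("/tmp/spark-checkpoints")

var rdd = sc.parallelize(1 to 1000)
for (i <- 1 to 10000) {
  rdd = rdd.map(_ + 1)        // lineage grows by one step per iteration
  if (i % 100 == 0) {
    rdd.checkpoint()          // cut the lineage at this point
    rdd.count()               // an action is needed to materialize it
  }
}
```

Without the periodic checkpoint, the 10,000-step lineage is the kind of structure the "general fix" in the linked JIRA discussion would address.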

[GitHub] spark pull request: [SPARK-4672][Core]Checkpoint() should clear f ...

2014-12-02 Thread jason-dai
Github user jason-dai commented on the pull request: https://github.com/apache/spark/pull/3545#issuecomment-65360084 I believe ClosureCleaner.clean() is defined to deal with exactly this issue: Scala may capture the entire enclosing class in a closure, even if only one member variable is used
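The capture behavior described above can be illustrated with a small sketch (class and method names here are hypothetical, not from Spark):

```scala
class WordFilter(val keyword: String) extends Serializable {
  // Referencing the field `keyword` compiles to `this.keyword`, so the
  // closure captures the entire WordFilter instance, which must then be
  // serialized with the task -- the problem ClosureCleaner.clean() targets.
  def badFilter(words: Seq[String]): Seq[String] =
    words.filter(w => w.contains(keyword))

  // Copying the field into a local val first means the closure captures
  // only that one String, not the enclosing instance.
  def goodFilter(words: Seq[String]): Seq[String] = {
    val k = keyword
    words.filter(w => w.contains(k))
  }
}
```

ClosureCleaner automates the equivalent of the second form by nulling out unused fields of the captured outer objects before serialization.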

[GitHub] spark pull request: [SPARK-4672][Core]Checkpoint() should clear f ...

2014-12-02 Thread jason-dai
Github user jason-dai commented on the pull request: https://github.com/apache/spark/pull/3545#issuecomment-65345271 Maybe we can try something like: class ZippedPartitionsRDD2 (sc, f, …) { val cleanF(part1, part2, ctx) = sc.clean(f(rdd1.iterator(part1, ctx
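The truncated suggestion above appears to be about cleaning the user function `f` eagerly inside the RDD's constructor, so the serialized task closure does not drag along the caller's enclosing objects. A hedged, simplified sketch of that idea (the real ZippedPartitionsRDD2 in org.apache.spark.rdd carries more state and a different constructor signature):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// Sketch only: clean `f` once, at construction time, rather than
// shipping the raw closure with every task.
class CleanedZippedRDD2[A, B, V](
    sc: SparkContext,
    f: (Iterator[A], Iterator[B]) => Iterator[V],
    rdd1: RDD[A],
    rdd2: RDD[B]) {

  // sc.clean() runs ClosureCleaner over `f`, nulling references to
  // enclosing objects that the function body never uses.
  private val cleanF: (Iterator[A], Iterator[B]) => Iterator[V] = sc.clean(f)
}
```

This mirrors what SparkContext already does for closures passed to operations like map(); the proposal extends the same treatment to the zip-partitions path.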

[GitHub] spark pull request: [SPARK-4672][GraphX]Perform checkpoint() on Pa...

2014-12-02 Thread jason-dai
Github user jason-dai commented on the pull request: https://github.com/apache/spark/pull/3549#issuecomment-65343799 Maybe we can try something like: class ZippedPartitionsRDD2 (sc, f, …) { val cleanF(part1, part2, ctx) = sc.clean(f(rdd1.iterator(part1, ctx