Github user liyichao closed the pull request at:
https://github.com/apache/spark/pull/18093
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18070
Oh, I did not notice that. Since @nlyu is following up, I will close this PR now.
Github user liyichao closed the pull request at:
https://github.com/apache/spark/pull/18070
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18070
I will update the PR in a day.
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18093
Sorry about that; I will test it when I have time.
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18092#discussion_r122621671
--- Diff: core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala ---
@@ -1281,6 +1286,61 @@ class BlockManagerSuite extends SparkFunSuite
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18092#discussion_r122620600
--- Diff: core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala ---
@@ -1281,6 +1286,59 @@ class BlockManagerSuite extends SparkFunSuite
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18092#discussion_r122620196
--- Diff: core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala ---
@@ -1281,6 +1286,59 @@ class BlockManagerSuite extends SparkFunSuite
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18092
ping @jiangxb1987
Github user liyichao closed the pull request at:
https://github.com/apache/spark/pull/18144
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18144
Since the idea turned out not to be that good, I am closing this PR.
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18084#discussion_r121902338
--- Diff: core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala ---
@@ -588,6 +633,70 @@ class MasterSuite extends SparkFunSuite
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18084
OK, another scenario:
* driver with driverId1 started on worker1
* worker1 lost
* master adds driverId1 to waitingDrivers
* worker1 reconnects and sends DriverStateChanged (see the sketch below)
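A toy model of that race, with hypothetical simplified types standing in for Spark's Master internals (not the real code): because drivers are keyed only by driverId, a stale DriverStateChanged from the reconnected worker1 clobbers the relaunched driver's state.

```scala
object DriverRelaunchRace {
  sealed trait DriverState
  case object RUNNING extends DriverState
  case object RELAUNCHING extends DriverState
  case object ERROR extends DriverState

  private val drivers = scala.collection.mutable.Map[String, DriverState]()

  // The handler never checks which worker instance sent the message.
  def driverStateChanged(driverId: String, state: DriverState): Unit =
    drivers(driverId) = state

  def main(args: Array[String]): Unit = {
    drivers("driverId1") = RUNNING          // driver started on worker1
    drivers("driverId1") = RELAUNCHING      // worker1 lost; master relaunches
    driverStateChanged("driverId1", ERROR)  // stale message from worker1
    println(drivers("driverId1"))           // prints ERROR: relaunch clobbered
  }
}
```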
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18084
Hi, adding a workerId may not work. For example, consider this scenario:
* driver with driverId1 started on worker1
* worker1 lost
* master adds driverId1 to waitingDrivers
* worker1
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18092
@JoshRosen Could you please look at the failed test? It seems unrelated to this
PR.
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18092#discussion_r121281612
--- Diff: core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala ---
@@ -1281,6 +1285,57 @@ class BlockManagerSuite extends SparkFunSuite
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18070
How about letting TaskCommitDenied and TaskKilled extend the same trait (for
example, TaskKilledReason)? That way, when accounting metrics, TaskCommitDenied
and TaskKilled both contribute
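A minimal sketch of the suggested refactoring, with simplified, hypothetical signatures (the real Spark classes differ): both end reasons extend one trait so metrics code can match on the trait instead of on each case.

```scala
sealed trait TaskKilledReason {
  def toErrorString: String
}

case class TaskCommitDenied(jobId: Int, partitionId: Int, attemptNumber: Int)
    extends TaskKilledReason {
  def toErrorString: String =
    s"TaskCommitDenied (job=$jobId, partition=$partitionId, attempt=$attemptNumber)"
}

case class TaskKilled(reason: String) extends TaskKilledReason {
  def toErrorString: String = s"TaskKilled ($reason)"
}

// Metrics accounting can then treat both cases uniformly:
def countsAsKilled(endReason: Any): Boolean = endReason match {
  case _: TaskKilledReason => true
  case _                   => false
}
```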
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18084
ping @jiangxb1987
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18070
ping @tgravescs
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18092
ping @JoshRosen
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18129#discussion_r119309080
--- Diff: resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala ---
@@ -116,15 +116,16 @@ class ClientSuite extends
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18144
[SPARK-20912][SQL] Allow column names in map functions.
## What changes were proposed in this pull request?
The `map` function accepts Column values only. It'd be very helpful to
have
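For context, a sketch of the current usage and the proposed convenience, assuming a SparkSession in scope as `spark`; the overload taking names is hypothetical, as proposed by the PR.

```scala
import org.apache.spark.sql.functions.{col, map}

val df = spark.range(3).withColumn("key", col("id").cast("string"))

// Current API: Column arguments only.
df.select(map(col("key"), col("id")).as("m")).show()

// The convenience this PR proposes (hypothetical overload):
// df.select(map("key", "id").as("m")).show()
```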
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18129
Remove local scheme when adding a path to the ClassPath.
## What changes were proposed in this pull request?
In Spark on YARN, when configuring "spark.yarn.jars" with local jars (jar
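A sketch of the underlying idea, not the patch itself: strip the `local:` scheme before the path goes on the classpath, since classpath entries must be plain filesystem paths.

```scala
import java.net.URI

def stripLocalScheme(path: String): String = {
  val uri = new URI(path)
  if (uri.getScheme == "local") uri.getPath else path
}

// stripLocalScheme("local:/opt/jars/app.jar") => "/opt/jars/app.jar"
// stripLocalScheme("hdfs:///jars/app.jar")    => unchanged
```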
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18084
Hi, I've thought more thoroughly about this.
The main state involved here is Master.workers, Master.idToWorker, and
WorkerInfo.drivers. Say `driverId1` runs on Worker A. Assume
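A simplified model of the state involved: the field names follow Spark's Master and WorkerInfo, but the types here are illustrative stand-ins, not the real classes.

```scala
import scala.collection.mutable

case class DriverInfo(id: String)

class WorkerInfo(val id: String) {
  val drivers = mutable.HashMap[String, DriverInfo]() // driverId -> info
}

class MasterState {
  val workers = mutable.HashSet[WorkerInfo]()
  val idToWorker = mutable.HashMap[String, WorkerInfo]() // workerId -> info

  // Launching a driver touches all three structures, which is why a
  // relaunch must keep them mutually consistent.
  def launchDriver(worker: WorkerInfo, driver: DriverInfo): Unit = {
    workers += worker
    idToWorker(worker.id) = worker
    worker.drivers(driver.id) = driver
  }
}
```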
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18084
Thanks for the reply. I have added some more tests to verify the state of the
master and worker after relaunching.
I will try to think about whether there are ways to reuse the old driver struct
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18092#discussion_r118671187
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
@@ -170,11 +170,17 @@ private[spark] class BlockManager(
// service
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/18092
Sorry, I thought it was not necessary to duplicate the message in JIRA; thanks
for the suggestion.
The PR is updated. As to the test plan, the modification seems straightforward,
and I cannot think
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18084#discussion_r118436939
--- Diff: core/src/test/scala/org/apache/spark/deploy/master/MasterSuite.scala ---
@@ -499,4 +500,103 @@ class MasterSuite extends SparkFunSuite
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18084#discussion_r118424700
--- Diff: core/src/main/scala/org/apache/spark/deploy/master/Master.scala ---
@@ -796,9 +796,12 @@ private[deploy] class Master
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18093
[SPARK-20774][SQL] Cancel all jobs when QueryExecution throws.
see https://issues.apache.org/jira/browse/SPARK-20774?filter=12340455
## What changes were proposed in this pull request?
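A sketch of the cancellation pattern, assuming a SparkContext in scope as `sc`: setJobGroup and cancelJobGroup are real SparkContext APIs, but the wrapper around them is illustrative, not the PR's actual change.

```scala
val groupId = "query-" + java.util.UUID.randomUUID()
sc.setJobGroup(groupId, "query execution", interruptOnCancel = true)
try {
  sc.parallelize(1 to 100).map(_ * 2).count() // stands in for the query's jobs
} catch {
  case e: Throwable =>
    sc.cancelJobGroup(groupId) // cancel anything the failed query left running
    throw e
}
```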
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18092
Make rpc timeout and retry for shuffle registration configurable.
## What changes were proposed in this pull request?
As the title says.
## How was this patch tested?
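An illustrative retry loop showing what "configurable" means here; the config keys and defaults below are hypothetical examples, not necessarily the names the PR finally used.

```scala
import org.apache.spark.SparkConf

val conf = new SparkConf()
val timeoutMs   = conf.getTimeAsMs("spark.shuffle.registration.timeout", "5000ms")
val maxAttempts = conf.getInt("spark.shuffle.registration.maxAttempts", 3)

def registerWithRetry(register: Long => Unit): Unit = {
  var attempt = 0
  var done = false
  while (!done) {
    attempt += 1
    try {
      register(timeoutMs) // attempt registration with the configured timeout
      done = true
    } catch {
      case e: Exception if attempt >= maxAttempts => throw e
      case _: Exception => // retry until attempts are exhausted
    }
  }
}
```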
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18084
[SPARK-19900][core] Remove driver when relaunching.
This is https://github.com/apache/spark/pull/17888 .
cc @cloud-fan @jiangxb1987
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/17888
Sorry, it seems I made a mistake when rebasing. I will open another PR.
Github user liyichao closed the pull request at:
https://github.com/apache/spark/pull/17888
Github user liyichao commented on the issue:
https://github.com/apache/spark/pull/17888
Thanks for reviewing. Basically, the problem is that when relaunching a
driver, if the original driver later reconnects, there will be an application
that does not have a corresponding driver. I
Github user liyichao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18070#discussion_r118050668
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -338,6 +340,9 @@ private[spark] class Executor
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/18070
Convert CommitDenied to TaskKilled.
## What changes were proposed in this pull request?
In the executor, `CommitDeniedException` is converted to `TaskKilledException`
to avoid
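A minimal, self-contained sketch of that conversion: the real CommitDeniedException and TaskKilledException live in Spark's executor package, and the stand-in classes here just illustrate the idea.

```scala
class CommitDeniedException(msg: String) extends Exception(msg)
class TaskKilledException(val reason: String) extends RuntimeException(reason)

def runTask(body: => Unit): Unit =
  try body
  catch {
    // Report a denied commit as a kill rather than a failure, so it does
    // not count toward the task's failure limit.
    case _: CommitDeniedException => throw new TaskKilledException("commit denied")
  }
```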
GitHub user liyichao opened a pull request:
https://github.com/apache/spark/pull/17888
[SPARK-19900][core] Remove driver when relaunching.
## What changes were proposed in this pull request?
* remove failed apps when the worker goes down
* do not reuse the driver ID when relaunching