Github user jisookim0513 closed the pull request at:
https://github.com/apache/spark/pull/16714
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/16714
OK, not including the updated blocks in task metrics reduced the size of
our event logs. But I am closing this PR as the current implementation doesn't
seem to take the right approach. Thanks
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/16714
@vanzin @ajbozarth if you guys think having an option to skip logging
internal accumulators (in my case I don't use the SQL UI) and completely
getting rid of updated block statuses
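The idea of dropping internal accumulators from the event log, as discussed above, can be sketched as a post-processing filter over logged events (a standalone illustration, not code from this PR; it assumes the `internal.metrics.` name prefix that Spark 2.x uses for its internal task-metric accumulators):

```python
import json

def strip_internal_accumulables(event_json: str) -> str:
    """Drop internal task-metric accumulators from a logged Spark event.

    Spark 2.x names internal accumulators with the prefix
    "internal.metrics."; removing them from the "Accumulables" list
    shrinks each SparkListenerTaskEnd entry in the event log.
    """
    event = json.loads(event_json)
    task_info = event.get("Task Info", {})
    if "Accumulables" in task_info:
        task_info["Accumulables"] = [
            acc for acc in task_info["Accumulables"]
            if not acc.get("Name", "").startswith("internal.metrics.")
        ]
    return json.dumps(event)
```

User-defined accumulators (anything without the internal prefix) pass through untouched, so application-level metrics survive the filter.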
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r113776549
--- Diff: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala ---
@@ -343,10 +376,14 @@ private[spark] object JsonProtocol
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/16714
I would still prefer not to have internal accumulators, or updated block
statuses, in the event logs. @vanzin would you be OK with eliminating all
internal accumulators and having an option
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r103569094
--- Diff: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala ---
@@ -62,18 +62,21 @@ private[spark] object JsonProtocol {
* JSON
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r103565658
--- Diff: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala ---
@@ -97,61 +100,80 @@ private[spark] object JsonProtocol {
case
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r103564868
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -64,6 +64,12 @@ private[spark] class EventLoggingListener
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/12436
@sitalkedia have you had a chance to work on this issue and open a new PR?
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r101442539
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -64,6 +64,12 @@ private[spark] class EventLoggingListener
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/16714#discussion_r101438216
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -64,6 +64,12 @@ private[spark] class EventLoggingListener
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/16714
Not sure why the second test build failed at PySpark unit tests. I only
changed the comments.
GitHub user jisookim0513 opened a pull request:
https://github.com/apache/spark/pull/16714
[SPARK-16333][Core] Enable EventLoggingListener to log less
## What changes were proposed in this pull request?
Starting from Spark 2.0, task metrics are in the form of an accumulator
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin thanks a lot!
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin thanks, I was about to ask for a retest :)
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r80184799
--- Diff: core/src/test/scala/org/apache/spark/util/JsonProtocolSuite.scala
---
@@ -1097,7 +1100,9 @@ private[spark] object JsonProtocolSuite extends
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r80184744
--- Diff:
core/src/test/resources/HistoryServerExpectations/complete_stage_list_json_expectation.json
---
@@ -6,6 +6,7 @@
"numComplete
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r80156532
--- Diff: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala ---
@@ -759,7 +761,15 @@ private[spark] object JsonProtocol {
return
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r80156386
--- Diff:
core/src/test/resources/HistoryServerExpectations/complete_stage_list_json_expectation.json
---
@@ -6,6 +6,7 @@
"numComplete
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r80155278
--- Diff: core/src/test/scala/org/apache/spark/util/JsonProtocolSuite.scala
---
@@ -1097,7 +1100,9 @@ private[spark] object JsonProtocolSuite extends
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin could you merge this? Thanks!
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin this PR had passed all tests. Could you merge it if I fix the
recently introduced conflicts?
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin I updated the patch
Github user jisookim0513 commented on the issue:
https://github.com/apache/spark/pull/10212
@vanzin sure will do
Github user jisookim0513 commented on a diff in the pull request:
https://github.com/apache/spark/pull/10212#discussion_r54951079
--- Diff: core/src/main/scala/org/apache/spark/util/JsonProtocol.scala ---
@@ -718,6 +719,7 @@ private[spark] object JsonProtocol
GitHub user jisookim0513 opened a pull request:
https://github.com/apache/spark/pull/10212
add cpu time to metrics
Currently task metrics don't support executor CPU time, so there's no way
to calculate how much CPU time a stage/task took from History Server metrics.
This PR
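The gap this PR addresses can be illustrated with a small aggregation over per-task metrics (a standalone sketch, not code from the PR; the `executorCpuTime` field name follows the metric the PR proposes, and the nanosecond unit is an assumption based on how Spark reports CPU time):

```python
def stage_cpu_time_seconds(task_metrics):
    """Sum executor CPU time across a stage's tasks.

    Each entry in task_metrics is a dict of one task's metrics; the
    "executorCpuTime" value is assumed to be in nanoseconds, so the
    total is converted to seconds for readability.
    """
    return sum(t.get("executorCpuTime", 0) for t in task_metrics) / 1e9
```

With the metric present in the History Server output, per-stage CPU time becomes a simple sum; without it, only wall-clock run time is recoverable.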