Github user zhuoliu commented on the issue:
https://github.com/apache/spark/pull/18351
@fjh100456 this change looks good to me. Could you please add a screenshot
for your test?
Regarding your question of "the application of an exception abort (such as
the applic
Github user zhuoliu commented on the issue:
https://github.com/apache/spark/pull/13670
Generally looks great to me, I am good with it as long as @tgravescs
verifies. One minor concern is that we may move the utility functions from
executorspage.js and historypage.js to a common
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/12075#issuecomment-217971974
sure, thanks @srowen! Closing.
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does
Github user zhuoliu closed the pull request at:
https://github.com/apache/spark/pull/12075
---
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/12075
[SPARK-13064] Make sure attemptId not none for Rest API
## What changes were proposed in this pull request?
For some application that does not set attemptId for their attempts, e.g.,
spark
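The attemptId handling described above can be sketched with a minimal example (hypothetical field and method names, not Spark's actual REST API code): modeling attemptId as an Option means applications that never set one yield None, and the endpoint must handle that case explicitly rather than assuming a value is present.

```scala
// Hypothetical sketch: attemptId may be absent for some applications,
// so it is modeled as an Option rather than a bare String.
case class AttemptInfo(attemptId: Option[String])

// Fall back to a placeholder instead of calling .get on a None,
// which would throw NoSuchElementException.
def attemptLabel(a: AttemptInfo): String =
  a.attemptId.getOrElse("<none>")
```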
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-201047774
I am +1 to your change. @tgravescs @rxin , could any of you verify this
patch?
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-199888492
Hi @paragpc , since this is related to UI, could you upload some
screenshots for your changes? Thanks.
---
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/11608
[SPARK-13775] History page sorted by completed time desc by default.
## What changes were proposed in this pull request?
Originally the page is sorted by AppID by default.
After tests
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11357#issuecomment-189516371
This basically provides a default view of sorting when the page is loaded.
Users can easily sort by start time or other columns by clicking the column name
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/11357
[SPARK-13481] Desc order of appID by default for history server page.
## What changes were proposed in this pull request?
Now by default, it shows as ascending order of appId. We might
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11060#issuecomment-188343621
@tgravescs Now Closing this one since we have another pull request up for
this issue in:
https://github.com/apache/spark/pull/11327
---
Github user zhuoliu closed the pull request at:
https://github.com/apache/spark/pull/11060
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11259#issuecomment-187487562
Jenkins, test this please
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11259#issuecomment-186360098
Hi @rxin , yes, I see the pull request template and now updated the
description.
@tgravescs , I addressed your comment. Could not find a JavaScript function I can
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/11259
[SPARK-13364] Sort appId as num rather than str in history page.
## What changes were proposed in this pull request?
History page now sorts the appID as a string, which can lead to unexpected
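The bug described above can be illustrated with a small sketch (hypothetical IDs, not Spark's actual sorting code): lexicographic ordering places "app-10" before "app-9", while sorting on the numeric suffix gives the order users expect.

```scala
// Hypothetical app IDs illustrating string vs. numeric ordering.
val ids = Seq("app-9", "app-10", "app-2")

// Lexicographic sort compares character by character: '1' < '2' < '9'.
val asStrings = ids.sorted
// -> List(app-10, app-2, app-9)

// Sorting on the numeric suffix restores the intended order.
val asNumbers = ids.sortBy(_.stripPrefix("app-").toInt)
// -> List(app-2, app-9, app-10)
```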
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/11060
[SPARK-11316] setupGroups in coalescedRDD causes super long delay.
In coalescedRDD, the setupGroups causes super long delay due to the O(n^2)
loop in the second while. That while is used to make
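A hedged, toy sketch of the quadratic pattern described above (not Spark's actual setupGroups code): running a linear membership scan inside the per-item loop costs O(n^2) overall, while a HashSet makes each membership test O(1), so the whole pass is O(n).

```scala
import scala.collection.mutable

// O(n^2): List.contains scans the whole buffer on every iteration.
def assignWithScan(items: Seq[Int]): Seq[Int] = {
  val placed = mutable.ListBuffer[Int]()
  items.foreach { i => if (!placed.contains(i)) placed += i }
  placed.toSeq
}

// O(n): HashSet.add is a constant-time membership check and insert.
def assignWithSet(items: Seq[Int]): Seq[Int] = {
  val seen = mutable.HashSet[Int]()
  val placed = mutable.ListBuffer[Int]()
  items.foreach { i => if (seen.add(i)) placed += i }
  placed.toSeq
}
```

Both produce the same result; only the asymptotic cost differs.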
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11060#issuecomment-179589375
@alig Hi Ali, would you mind having a look?
---
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/11029
[SPARK-13126] fix the right margin of history page.
The right margin of the history page is a little bit off. A simple fix for
that issue.
You can merge this pull request into a Git repository
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11029#issuecomment-178816316
![history-margin](https://cloud.githubusercontent.com/assets/11683054/12764141/b62f88a0-c9bd-11e5-9cbf-260b94b8ec3c.png)
Sure.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11038#issuecomment-178984171
Tested also for history page. Looks good to me.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/11029#issuecomment-178983626
Hi @ajbozarth , the screenshot you pasted is a little different from mine
since your date and app-Id are wrapped into two lines while there is still lots
of space
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/10648#issuecomment-176303757
JIRA file here: https://issues.apache.org/jira/browse/SPARK-13064
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/10648#issuecomment-175966503
Hi @tgravescs , finally fixed the paging stuff in RowsGrouping. :)
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-175121155
Sure.
A brief summary is that: (picked a few conclusive points from above):
"At the point we call System.exit here all user code is done a
Github user zhuoliu commented on a diff in the pull request:
https://github.com/apache/spark/pull/10648#discussion_r50931113
--- Diff:
core/src/main/resources/org/apache/spark/ui/static/jquery.cookies.2.2.0.min.js
---
@@ -0,0 +1,18 @@
+/**
+ * Copyright (c) 2005 - 2010
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173706813
Hi @vanzin , do we want to amend the commit message to something like this?
"Call system.exit explicitly to make sure non-daemon user threads
terminate.
Wi
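The commit message above can be illustrated with a minimal sketch (assumed thread setup, not Spark's executor code): a non-daemon user thread keeps the JVM alive after main returns, which is why a clean shutdown path must either join such threads or call System.exit explicitly.

```scala
// Minimal sketch: a non-daemon thread blocks JVM exit until it finishes.
// Spark's fix calls System.exit on clean shutdown so such threads cannot
// keep the executor process alive indefinitely.
val userThread = new Thread(() =>
  try Thread.sleep(60000)
  catch { case _: InterruptedException => () }
)
userThread.setDaemon(false)  // non-daemon: the JVM waits for it
userThread.start()

// Without System.exit, the process would linger until userThread ends;
// here we interrupt and join so the sketch itself finishes promptly.
userThread.interrupt()
userThread.join()
```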
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173732459
Thanks @vanzin , commit message updated.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173737073
Sorry for that. Just updated the first comment and changed the commit
message back to original.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173686628
Now reopen.
---
GitHub user zhuoliu reopened a pull request:
https://github.com/apache/spark/pull/9946
[SPARK-10911] Executors should System.exit on clean shutdown.
https://issues.apache.org/jira/browse/SPARK-10911
You can merge this pull request into a Git repository by running:
$ git pull
Github user zhuoliu closed the pull request at:
https://github.com/apache/spark/pull/9946
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173274389
Thanks, closed.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/9946#issuecomment-173268512
@vanzin @srowen @andrewor14 , I think your concerns make sense. Also,
adding timeout might have another issue that resources cannot be released
immediately even
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/10648#issuecomment-169811351
Hi @vanzin , I added two new fields (duration Long, lastUpdated Date) into
ApplicationAttemptInfo in api.scala.
The jenkins compile is complaining about mima binary
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/10648#issuecomment-169825389
Hi @vanzin , thanks for the info.
BTW, I have fixed the issue you mentioned about messed zebra striping. Now
the zebra striping is exactly as what we have
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/9946
[SPARK-10911] Executors should System.exit on clean shutdown.
https://issues.apache.org/jira/browse/SPARK-10911
You can merge this pull request into a Git repository by running:
$ git pull
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/8398#issuecomment-135452755
Sure. Docs updated.
---
Github user zhuoliu commented on the pull request:
https://github.com/apache/spark/pull/8398#issuecomment-135194775
Thanks @vanzin, comments addressed.
---
Github user zhuoliu commented on a diff in the pull request:
https://github.com/apache/spark/pull/8398#discussion_r37881755
--- Diff: core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala
---
@@ -125,6 +125,27 @@ class SecurityManagerSuite extends SparkFunSuite
GitHub user zhuoliu opened a pull request:
https://github.com/apache/spark/pull/8398
[SPARK-4223] Support * in acls.
SPARK-4223.
Currently we support setting view and modify acls but you have to specify a
list of users. It would be nice to support * meaning all