Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/11867
Thanks @squito :)
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/11867
@squito I have updated the PR based on your comments, and also added a documentation
change. Can you take a look?
---
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
Thanks @squito :)
---
Github user paragpc commented on a diff in the pull request:
https://github.com/apache/spark/pull/11867#discussion_r96997288
--- Diff:
core/src/main/scala/org/apache/spark/status/api/v1/ApplicationListResource.scala
---
@@ -43,11 +45,24 @@ private[v1] class
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
Thanks @squito. Actually, I resolved the merge conflicts and started a
new test build about half an hour ago. Currently, I don't see any merge
conflicts.
---
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/11867
I am not sure why the build is failing with the following error:
stderr: fatal: unable to access 'https://github.com/apache/spark.git/':
Failed connect to github.com:443; Operat
Github user paragpc commented on a diff in the pull request:
https://github.com/apache/spark/pull/16473#discussion_r96532383
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/UIData.scala ---
@@ -127,6 +127,14 @@ private[spark] object UIData {
def updateTaskMetrics
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
cc @zsxwing, @vanzin, @coderxiang
Can you take a look and suggest next steps?
---
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
cc @zsxwing, @vanzin
---
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
cc @vanzin, @zsxwing, @rxin
---
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
```[error] * method
this(Long,Int,Int,java.util.Date,java.lang.String,java.lang.String,java.lang.String,Boolean,scala.collection.Seq,scala.Option,scala.Option)Unit
in class
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/16473
cc @zsxwing, @vanzin, @rxin
---
GitHub user paragpc opened a pull request:
https://github.com/apache/spark/pull/16473
[SPARK-19069] [CORE] Expose task 'status' and 'duration' in spark history
server REST API.
## What changes were proposed in this pull request?
Although Spark history
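To illustrate what the proposed change would give REST API consumers, here is a minimal sketch of reading per-task `status` and `duration` from a history server response. The field names and sample payload are assumptions based only on the PR title, not the merged schema.

```python
# Hypothetical sketch of consuming per-task "status" and "duration" fields
# from the history server's REST API. Field names and the sample payload
# are illustrative assumptions, not the actual merged schema.
import json

sample = json.loads("""
[
  {"taskId": 0, "status": "SUCCESS", "duration": 1250},
  {"taskId": 1, "status": "FAILED",  "duration": 87},
  {"taskId": 2, "status": "RUNNING"}
]
""")

def failed_tasks(tasks):
    """Return IDs of tasks whose reported status is FAILED."""
    return [t["taskId"] for t in tasks if t.get("status") == "FAILED"]

print(failed_tasks(sample))  # -> [1]
```

With a status field exposed directly, a client can filter failed or still-running tasks without re-deriving that state from other attributes.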
Github user paragpc commented on the issue:
https://github.com/apache/spark/pull/11867
Hi @zsxwing,
I have updated the PR. Can you kindly verify it?
---
Github user paragpc commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-215841999
Hi @tgravescs / @rxin,
Can you kindly verify this patch or provide any suggestions?
---
Github user paragpc commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-200943460
Hi @zhuoliu,
Can you or any other admin verify this patch?
---
Github user paragpc commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-199892050
Hi @zhuoliu,
In this pull request, my proposal is specific to the Spark history server REST
API. It is intended for users who consume the REST API directly. As you
Github user paragpc commented on the pull request:
https://github.com/apache/spark/pull/11867#issuecomment-199882584
cc @squito @zhuoliu
---
GitHub user paragpc opened a pull request:
https://github.com/apache/spark/pull/11867
[SPARK-14049] [CORE] Add functionality in spark history server API to query
applications by end time
## What changes were proposed in this pull request?
Currently, spark history server
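A sketch of how a client might query the history server's `/api/v1/applications` endpoint filtered by application end time, as this PR proposes. The `minEndDate`/`maxEndDate` parameter names are assumptions inferred from the PR title, and the host/port is a placeholder for a local history server.

```python
# Build a history-server applications URL filtered by end time.
# Parameter names (minEndDate/maxEndDate) are assumptions based on the PR;
# the base URL is a placeholder for a local history server.
from urllib.parse import urlencode

def applications_url(base, min_end=None, max_end=None):
    """Return a /api/v1/applications URL, optionally bounded by end date."""
    params = {}
    if min_end:
        params["minEndDate"] = min_end
    if max_end:
        params["maxEndDate"] = max_end
    query = urlencode(params)
    return f"{base}/api/v1/applications" + (f"?{query}" if query else "")

print(applications_url("http://localhost:18080", "2016-03-20", "2016-03-22"))
```

Filtering server-side by end time spares clients from downloading the full application list and filtering locally.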
Github user paragpc closed the pull request at:
https://github.com/apache/spark/pull/11736
---
Github user paragpc commented on the pull request:
https://github.com/apache/spark/pull/11736#issuecomment-196983711
The only alternative I know of is to manually copy files to the
desired backup location. Manually copying files doesn't provide real-time
events, especial
GitHub user paragpc opened a pull request:
https://github.com/apache/spark/pull/11736
[SPARK-13914] [Scheduler] Add functionality to optionally configure a
backup path for the Spark event logs
## What changes were proposed in this pull request?
Added