[ https://issues.apache.org/jira/browse/SPARK-21141?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Procopio reopened SPARK-21141:
--
My apologies, I meant spark-submit --version.
> spark-update --version is hard to parse
>
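For what it's worth, the version can usually be scraped out of the banner, since `spark-submit --version` embeds a `version X.Y.Z` token in its ASCII-art output. A minimal sketch; the sample banner below is assumed for illustration, not captured from a real run:

```python
import re

# Sample banner resembling what `spark-submit --version` prints (assumed;
# the exact art varies by release, but a "version X.Y.Z" token is present).
banner = r"""
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/
"""

def parse_spark_version(text):
    """Pull the first 'version X.Y.Z' token out of the banner text."""
    m = re.search(r"version\s+([0-9][\w.\-]*)", text)
    return m.group(1) if m else None

print(parse_spark_version(banner))  # -> 2.1.0
```

This is exactly the kind of scraping the issue complains about: a machine-readable flag (e.g. printing only the version string) would make the regex unnecessary.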
[ https://issues.apache.org/jira/browse/SPARK-21140?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Michael Procopio reopened SPARK-21140:
--
I am not sure what detail you are looking for. I provided the test code I was using. See
[ https://issues.apache.org/jira/browse/SPARK-21140?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16054043#comment-16054043 ]
Michael Procopio commented on SPARK-21140:
--
I disagree, executor memory does depe
Michael Procopio created SPARK-21141:
Summary: spark-update --version is hard to parse
Key: SPARK-21141
URL: https://issues.apache.org/jira/browse/SPARK-21141
Project: Spark
Issue Type: Improvement
Michael Procopio created SPARK-21140:
Summary: Reduce collect high memory requirements
Key: SPARK-21140
URL: https://issues.apache.org/jira/browse/SPARK-21140
Project: Spark
Issue Type: Improvement
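SPARK-21140 concerns the driver-side cost of collect(), which materializes every partition on the driver at once. A minimal pure-Python sketch of the difference between collecting and iterating (no Spark required; the partition data is made up, and to_local_iterator merely mirrors the behaviour of RDD.toLocalIterator, which pulls one partition at a time):

```python
def collect(partitions):
    # Driver holds ALL partitions in memory simultaneously,
    # like RDD.collect().
    return [row for part in partitions for row in part]

def to_local_iterator(partitions):
    # Driver holds at most one partition at a time, which is
    # why iterating can need far less driver memory.
    for part in partitions:
        for row in part:
            yield row

partitions = [[1, 2], [3, 4], [5, 6]]
print(collect(partitions))                  # -> [1, 2, 3, 4, 5, 6]
print(list(to_local_iterator(partitions)))  # -> [1, 2, 3, 4, 5, 6]
```

Both paths yield the same rows; the difference is only how much of the dataset the driver must hold at once.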
Michael Procopio created SPARK-19030:
Summary: Dropped event errors being reported after SparkContext
has been stopped
Key: SPARK-19030
URL: https://issues.apache.org/jira/browse/SPARK-19030
Project: Spark
Michael Procopio created SPARK-10453:
Summary: There's no way to use spark.dynamicAllocation.enabled
with pyspark
Key: SPARK-10453
URL: https://issues.apache.org/jira/browse/SPARK-10453
Project: Spark
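For reference, dynamic allocation is normally switched on through configuration rather than application code, and it requires the external shuffle service. A minimal spark-defaults.conf sketch (property names are from the standard Spark configuration; the executor counts are illustrative values only):

```
spark.dynamicAllocation.enabled        true
spark.dynamicAllocation.minExecutors   1
spark.dynamicAllocation.maxExecutors   10
spark.shuffle.service.enabled          true
```

The same properties can be passed on the command line via repeated --conf flags to spark-submit.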
Michael Procopio created SPARK-10452:
Summary: Pyspark worker security issue
Key: SPARK-10452
URL: https://issues.apache.org/jira/browse/SPARK-10452
Project: Spark
Issue Type: Bug