[ https://issues.apache.org/jira/browse/SPARK-3431?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14234702#comment-14234702 ]

Nicholas Chammas commented on SPARK-3431:
-----------------------------------------

I think I'm on to something, but I need some help. I believe I understand how to 
tell SBT to fork JVMs for the tests, and I also think I've worked out how to 
specify the way the tests should be grouped across the forked JVMs (sketched below).
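
Roughly, the forking/grouping setup I have in mind looks like this. This is just a 
sketch in sbt 0.13 syntax, not the exact code in the PR; {{fork}}, {{testGrouping}}, 
{{definedTests}}, and {{Tests.Group}} are sbt's standard keys/types, and the 
{{ForkedTests}} object name is made up:

{code:scala}
import sbt._
import sbt.Keys._

object ForkedTests {
  // Settings to add to a project's Test configuration.
  lazy val settings: Seq[Setting[_]] = Seq(
    // Run the tests in forked JVMs instead of inside sbt's own JVM.
    fork in Test := true,
    // Put each test suite in its own group; each group runs in its own forked JVM.
    testGrouping in Test := {
      val tests = (definedTests in Test).value
      tests.map { test =>
        Tests.Group(
          name = test.name,
          tests = Seq(test),
          runPolicy = Tests.SubProcess(ForkOptions()))
      }
    }
  )
}
{code}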

It's not working, I suspect, because the forked JVMs are not being passed all the 
options they need. Specifically, I don't think the reference to 
{{javaOptions}} [on this 
line|https://github.com/nchammas/spark/blob/ab127b798dbfa9399833d546e627f9651b060918/project/SparkBuild.scala#L429]
 actually picks up all the options [defined 
earlier|https://github.com/nchammas/spark/blob/ab127b798dbfa9399833d546e627f9651b060918/project/SparkBuild.scala#L403-L418].
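
What I want to try next is evaluating {{javaOptions}} (and the env vars) once at the 
task level and passing them explicitly into each group's {{ForkOptions}}, roughly 
like this. Again just a sketch; {{ForkOptions}}, {{runJVMOptions}}, and {{envVars}} 
are sbt's standard names, and I haven't confirmed yet that this fixes the problem:

{code:scala}
testGrouping in Test := {
  // Evaluate these once at the task level so that every forked JVM
  // receives the same JVM options and environment variables.
  val tests = (definedTests in Test).value
  val opts  = (javaOptions in Test).value
  val env   = (envVars in Test).value
  tests.map { test =>
    Tests.Group(
      name = test.name,
      tests = Seq(test),
      runPolicy = Tests.SubProcess(
        ForkOptions(runJVMOptions = opts, envVars = env)))
  }
}
{code}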

I don't know much Scala. If anyone could review what I have so far and give me some 
pointers, that would be great!

You can see all the variations I've tried along with the associated output in 
[the open pull request|https://github.com/apache/spark/pull/3564]. I know we 
want to get this working with Maven, but I figured getting it to work first 
with SBT wouldn't be a bad thing.

> Parallelize execution of tests
> ------------------------------
>
>                 Key: SPARK-3431
>                 URL: https://issues.apache.org/jira/browse/SPARK-3431
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>            Reporter: Nicholas Chammas
>
> Running all the tests in {{dev/run-tests}} takes up to 2 hours. A common 
> strategy to cut test time down is to parallelize the execution of the tests. 
> Doing that may in turn require some prerequisite changes to be made to how 
> certain tests run.
