[ 
https://issues.apache.org/jira/browse/SPARK-27063?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16784714#comment-16784714
 ] 

Stavros Kontopoulos edited comment on SPARK-27063 at 3/5/19 5:53 PM:
---------------------------------------------------------------------

Yes, one other thing I noticed is that pulling the images may take time, so tests 
will time out (if, for whatever reason, you don't use the local daemon to build 
them).
Also, in this [PR|https://github.com/apache/spark/pull/23514] I configured the 
patience differently, because some tests may complete too quickly, for better or worse.
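The "patience" above refers to ScalaTest's timeout/poll-interval pair used when waiting on asynchronous conditions. As a minimal sketch of the idea in plain Scala, independent of ScalaTest (the property name spark.test.timeoutSeconds and the helper below are hypothetical, not Spark's actual API):

```scala
import scala.annotation.tailrec

// Hypothetical helper: retry a condition until a configurable deadline.
// The timeout is read from a system property so that slow clusters
// (developer laptops, small CI clusters) can raise it without code changes.
object EventuallySketch {
  // Falls back to a generous 180-second default when the property is unset.
  def timeoutMillis: Long =
    sys.props.get("spark.test.timeoutSeconds").map(_.toLong * 1000L).getOrElse(180000L)

  // Polls `condition` every `intervalMillis` until it holds or the deadline passes.
  def eventually(condition: => Boolean, intervalMillis: Long = 1000L): Boolean = {
    val deadline = System.currentTimeMillis() + timeoutMillis
    @tailrec def loop(): Boolean =
      if (condition) true
      else if (System.currentTimeMillis() >= deadline) false
      else { Thread.sleep(intervalMillis); loop() }
    loop()
  }
}
```

Making the deadline a property rather than a constant is the point of the discussion: the same suite can be strict on fast local daemons and lenient on clusters that must pull images first.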





> Spark on K8S Integration Tests timeouts are too short for some test clusters
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-27063
>                 URL: https://issues.apache.org/jira/browse/SPARK-27063
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Rob Vesse
>            Priority: Minor
>
> As noted during development for SPARK-26729, there are a couple of integration 
> test timeouts that are too short when running on slower clusters, e.g. 
> developers' laptops, small CI clusters, etc.
> [~skonto] confirmed that he has also experienced this behaviour in the 
> discussion on [PR 
> 23846|https://github.com/apache/spark/pull/23846#discussion_r262564938].
> We should raise the defaults of these timeouts as an initial step and, longer 
> term, consider making the timeouts themselves configurable.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
