[ 
https://issues.apache.org/jira/browse/SPARK-1358?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14221605#comment-14221605
 ] 

Andrew Ash commented on SPARK-1358:
-----------------------------------

Those machines sound more than powerful enough to handle the types of 
long-running tests we're talking about on this ticket.

As for disk space, I've found that there's typically a decent multiplier from 
memory to disk, so for this hardware to be representative of what I'm used to 
working with, I'd expect somewhere between 4 TB and 10 TB, typically on the 
higher end of that scale.  We probably don't need that much for testing, but at 
least for my onsite use that's standard.

For reference, the AWS instances that Databricks used for the Terasort record 
had 8x 800 GB SSDs in RAID0.
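
The sizing rule of thumb above can be sketched as a quick calculation. The 
512 GB memory figure below is a hypothetical node size for illustration, not a 
spec from this ticket; only the 4x-10x multiplier and the 8x 800 GB Terasort 
figure come from the discussion:

```python
def recommended_disk_tb(memory_gb, multiplier):
    """Disk capacity (TB) as a simple multiple of node memory (GB)."""
    return memory_gb * multiplier / 1000.0

# Hypothetical node with 512 GB RAM, using the 4x-10x
# memory-to-disk multiplier range mentioned above.
low = recommended_disk_tb(512, 4)    # lower bound, in TB
high = recommended_disk_tb(512, 10)  # upper bound, in TB

# For comparison, raw RAID0 capacity of one Terasort-record instance:
terasort_gb = 8 * 800  # 8 SSDs x 800 GB = 6400 GB per instance
```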

> Continuous integrated test should be involved in Spark ecosystem 
> -----------------------------------------------------------------
>
>                 Key: SPARK-1358
>                 URL: https://issues.apache.org/jira/browse/SPARK-1358
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: xiajunluan
>
> Currently, Spark only contains unit tests and performance tests, but I think 
> that is not enough for customers to evaluate the status of their cluster and 
> the Spark version they will use. It is necessary to build continuous 
> integration tests for Spark development. These could include: 
> 1. complex application test cases for Spark / Spark Streaming / GraphX ...
> 2. stress test cases
> 3. fault tolerance test cases
> 4. ...



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
