[ https://issues.apache.org/jira/browse/SPARK-14808?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15378335#comment-15378335 ]
Sean Owen commented on SPARK-14808:
-----------------------------------

What's the counter-argument, though? Either a piece of documentation is important for 2.0.0 and must be ready with it, like anything else targeted for 2.0.0, or it isn't. I think the answer in many cases is that there are updates that aren't actually essential for 2.0.0, but then they shouldn't be described as "for 2.0". It's fine for an RC to go out before an essential 2.0 doc change goes in, but it's still a Blocker. I think anything critical for 2.0.0 is, by definition, a blocker.

> Spark MLlib, GraphX, SparkR 2.0 QA umbrella
> -------------------------------------------
>
>           Key: SPARK-14808
>           URL: https://issues.apache.org/jira/browse/SPARK-14808
>       Project: Spark
>    Issue Type: Umbrella
>    Components: Documentation, GraphX, ML, MLlib, SparkR
>      Reporter: Joseph K. Bradley
>      Assignee: Joseph K. Bradley
>      Priority: Critical
>
> This JIRA lists tasks for the next Spark release's QA period for MLlib, GraphX, and SparkR.
> The list below gives an overview of what is involved; the corresponding JIRA issues are linked below it.
>
> h2. API
> * Check binary API compatibility for Scala/Java
> * Audit new public APIs (from the generated html doc)
> ** Scala
> ** Java compatibility
> ** Python coverage
> ** R
> * Check Experimental, DeveloperApi tags (see the sketch below)
>
> h2. Algorithms and performance
> *Performance*
> * _List any other missing performance tests from spark-perf here_
> * perf-tests for transformers (SPARK-2838)
> * MultilayerPerceptron (SPARK-11911)
>
> h2. Documentation and example code
> * For new algorithms, create JIRAs for updating the user guide sections & examples
> * Update Programming Guide
> * Update website
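For the "Check Experimental, DeveloperApi tags" item in the checklist above, here is a minimal Scala sketch of the annotation pattern that audit looks for. The {{org.apache.spark.annotation.Experimental}} and {{DeveloperApi}} annotations are Spark's own; the package, class, and method names below are hypothetical and only for illustration.

{code:scala}
// Minimal sketch of the tagging convention the QA audit checks.
// Package and class names are hypothetical; only the annotations are Spark's.
package org.apache.spark.ml.example

import org.apache.spark.annotation.{DeveloperApi, Experimental}

/**
 * :: Experimental ::
 * Hypothetical new 2.0 estimator; the tag signals the API may change in future releases.
 */
@Experimental
class ExampleEstimator {

  /**
   * :: DeveloperApi ::
   * Hypothetical extension hook aimed at developers rather than end users.
   */
  @DeveloperApi
  def lowLevelFit(data: Seq[Double]): Double = data.sum / data.length
}
{code}

As described in the checklist, the audit amounts to confirming that each newly added public class or method either carries the appropriate tag or is intentionally presented as stable.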