Yi Zhou created SPARK-3480:
------------------------------

             Summary: Throws "Not a valid command: yarn-alpha/scalastyle" in dev/scalastyle for the sbt build tool during 'Running Scala style checks'
                 Key: SPARK-3480
                 URL: https://issues.apache.org/jira/browse/SPARK-3480
             Project: Spark
          Issue Type: Bug
          Components: Build
            Reporter: Yi Zhou
            Priority: Minor
Symptom:

Running ./dev/run-tests dumps output like the following:

SBT_MAVEN_PROFILES_ARGS="-Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -Pkinesis-asl"
[Warn] Java 8 tests will not run because JDK version is < 1.8.
=========================================================================
Running Apache RAT checks
=========================================================================
RAT checks passed.
=========================================================================
Running Scala style checks
=========================================================================
Scalastyle checks failed at following occurrences:
[error] Expected ID character
[error] Not a valid command: yarn-alpha
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: yarn-alpha
[error] yarn-alpha/scalastyle
[error] ^

Possible Cause:

I checked dev/scalastyle and found that it passes two task arguments, 'yarn-alpha/scalastyle' and 'yarn/scalastyle', like:

  echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-0.23 -Dhadoop.version=0.23.9 yarn-alpha/scalastyle \
    >> scalastyle.txt
  # Check style with YARN built too
  echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 yarn/scalastyle \
    >> scalastyle.txt

From the error message above, sbt appears to reject these commands because of the '/' separator. The checks run successfully after I manually changed the arguments to 'yarn-alpha:scalastyle' and 'yarn:scalastyle'.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
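For illustration, a minimal sketch of the workaround described above, with the two sbt invocations rewritten to use the ':' separator that this sbt version parses; the profile flags and output file name are copied from the snippet in this report, and the script has not been verified against the actual dev/scalastyle in the Spark tree:

```shell
#!/usr/bin/env bash
# Sketch of the proposed fix: invoke the scalastyle task with the
# 'project:task' form instead of 'project/task', which the bundled sbt
# rejects with "Not a valid command".

# Check style with the yarn-alpha profile
echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-0.23 -Dhadoop.version=0.23.9 yarn-alpha:scalastyle \
  >> scalastyle.txt
# Check style with YARN built too
echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 yarn:scalastyle \
  >> scalastyle.txt
```

This is a build-script fragment; it requires a Spark checkout with the sbt/sbt launcher and is not runnable standalone.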