HyukjinKwon commented on a change in pull request #32015:
URL: https://github.com/apache/spark/pull/32015#discussion_r604948654
##########
File path: core/src/test/scala/org/apache/spark/benchmark/Benchmarks.scala
##########
@@ -30,44 +31,64 @@ import com.google.common.reflect.ClassPath
  *
  * {{{
  * 1. with spark-submit
- *   bin/spark-submit --class <this class> --jars <all spark test jars> <spark core test jar>
+ *   bin/spark-submit --class <this class> --jars <all spark test jars>
+ *     <spark external package jar> <spark core test jar> <glob pattern for class>
  * 2. generate result:
  *   SPARK_GENERATE_BENCHMARK_FILES=1 bin/spark-submit --class <this class> --jars
- *     <all spark test jars> <spark core test jar>
+ *     <all spark test jars> <spark external package jar>
+ *     <spark core test jar> <glob pattern for class>
  *   Results will be written to all corresponding files under "benchmarks/".
  *   Notice that it detects the sub-project's directories from jar's paths so the provided jars
  *   should be properly placed under target (Maven build) or target/scala-* (SBT) when you
  *   generate the files.
  * }}}
  *
  * In Mac, you can use a command as below to find all the test jars.
+ * Make sure to do not select duplicated jars created by different versions of builds or tools.
  * {{{
- *   find . -name "*3.2.0-SNAPSHOT-tests.jar" | paste -sd ',' -
+ *   find . -name '*-SNAPSHOT-tests.jar' | paste -sd ',' -
  * }}}
  *
- * Full command example:
+ * The example below runs all benchmarks and generates the results:
  * {{{
  *   SPARK_GENERATE_BENCHMARK_FILES=1 bin/spark-submit --class \
- *     org.apache.spark.benchmark.Benchmarks --jars \
- *     "`find . -name "*3.2.0-SNAPSHOT-tests.jar" | paste -sd ',' -`" \
- *     ./core/target/scala-2.12/spark-core_2.12-3.2.0-SNAPSHOT-tests.jar
+ *     org.apache.spark.benchmark.Benchmarks \
+ *     --jars "`find . -name '*-SNAPSHOT-tests.jar' | paste -sd ',' -`" \
+ *     --jars "`find . -name 'spark-avro*-SNAPSHOT.jar'`" \
+ *     "`find . -name 'spark-core*-SNAPSHOT-tests.jar'`" \
+ *     "*"
+ * }}}
+ *
+ * The example below runs all benchmarks under "org.apache.spark.sql.execution.datasources"
+ * {{{
+ *   bin/spark-submit --class \
+ *     org.apache.spark.benchmark.Benchmarks \
+ *     --jars "`find . -name '*-SNAPSHOT-tests.jar' | paste -sd ',' -`" \
+ *     --jars "`find . -name 'spark-avro*-SNAPSHOT.jar'`" \
+ *     "`find . -name 'spark-core*-SNAPSHOT-tests.jar'`" \
+ *     "org.apache.spark.sql.execution.datasources.*"
  * }}}
  */
 object Benchmarks {
   def main(args: Array[String]): Unit = {
+    var isBenchmarkFound = false
     ClassPath.from(
       Thread.currentThread.getContextClassLoader

Review comment:
   Ahh, this returns only a partial list of classes on JDK 11 (https://github.com/google/guava/issues/3249). I will take another look tomorrow.

-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
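As an aside on the `<glob pattern for class>` argument the diff introduces: a shell-style glob such as `"org.apache.spark.sql.execution.datasources.*"` can be matched against fully-qualified class names by translating it into an anchored regex. The sketch below is a hypothetical, stdlib-only illustration of that idea (the `GlobFilter` object and its method names are invented here, not the PR's actual implementation):

```scala
// Hypothetical sketch: turn a shell-style glob into an anchored regex and
// use it to decide whether a fully-qualified benchmark class name matches.
object GlobFilter {
  // Translate glob wildcards: '*' -> ".*", '?' -> ".", escape regex metacharacters.
  def globToRegex(glob: String): String =
    "^" + glob.flatMap {
      case '*'                                      => ".*"
      case '?'                                      => "."
      case c if "\\.[]{}()+-^$|".indexOf(c) >= 0    => "\\" + c
      case c                                        => c.toString
    } + "$"

  // True when the class name matches the glob pattern.
  def matches(glob: String, className: String): Boolean =
    className.matches(globToRegex(glob))
}
```

For example, `GlobFilter.matches("org.apache.spark.sql.execution.datasources.*", "org.apache.spark.sql.execution.datasources.csv.CSVBenchmark")` is true, while a glob of `"*"` matches every class, mirroring the two spark-submit examples in the scaladoc above.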