This works for me:

mvn test -Dtest=TestSparkCliDriver -Dqfile=join1.q -Phadoop-2
For multiple tests you might need to add quotes around the comma-separated
list.
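For example (the qfile names here are placeholders, and the command is shown as an echoed dry run rather than executed):

```shell
# Quoting keeps the comma-separated list as a single -Dqfile argument
# when the shell expands the command line. qfile names are placeholders.
qfiles="join1.q,join2.q,join3.q"
echo mvn test -Dtest=TestSparkCliDriver -Dqfile="$qfiles" -Phadoop-2
```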

I haven't seen that error. Did you run from the itests directory? There are
steps in the pom that copy over the Spark scripts needed to run; it looks
like they were skipped, since that script is not available in your run.
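Roughly the order I'd expect (sketched below as echoed commands rather than executed, since the exact flags may differ in your setup):

```shell
# Build in the root first, then in itests so the pom step that populates
# target/spark/bin (including spark-submit) actually runs. Flags are an
# assumption based on a typical Hive build, not confirmed for your setup.
echo "mvn clean install -DskipTests -Phadoop-2"
echo "cd itests && mvn clean install -DskipTests -Phadoop-2"
echo "cd itests && mvn test -Dtest=TestSparkCliDriver -Phadoop-2"
```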

Thanks
Szehon

On Thu, Jul 2, 2015 at 10:31 AM, Sergey Shelukhin <ser...@hortonworks.com>
wrote:

> Hi. I am trying to run TestSparkCliDriver.
>
> 1) Spark tests do not appear to support specifying a query the way other
> tests do: when I run mvn test -Phadoop-2 -Dtest=TestSparkCliDriver, the
> tests run, but with
> mvn test -Phadoop-2 -Dtest=TestSparkCliDriver -Dqfile=foo.q,bar.q,.. the
> test just instantly succeeds without running any queries. Is there some
> other way to specify them?
>
> 2) When I run all the tests, they fail with the exception below.
> I’ve done a full regular build (mvn clean install … in root and then
> itests). Are more steps necessary?
> The itests/qtest-spark/../../itests/qtest-spark/target/spark directory
> exists and has a bunch of stuff, but the bin/ subdirectory that it tries
> to run from is indeed empty.
>
> 2015-07-02 10:11:58,678 ERROR [main]: spark.SparkTask (SessionState.java:printError(987)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
> org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
> at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:127)
> at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:101)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1672)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1431)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1063)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
> at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:840)
> at org.apache.hadoop.hive.cli.TestSparkCliDriver.<clinit>(TestSparkCliDriver.java:59)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:497)
> at org.junit.internal.runners.SuiteMethod.testFromSuiteMethod(SuiteMethod.java:35)
> at org.junit.internal.runners.SuiteMethod.<init>(SuiteMethod.java:24)
> at org.junit.internal.builders.SuiteMethodBuilder.runnerForClass(SuiteMethodBuilder.java:11)
> at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
> at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
> at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
> at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
> at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
> at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> Caused by: java.io.IOException: Cannot run program "[snip]/itests/qtest-spark/../../itests/qtest-spark/target/spark/bin/spark-submit": error=2, No such file or directory
> at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
> at org.apache.hive.spark.client.SparkClientImpl.startDriver(SparkClientImpl.java:415)
> at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
> at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
> at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:91)
> at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
> at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
> ... 33 more
> Caused by: java.io.IOException: error=2, No such file or directory
> at java.lang.UNIXProcess.forkAndExec(Native Method)
> at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
> at java.lang.ProcessImpl.start(ProcessImpl.java:134)
> at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
> ... 39 more
