RE: problems running spark tests

2015-07-02 Thread Li, Rui
Others have also run into this on Mac.
The spark binary is downloaded to itests/thirdparty and then unpacked and
copied to itests/qtest-spark/target/spark. Maybe you can do that process
manually and check whether anything goes wrong.
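
For example, roughly like this from inside itests (the exact tarball name and
directory layout depend on the Spark version the pom downloads, so treat these
paths as an illustration only):

cd thirdparty
tar -xzf spark-*.tgz                              # assumed tarball name pattern
mkdir -p ../qtest-spark/target/spark
cp -R spark-*/. ../qtest-spark/target/spark/      # copy unpacked contents over
ls ../qtest-spark/target/spark/bin/spark-submit   # verify the script is there

If the last step finds nothing, the unpack/copy is where it goes wrong.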

Cheers,
Rui Li

Re: problems running spark tests

2015-07-02 Thread Sergey Shelukhin
I was able to get the tests to run with the parameter Hari suggested, on a
different (Linux) machine.
However, on my Mac laptop, the bin/ subdirectory of the spark directory is not
regenerated.
I guess I will do the usual shamanic dances like nuking the maven repo,
re-cloning the code, etc., next time I need it. If that doesn’t work I
might file a bug or revive this thread.
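
Concretely, the kind of reset I mean is something like this (assumed paths;
the idea is just to force the spark download/unpack/copy steps to rerun):

rm -rf ~/.m2/repository/org/apache/hive     # nuke the hive bits of the maven repo
rm -rf itests/thirdparty itests/qtest-spark/target
cd itests && mvn clean install -DskipTests -Phadoop-2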

Re: problems running spark tests

2015-07-02 Thread Hari Subramaniyan
Can you try running with -Dspark.query.files instead of -Dqfile from the
itests directory?
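
That is, presumably something like this (join1.q just as an example query file):

mvn test -Phadoop-2 -Dtest=TestSparkCliDriver -Dspark.query.files=join1.q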

Thanks
Hari

Re: problems running spark tests

2015-07-02 Thread Szehon Ho
This works for me..

mvn test -Dtest=TestSparkCliDriver -Dqfile=join1.q -Phadoop-2
For multiple tests you might need to add quotes around the comma-separated
list.
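
For example, with a quoted list (the query file names are just placeholders):

mvn test -Dtest=TestSparkCliDriver -Dqfile="join1.q,join2.q" -Phadoop-2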

I haven't seen that error; did you run from the itests directory?  There are
some steps in the pom that copy over the Spark scripts needed to run, and it
looks like they were skipped, since that script is not available in your run.

Thanks
Szehon

problems running spark tests

2015-07-02 Thread Sergey Shelukhin
Hi. I am trying to run TestSparkCliDriver.

1) Spark tests do not appear to support specifying a query the way other
tests do; when I run mvn test -Phadoop-2 -Dtest=TestSparkCliDriver, tests run,
but with
mvn test -Phadoop-2 -Dtest=TestSparkCliDriver -Dqfile=foo.q,bar.q,..
the test just instantly succeeds w/o running any queries. Is there some
other way to specify those?

2) When I run all the tests, they fail with the exception below.
I’ve done a full regular build (mvn clean install … in root and then
itests). Are more steps necessary?
The itests/qtest-spark/../../itests/qtest-spark/target/spark directory
exists and has a bunch of stuff, but the bin/ subdirectory that it tries
to run from is indeed empty.

2015-07-02 10:11:58,678 ERROR [main]: spark.SparkTask (SessionState.java:printError(987)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)
at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:127)
at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:101)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1672)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1431)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1212)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1063)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:311)
at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:840)
at org.apache.hadoop.hive.cli.TestSparkCliDriver.<clinit>(TestSparkCliDriver.java:59)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.junit.internal.runners.SuiteMethod.testFromSuiteMethod(SuiteMethod.java:35)
at org.junit.internal.runners.SuiteMethod.<init>(SuiteMethod.java:24)
at org.junit.internal.builders.SuiteMethodBuilder.runnerForClass(SuiteMethodBuilder.java:11)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.builders.AllDefaultPossibilitiesBuilder.runnerForClass(AllDefaultPossibilitiesBuilder.java:26)
at org.junit.runners.model.RunnerBuilder.safeRunnerForClass(RunnerBuilder.java:59)
at org.junit.internal.requests.ClassRequest.getRunner(ClassRequest.java:26)
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:262)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.io.IOException: Cannot run program "[snip]/itests/qtest-spark/../../itests/qtest-spark/target/spark/bin/spark-submit": error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
at org.apache.hive.spark.client.SparkClientImpl.startDriver(SparkClientImpl.java:415)
at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:94)
at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:91)
at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
... 33 more
Caused by: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.forkAndExec(Native Method)
at java.lang.UNIXProcess.<init>(UNIXProcess.java:248)
at java.lang.ProcessImpl.start(ProcessImpl.java:134)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
... 39 more