I used the following:

val testHive = new org.apache.spark.sql.hive.test.TestHiveContext(sc, false)
val hiveClient = testHive.sessionState.metadataHive
hiveClient.runSqlHive("...")
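
For completeness, here is a fuller sketch of the same workaround. It assumes
Spark 2.1.x on the classpath and a locally created SparkContext; the object
name, master setting, and SQL statements are only placeholders for
illustration, and some environments may need extra Hive/warehouse
configuration:

// Sketch only: assumes Spark 2.1.x and a local master used for testing.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.test.TestHiveContext

object TestHiveWithoutTestTables {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("test-hive-no-test-tables"))

    // Passing loadTestTables = false skips loading the bundled test tables,
    // which is what triggers the NullPointerException described below.
    val testHive = new TestHiveContext(sc, false)

    // Regular queries go through the context itself...
    testHive.sql("SHOW TABLES").show()

    // ...while Hive-specific statements can be sent straight to the
    // metastore client, as in the snippet above.
    val hiveClient = testHive.sessionState.metadataHive
    hiveClient.runSqlHive("CREATE TABLE IF NOT EXISTS t (id INT)")

    sc.stop()
  }
}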



On Fri, Jan 13, 2017 at 6:40 AM, Nicolas Tallineau <
nicolas.tallin...@ubisoft.com> wrote:

> I get a NullPointerException as soon as I try to execute a
> TestHive.sql(...) statement since migrating to Spark 2, because it tries
> to load non-existent "test tables". I couldn't find a way to set the
> loadTestTables variable to false.
>
> Caused by: sbt.ForkMain$ForkError: java.lang.NullPointerException: null
>     at org.apache.spark.sql.hive.test.TestHiveSparkSession.getHiveFile(TestHive.scala:190)
>     at org.apache.spark.sql.hive.test.TestHiveSparkSession.org$apache$spark$sql$hive$test$TestHiveSparkSession$$quoteHiveFile(TestHive.scala:196)
>     at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:234)
>     at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:122)
>     at org.apache.spark.sql.hive.test.TestHiveContext.<init>(TestHive.scala:80)
>     at org.apache.spark.sql.hive.test.TestHive$.<init>(TestHive.scala:47)
>     at org.apache.spark.sql.hive.test.TestHive$.<clinit>(TestHive.scala)
>
> I'm using Spark 2.1.0 in this case.
>
> Am I missing something or should I create a bug in Jira?
>



-- 
Xin Wu
(650)392-9799
