I get a NullPointerException as soon as I call TestHive.sql(...) since 
migrating to Spark 2, because it tries to load non-existent "test tables". 
I couldn't find a way to set the loadTestTables flag to false (see the 
sketch below the stack trace).

Caused by: sbt.ForkMain$ForkError: java.lang.NullPointerException: null
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.getHiveFile(TestHive.scala:190)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.org$apache$spark$sql$hive$test$TestHiveSparkSession$$quoteHiveFile(TestHive.scala:196)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:234)
    at org.apache.spark.sql.hive.test.TestHiveSparkSession.<init>(TestHive.scala:122)
    at org.apache.spark.sql.hive.test.TestHiveContext.<init>(TestHive.scala:80)
    at org.apache.spark.sql.hive.test.TestHive$.<init>(TestHive.scala:47)
    at org.apache.spark.sql.hive.test.TestHive$.<clinit>(TestHive.scala)

I'm using Spark 2.1.0 in this case.
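For reference, this is roughly the workaround I would have expected to work, 
assuming TestHiveContext in 2.x still has a constructor taking a SparkContext 
and a loadTestTables flag (I may be misreading TestHive.scala), instead of 
going through the TestHive singleton:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.test.TestHiveContext

    // Hypothetical workaround: build the test context directly rather than
    // using the TestHive singleton, passing loadTestTables = false so the
    // sample tables (and their missing data files) are never loaded.
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("my-hive-test")
    val sc = new SparkContext(conf)
    val hiveContext = new TestHiveContext(sc, loadTestTables = false)

    hiveContext.sql("SELECT 1").show()

The singleton is initialized eagerly, so as far as I can tell there's no hook 
to flip that flag once TestHive.sql(...) has been called.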

Am I missing something, or should I file a bug in Jira?
