I need to create some Hive test tables to use from PyCharm.
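
For context, what I am ultimately after is something along these lines from
PySpark (a minimal sketch of the sort of test tables I have in mind; the
database and table names here are placeholders, not my actual config):

from pyspark.sql import SparkSession

# Sketch of the Hive test tables I want to create from PyCharm.
# Names below are placeholders.
spark = (
    SparkSession.builder
    .appName("hive_test_tables")
    .enableHiveSupport()  # picks up hive-site.xml from $SPARK_HOME/conf
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS testdb")
spark.sql("""
    CREATE TABLE IF NOT EXISTS testdb.dummy (
        id INT,
        description STRING
    )
    STORED AS PARQUET
""")
spark.sql("SHOW TABLES IN testdb").show()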

SPARK_HOME is set to:

D:\temp\spark-3.0.1-bin-hadoop2.7

HADOOP_HOME is set to:

c:\hadoop\
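
As far as I understand, on Windows the Hadoop native binaries (winutils.exe
and hadoop.dll) need to sit under %HADOOP_HOME%\bin. A quick sanity check I
can run (a minimal sketch, assuming that standard winutils layout):

import os

# Check for the Hadoop native binaries Spark needs on Windows.
# Assumes the standard winutils layout under %HADOOP_HOME%\bin.
hadoop_home = os.environ.get("HADOOP_HOME", r"c:\hadoop")
for name in ("winutils.exe", "hadoop.dll"):
    path = os.path.join(hadoop_home, "bin", name)
    print(path, "->", "found" if os.path.exists(path) else "MISSING")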

spark-shell works fine. However, when I try to run spark-sql, I get the following errors:

PS C:\tmp\hive> spark-sql
log4j:WARN No appenders could be found for logger (org.apache.hadoop.hive.conf.HiveConf).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/11/16 21:12:34 INFO SharedState: loading hive config file: file:/D:/temp/spark-3.0.1-bin-hadoop2.7/conf/hive-site.xml
20/11/16 21:12:34 INFO SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/tmp/hive/warehouse').
20/11/16 21:12:34 INFO SharedState: Warehouse path is '/tmp/hive/warehouse'.
20/11/16 21:12:34 INFO SessionState: Created HDFS directory: /tmp/hive/admin/e32257b3-6c6b-46d7-921e-2782e2c15546
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:524)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:478)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:532)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:509)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:312)
        at org.apache.hadoop.hive.ql.session.SessionState.createPath(SessionState.java:751)
        at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:688)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:586)
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:548)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:135)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
20/11/16 21:12:34 INFO ShutdownHookManager: Shutdown hook called
20/11/16 21:12:34 INFO ShutdownHookManager: Deleting directory C:\Users\admin\AppData\Local\Temp\spark-51bb8e21-5fd3-4b86-bb0b-5e331c92bd4f
PS C:\tmp\hive>
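
From the stack trace, the failure is in the Hadoop native IO layer
(NativeIO$Windows), which I believe usually means hadoop.dll is either not
on the DLL search path or was built for a different Hadoop version than the
2.7 that this Spark build expects. A quick way I can test whether the DLL
loads at all (a minimal sketch, assuming the DLL lives under
%HADOOP_HOME%\bin):

import ctypes
import os

# Try to load the Hadoop native library directly. If this fails,
# hadoop.dll is missing or unresolvable; if it loads but spark-sql
# still throws UnsatisfiedLinkError, the DLL may be built for a
# different Hadoop version than 2.7.
dll_path = os.path.join(os.environ.get("HADOOP_HOME", r"c:\hadoop"),
                        "bin", "hadoop.dll")
try:
    ctypes.CDLL(dll_path)
    print("hadoop.dll loaded from", dll_path)
except OSError as e:
    print("failed to load hadoop.dll:", e)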


Any workarounds would be appreciated.


Mich


LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw





Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.
