[ https://issues.apache.org/jira/browse/SPARK-24612?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

A B updated SPARK-24612:
------------------------
    Environment: 
>python --version
Python 3.6.5 :: Anaconda, Inc.

>java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)

>jupyter --version
4.4.0

>conda -V
conda 4.5.4

spark-2.3.0-bin-hadoop2.7

SparkContext (Spark UI): Version v2.3.1, Master local[*], AppName PySparkShell


> Running into "Py4JJavaError" while converting a list to a DataFrame using 
> PySpark in a Jupyter notebook
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-24612
>                 URL: https://issues.apache.org/jira/browse/SPARK-24612
>             Project: Spark
>          Issue Type: Question
>          Components: PySpark
>    Affects Versions: 2.3.1
>         Environment: >python --version
> Python 3.6.5 :: Anaconda, Inc.
> >java -version
> java version "1.8.0_144"
> Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
> Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
> >jupyter --version
> 4.4.0
> >conda -V
> conda 4.5.4
> spark-2.3.0-bin-hadoop2.7
> SparkContext (Spark UI): Version v2.3.1, Master local[*], AppName PySparkShell
>            Reporter: A B
>            Priority: Major
>
> rdd = sc.parallelize([[1, "Alice", 50], [2, 'Amanda', '35']])
> rdd.collect()
> [[1, 'Alice', 50], [2, 'Amanda', '35']]
> However, when I run df = rdd.toDF(), I run into the following error. Any 
> help resolving it is greatly appreciated.
> ---------------------------------------------------------------------------
> Py4JJavaError                             Traceback (most recent call last)
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\pyspark\sql\utils.py in deco(*a, **kw)
>      62         try:
> ---> 63             return f(*a, **kw)
>      64         except py4j.protocol.Py4JJavaError as e:
>
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\lib\py4j-0.10.7-src.zip\py4j\protocol.py in get_return_value(answer, gateway_client, target_id, name)
>     327                     "An error occurred while calling {0}{1}{2}.\n".
> --> 328                     format(target_id, ".", name), value)
>     329             else:
>
> Py4JJavaError: An error occurred while calling o24.applySchemaToPythonRDD.
> : org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\tmp\hive
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>   at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:659)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
>   at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
>   at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.lang.reflect.Constructor.newInstance(Unknown Source)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
>   at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
>   at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
>   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
>   at org.apache.spark.sql.SparkSession.internalCreateDataFrame(SparkSession.scala:577)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:752)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:737)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>   at java.lang.reflect.Method.invoke(Unknown Source)
>   at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>   at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>   at py4j.Gateway.invoke(Gateway.java:282)
>   at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>   at py4j.commands.CallCommand.execute(CallCommand.java:79)
>   at py4j.GatewayConnection.run(GatewayConnection.java:238)
>   at java.lang.Thread.run(Unknown Source)
> ;
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
>   at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
>   at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
>   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
>   at org.apache.spark.sql.SparkSession.internalCreateDataFrame(SparkSession.scala:577)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:752)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:737)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>   at java.lang.reflect.Method.invoke(Unknown Source)
>   at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>   at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>   at py4j.Gateway.invoke(Gateway.java:282)
>   at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>   at py4j.commands.CallCommand.execute(CallCommand.java:79)
>   at py4j.GatewayConnection.run(GatewayConnection.java:238)
>   at java.lang.Thread.run(Unknown Source)
> Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\tmp\hive
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>   at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:659)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
>   at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
>   at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.lang.reflect.Constructor.newInstance(Unknown Source)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
>   at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
>   at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
>   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
>   at org.apache.spark.sql.SparkSession.internalCreateDataFrame(SparkSession.scala:577)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:752)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:737)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>   at java.lang.reflect.Method.invoke(Unknown Source)
>   at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>   at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>   at py4j.Gateway.invoke(Gateway.java:282)
>   at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>   at py4j.commands.CallCommand.execute(CallCommand.java:79)
>   at py4j.GatewayConnection.run(GatewayConnection.java:238)
>   at java.lang.Thread.run(Unknown Source)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.lang.reflect.Constructor.newInstance(Unknown Source)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>   ... 30 more
> Caused by: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\tmp\hive
>   at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:866)
>   at org.apache.hadoop.util.Shell.execCommand(Shell.java:849)
>   at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:659)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
>   at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
>   at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)
>   at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>   at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
>   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
>   at java.lang.reflect.Constructor.newInstance(Unknown Source)
>   at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)
>   at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
>   at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
>   at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
>   at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
>   at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
>   at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
>   at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
>   at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
>   at org.apache.spark.sql.SparkSession.internalCreateDataFrame(SparkSession.scala:577)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:752)
>   at org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:737)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>   at java.lang.reflect.Method.invoke(Unknown Source)
>   at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
>   at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
>   at py4j.Gateway.invoke(Gateway.java:282)
>   at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
>   at py4j.commands.CallCommand.execute(CallCommand.java:79)
>   at py4j.GatewayConnection.run(GatewayConnection.java:238)
>   at java.lang.Thread.run(Unknown Source)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:699)
>   at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)
>   at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
>   at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
>   at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
>   ... 45 more
>
> During handling of the above exception, another exception occurred:
>
> AnalysisException                         Traceback (most recent call last)
> <ipython-input-12-ed077f843ecf> in <module>()
> ----> 1 df=rdd.toDF()
>
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\pyspark\sql\session.py in toDF(self, schema, sampleRatio)
>      56         [Row(name=u'Alice', age=1)]
>      57         """
> ---> 58         return sparkSession.createDataFrame(self, schema, sampleRatio)
>      59 
>      60     RDD.toDF = toDF
>
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\pyspark\sql\session.py in createDataFrame(self, data, schema, samplingRatio, verifySchema)
>     691             rdd, schema = self._createFromLocal(map(prepare, data), schema)
>     692         jrdd = self._jvm.SerDeUtil.toJavaArray(rdd._to_java_object_rdd())
> --> 693         jdf = self._jsparkSession.applySchemaToPythonRDD(jrdd.rdd(), schema.json())
>     694         df = DataFrame(jdf, self._wrapped)
>     695         df._schema = schema
>
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\lib\py4j-0.10.7-src.zip\py4j\java_gateway.py in __call__(self, *args)
>    1255         answer = self.gateway_client.send_command(command)
>    1256         return_value = get_return_value(
> -> 1257             answer, self.gateway_client, self.target_id, self.name)
>    1258 
>    1259         for temp_arg in temp_args:
>
> C:\Tools\spark-2.3.1-bin-hadoop2.7\python\pyspark\sql\utils.py in deco(*a, **kw)
>      67                                              e.java_exception.getStackTrace()))
>      68             if s.startswith('org.apache.spark.sql.AnalysisException: '):
> ---> 69                 raise AnalysisException(s.split(': ', 1)[1], stackTrace)
>      70             if s.startswith('org.apache.spark.sql.catalyst.analysis'):
>      71                 raise AnalysisException(s.split(': ', 1)[1], stackTrace)
>
> AnalysisException: 'java.lang.RuntimeException: java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: (null) entry in command string: null ls -F C:\\tmp\\hive\r\n
> \tat org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:770)\r\n
> \tat org.apache.hadoop.util.Shell.execCommand(Shell.java:866)\r\n
> \tat org.apache.hadoop.util.Shell.execCommand(Shell.java:849)\r\n
> \tat org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)\r\n
> \tat org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:659)\r\n
> \tat org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:634)\r\n
> \tat org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)\r\n
> \tat org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)\r\n
> \tat org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)\r\n
> \tat org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:180)\r\n
> \tat org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:114)\r\n
> \tat sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)\r\n
> \tat sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)\r\n
> \tat sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)\r\n
> \tat java.lang.reflect.Constructor.newInstance(Unknown Source)\r\n
> \tat org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264)\r\n
> \tat org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:385)\r\n
> \tat org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:287)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:195)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:195)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)\r\n
> \tat org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:194)\r\n
> \tat org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)\r\n
> \tat org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)\r\n
> \tat org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39)\r\n
> \tat org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54)\r\n
> \tat org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52)\r\n
> \tat org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:69)\r\n
> \tat org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:69)\r\n
> \tat org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)\r\n
> \tat org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)\r\n
> \tat org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)\r\n
> \tat org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)\r\n
> \tat org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)\r\n
> \tat org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)\r\n
> \tat org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)\r\n
> \tat org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)\r\n
> \tat org.apache.spark.sql.SparkSession.internalCreateDataFrame(SparkSession.scala:577)\r\n
> \tat org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:752)\r\n
> \tat org.apache.spark.sql.SparkSession.applySchemaToPythonRDD(SparkSession.scala:737)\r\n
> \tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\r\n
> \tat sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)\r\n
> \tat sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)\r\n
> \tat java.lang.reflect.Method.invoke(Unknown Source)\r\n
> \tat py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)\r\n
> \tat py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)\r\n
> \tat py4j.Gateway.invoke(Gateway.java:282)\r\n
> \tat py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)\r\n
> \tat py4j.commands.CallCommand.execute(CallCommand.java:79)\r\n
> \tat py4j.GatewayConnection.run(GatewayConnection.java:238)\r\n
> \tat java.lang.Thread.run(Unknown Source)\r\n
> ;'
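> For reference, here is a minimal, self-contained sketch of the repro. The 
> SparkSession bootstrap, the app name, and the explicit schema below are 
> illustrative additions, not taken from the original notebook:
>
> from pyspark.sql import SparkSession
> from pyspark.sql.types import StructType, StructField, IntegerType, StringType
>
> # Bootstrap comparable to the PySparkShell/Jupyter setup (local[*]).
> spark = SparkSession.builder \
>     .master("local[*]") \
>     .appName("SPARK-24612-repro") \
>     .getOrCreate()
> sc = spark.sparkContext
>
> rdd = sc.parallelize([[1, "Alice", 50], [2, "Amanda", "35"]])
>
> # rdd.toDF() infers the schema by sampling rows; note that the third
> # field mixes int (50) and str ('35'), which inference cannot merge,
> # so an explicit schema is safer quite apart from the Hive error:
> schema = StructType([
>     StructField("id", IntegerType(), True),
>     StructField("name", StringType(), True),
>     StructField("age", StringType(), True),  # kept as string to match '35'
> ])
> df = spark.createDataFrame(rdd.map(lambda r: [r[0], r[1], str(r[2])]), schema)
> df.show()
>
> Separately, the "(null) entry in command string: null ls -F C:\tmp\hive" 
> line typically means HADOOP_HOME/winutils.exe is not configured on this 
> Windows machine, so Hadoop falls back to a null shell command when checking 
> permissions on the Hive scratch directory. The commonly suggested 
> workaround (an inference from the error text, not verified here) is to 
> install winutils.exe, set HADOOP_HOME, and run: 
> winutils.exe chmod -R 777 C:\tmp\hive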



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
