[ https://issues.apache.org/jira/browse/SPARK-21554?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16104634#comment-16104634 ]
Subhod Lagade commented on SPARK-21554:
---------------------------------------

Thanks for the quick reply @Hyukjin Kwon.
- We have Spark 2.1.1 installed on an EMR 5.7 cluster.
- When I try to submit a PySpark job from any of the nodes, we get the error above.

Deploy command:
spark-submit --master yarn --deploy-mode cluster spark_installed_dir\examples\src\main\python\sql\basic.py

> Spark Hive reporting pyspark.sql.utils.AnalysisException: u'Table not found: XXX' when run on yarn cluster
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21554
>                 URL: https://issues.apache.org/jira/browse/SPARK-21554
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.1.1
>        Environment: We are deploying pyspark scripts on EMR 5.7
>            Reporter: Subhod Lagade
>
> Traceback (most recent call last):
>   File "Test.py", line 7, in <module>
>     hc = HiveContext(sc)
>   File "/mnt/yarn/usercache/hadoop/appcache/application_1500357225179_0540/container_1500357225179_0540_02_000001/pyspark.zip/pyspark/sql/context.py", line 514, in __init__
>   File "/mnt/yarn/usercache/hadoop/appcache/application_1500357225179_0540/container_1500357225179_0540_02_000001/pyspark.zip/pyspark/sql/session.py", line 179, in getOrCreate
>   File "/mnt/yarn/usercache/hadoop/appcache/application_1500357225179_0540/container_1500357225179_0540_02_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
>   File "/mnt/yarn/usercache/hadoop/appcache/application_1500357225179_0540/container_1500357225179_0540_02_000001/pyspark.zip/pyspark/sql/utils.py", line 79, in deco
> pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveSessionState':"

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
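A note on the failure mode above: in `--deploy-mode cluster` the driver runs on a YARN node, so instantiating a HiveContext there can fail if the Hive metastore configuration (hive-site.xml) is not visible on that node. A commonly suggested workaround is to ship the file with the job via `--files`; this is only a sketch, and the `/etc/hive/conf/hive-site.xml` path is an assumption (it is the usual EMR location, but verify it on your cluster):

```shell
# Hedged sketch: ships hive-site.xml to the driver/executor containers so the
# Hive metastore config is available in cluster mode. The hive-site.xml path
# and the script path are placeholders for this cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /etc/hive/conf/hive-site.xml \
  spark_installed_dir/examples/src/main/python/sql/basic.py
```

If the metastore config is correct but the error persists, checking the YARN application logs (`yarn logs -applicationId <appId>`) for the nested cause of the `HiveSessionState` instantiation failure usually narrows it down.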