[ https://issues.apache.org/jira/browse/SPARK-9887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14694092#comment-14694092 ]

Bolke de Bruin commented on SPARK-9887:
---------------------------------------

Yes, actually. I will test the latest build and will close the issue if it 
resolves the problem.

> After recent hive patches PySpark fails with IllegalArgumentException: Wrong FS: hdfs:
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-9887
>                 URL: https://issues.apache.org/jira/browse/SPARK-9887
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 1.5.0
>         Environment: Yarn Kerberos hive 1.2.1
>            Reporter: Bolke de Bruin
>
> After the recent hive patches (Spark built 2 days ago) PySpark fails with the 
> error below. We confirmed it works with a Spark build from before these 
> patches under identical configurations. 
> hc.sql("CREATE TABLE %s AS SELECT * FROM name_pair_distances" % table_name)
>   File "/usr/hdp/current/spark/python/lib/pyspark.zip/pyspark/sql/context.py", line 552, in sql
>   File "/usr/hdp/current/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 538, in __call__
>   File "/usr/hdp/current/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 36, in deco
>   File "/usr/hdp/current/spark/python/lib/py4j-0.8.2.1-src.zip/py4j/protocol.py", line 300, in get_return_value
> py4j.protocol.Py4JJavaError: An error occurred while calling o43.sql.
> : java.lang.IllegalArgumentException: Wrong FS: hdfs://hdpnlcb/apps/hive/warehouse/wcs.db/target_table/.hive-staging_hive_2015-08-12_20-27-48_357_3158824749894776248-1/-ext-10000/part-00000, expected: file:///
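For context: a "Wrong FS" IllegalArgumentException is raised by Hadoop's 
FileSystem path validation when a path's URI scheme (here hdfs://) does not 
match the filesystem the caller resolved (here the local file:/// filesystem), 
which suggests the Hive staging path is being checked against the wrong 
default filesystem. A minimal Python sketch of that scheme check, illustrative 
only and not Hadoop's actual implementation:

```python
from urllib.parse import urlparse

def check_path(fs_uri: str, path: str) -> None:
    """Mimic the spirit of Hadoop's FileSystem.checkPath: reject a path
    whose URI scheme differs from the filesystem's own scheme.
    Illustrative sketch only, not the real Hadoop code."""
    fs_scheme = urlparse(fs_uri).scheme
    path_scheme = urlparse(path).scheme
    # A scheme-less path is treated as relative to the filesystem and passes.
    if path_scheme and path_scheme != fs_scheme:
        raise ValueError("Wrong FS: %s, expected: %s" % (path, fs_uri))

# Checking an hdfs:// path against the local filesystem reproduces the shape
# of the error in the traceback above.
try:
    check_path("file:///", "hdfs://hdpnlcb/apps/hive/warehouse/x")
except ValueError as err:
    print(err)
```

A scheme-less path such as "/apps/hive/warehouse/x" passes the check, which 
is why the mismatch only surfaces when a fully qualified hdfs:// path is 
validated against a filesystem object resolved as file:///.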



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
