[ https://issues.apache.org/jira/browse/SPARK-10499?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15083871#comment-15083871 ]

Josh Rosen commented on SPARK-10499:
------------------------------------

This will no longer be necessary once we remove the need to set the {{-Phive}} 
build flag (SPARK-8108), so I'm going to mark this as "Won't Fix." Feel free to 
submit a PR targeted at 1.6 if you really want this, but it seems like a low 
priority at this point.

> Improve error message when constructing a hive context in PySpark with 
> non-hive assembly
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-10499
>                 URL: https://issues.apache.org/jira/browse/SPARK-10499
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, SQL
>            Reporter: holdenk
>            Priority: Minor
>
> A few times when I've been running local tests I've forgotten to build with 
> the Hive assembly and the error message isn't super clear (just a generic 
> py4j error message). Let's wrap that error message so it's clearer about the 
> probable root cause.
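
For reference, a minimal sketch of the kind of wrapping the description asks for, assuming the Py4J failure surfaces when the JVM-side HiveContext is instantiated. The helper name {{create_hive_context}} and the exact message are illustrative only, not the actual Spark patch:

{code}
from py4j.protocol import Py4JError
from pyspark.sql import HiveContext


def create_hive_context(sc):
    # Try to build a HiveContext; if the assembly was built without Hive
    # support, Py4J raises a fairly opaque error, which we re-raise with a
    # hint about the likely root cause.
    try:
        return HiveContext(sc)
    except Py4JError:
        raise Exception(
            "Unable to instantiate HiveContext. This usually means the Spark "
            "assembly was built without Hive support; rebuild with the "
            "-Phive (and -Phive-thriftserver) profiles.")
{code}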



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
