[ https://issues.apache.org/jira/browse/SPARK-5792?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14319645#comment-14319645 ]

pengxu commented on SPARK-5792:
-------------------------------

I've already figured out the reason: it was caused by a jackson-core-asl 
version incompatibility between Spark and Hadoop. 

You can find more information at 
https://issues.apache.org/jira/browse/SPARK-3955
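
As a quick sanity check (my own sketch, not part of the original report), you 
can ask the JVM which jar the jackson-core-asl classes were actually loaded 
from; if it isn't the version Spark was built against, the conflict above is 
in play:

spark-shell>// print the jar that provided jackson-core-asl's JsonFactory
spark-shell>classOf[org.codehaus.jackson.JsonFactory].getProtectionDomain.getCodeSource.getLocation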

It can be reproduced by following the steps described below.
===========================================================================================
spark-shell>import org.apache.spark.sql.hive._
spark-shell>val sqlContext = new HiveContext(sc)
spark-shell>import sqlContext._
spark-shell>case class DemoTbl(f1:String, f2:String)
spark-shell>val rdd1 = sc.makeRDD(List(DemoTbl("hello","{\"hello\":\"world\"}")),1)
spark-shell>rdd1.registerTempTable("demo_tbl")
spark-shell>sql("select json_tuple(f2,'hello') from demo_tbl")
==========================================================================================
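
Note that the last statement above only defines the query; to actually see the 
(incorrectly) null output, collect the result (the collect() call is my 
addition, not part of the original steps):

spark-shell>sql("select json_tuple(f2,'hello') from demo_tbl").collect().foreach(println)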

> hive udfs like "get_json_object and json_tuple" does not work in spark 1.2.0
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-5792
>                 URL: https://issues.apache.org/jira/browse/SPARK-5792
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0
>            Reporter: pengxu
>
> I'm using Spark 1.2.0 now. After several tests, I found that Hive UDFs 
> like get_json_object and json_tuple do not take effect.
> The testing environment is as follows:
> beeline ==> thriftServer ==> Spark Cluster
> For example, the output of the following query is null instead of the expected value:
> 'select get_json_object('{"hello":"world"}','hello') from demo_tbl'
> I issued the same query in Hive as well, and the return value was "world", which was 
> what I expected.


