[ https://issues.apache.org/jira/browse/SPARK-37857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17477433#comment-17477433 ]

Franck Thang edited comment on SPARK-37857 at 1/17/22, 10:47 PM:
-----------------------------------------------------------------

Since [~venk] is not answering, I can start taking a look if you want, [~hyukjin.kwon]
 

Post-update:

It seems the original implementation of Spark's get_json_object replicates Hive's 
behaviour (https://github.com/apache/spark/pull/7901)

The Hive documentation 
([https://cwiki.apache.org/confluence/display/hive/languagemanual+udf]) states 
that recursive descent (i.e. the ".." notation) is not supported

That's probably why it's not working.

What do you think?
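
For reference, a minimal sketch in spark-shell contrasting the two path styles (the explicit $.k.value query is an assumed workaround on my part, not something verified in this ticket):

{code:java}
// Recursive descent ($..) is not in the supported path syntax, so this returns null
spark.sql("""select get_json_object('{"k":{"value":"abc"}}', '$..value') as j""").show()

// Spelling out the full path should return abc
spark.sql("""select get_json_object('{"k":{"value":"abc"}}', '$.k.value') as j""").show()
{code}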



> Any depth search not working in get_json_object ($..foo)
> --------------------------------------------------------
>
>                 Key: SPARK-37857
>                 URL: https://issues.apache.org/jira/browse/SPARK-37857
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Venk
>            Priority: Minor
>
> The following example should return the value _abc_ but instead returns null:
> {code:java}
> spark.sql("""select get_json_object('{"k":{"value":"abc"}}', '$..value') as j""").show()
> {code}
> I checked the [Spark test suite for the JSON expression parser|https://github.com/apache/spark/blob/f12793de205a46c75529ac20649e2e61edc0b144/sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/JsonExpressionsSuite.scala] 
> and the example there seems to be a negative test case (non-existent key), 
> so I'm not sure if this is expected.


