*Component*: Spark
*Level*: Advanced
*Scenario*: How-to

-------------------------
*Problem Description*
I have a nested JSON string value in one field of a Spark DataFrame, and I
would like to use from_json() to parse the JSON object. In particular, if
the type of one key's value does not match our predefined struct type,
from_json() returns null. Given that, can we find out which key has the
wrong type? A related example follows:

*source dataframe:*
| original_json_string |
| -------------------------- |
| "{a:{b:"dn", c:"test"}}" |

P.S. We expect the value type of b to be double, so we predefined a
struct type for from_json() to use, but it just returns null:

*result dataframe after from_json:*
| original_json_string |
| -------------------------- |
| null |

In this sample, the value of a has two keys, b and c. Could we learn that
the value type of b is wrong, rather than c? That would let me check the
data quickly instead of just getting null back.
If we want to achieve this, how should we implement it?
If you have any ideas, I would appreciate it, thank you.
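One direction I have considered is to pre-validate the parsed JSON against the expected types myself, outside of from_json(). The sketch below is plain Python using only the standard json module; the function name `find_type_errors` and the `EXPECTED` schema dict are illustrative, not Spark APIs, and the sample row is rewritten as valid JSON:

```python
import json

# Expected Python type per key path; illustrative names only.
# We expect a.b to be a double (float) and a.c to be a string.
EXPECTED = {
    ("a", "b"): float,
    ("a", "c"): str,
}

def find_type_errors(json_string, expected=EXPECTED):
    """Return a list of (key_path, actual_type_name) for every key
    whose value type does not match the expected schema."""
    obj = json.loads(json_string)
    errors = []

    def walk(node, path):
        if isinstance(node, dict):
            for key, value in node.items():
                walk(value, path + (key,))
        else:
            want = expected.get(path)
            if want is not None and not isinstance(node, want):
                errors.append(("/".join(path), type(node).__name__))

    walk(obj, ())
    return errors

# b is a string but we expected a double, so only a/b is reported:
print(find_type_errors('{"a": {"b": "dn", "c": "test"}}'))
# → [('a/b', 'str')]
```

In Spark, a checker like this could perhaps be registered as a UDF and applied only to the rows where from_json() returned null, so that each failing row reports which key path mismatched instead of just a null.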

-- 
Best Regards,

Mars Su
*Phone*: 0988-661-013
*Email*: hueiyua...@gmail.com
