Hi,

Is it possible to query a data structure that is a dictionary within a
dictionary?

I have a parquet file with the following structure:
test
|____key1: {key_string: val_int}
|____key2: {key_string: val_int}

If I try:

  parquetFile.test
  --> Column<test>

  parquetFile.test.key2
  --> AttributeError: 'Column' object has no attribute 'key2'

Similarly, if I try to do a SQL query, it throws this error:

org.apache.spark.sql.AnalysisException: GetField is not valid on fields of
type MapType(StringType,MapType(StringType,IntegerType,true),true);
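For reference, here is a plain-Python sketch (not Spark code) of the
structure and the lookup I'm after; the keys and values are made up:

```python
# Plain-Python equivalent of one row's "test" column: an outer map of
# string keys to inner maps of string -> int, matching the schema
# MapType(StringType, MapType(StringType, IntegerType)) above.
test = {
    "key1": {"some_string": 1},
    "key2": {"another_string": 2},
}

# The lookup I want to express through the DataFrame/SQL API:
inner = test["key2"]             # the inner map: {"another_string": 2}
value = inner["another_string"]  # the int value: 2
```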

Is this at all possible with the Python API in Spark SQL?

Thanks,
Maria



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-nested-dictionaries-tp23207.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
