SparkSQL nested dictionaries


mrm
Hi,

Is it possible to query a data structure that is a dictionary within a dictionary?

I have a parquet file with the following structure:
test
|____key1: {key_string: val_int}
|____key2: {key_string: val_int}
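
For reference, a DataFrame with the same layout can be built roughly like this (the schema and sample values here are just placeholders; in my case the DataFrame actually comes from sqlContext.parquetFile):

 from pyspark import SparkContext
 from pyspark.sql import SQLContext
 from pyspark.sql.types import StructType, StructField, MapType, StringType, IntegerType

 sc = SparkContext()
 sqlContext = SQLContext(sc)

 # "test" is a map of string -> (map of string -> int)
 schema = StructType([
     StructField("test", MapType(StringType(),
                                 MapType(StringType(), IntegerType())))
 ])
 parquetFile = sqlContext.createDataFrame(
     [({"key1": {"key_string": 1}, "key2": {"key_string": 2}},)], schema)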

If I try to do:
 parquetFile.test
 --> Column<test>

 parquetFile.test.key2
 --> AttributeError: 'Column' object has no attribute 'key2'

Similarly, if I try to do a SQL query, it throws this error:

org.apache.spark.sql.AnalysisException: GetField is not valid on fields of type MapType(StringType,MapType(StringType,IntegerType,true),true);
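
The query was roughly of this form (the table name is just whatever I registered; it mirrors the dotted access above):

 parquetFile.registerTempTable("parquet_table")
 sqlContext.sql("SELECT test.key2 FROM parquet_table")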

Is this at all possible with the Python API in Spark SQL?

Thanks,
Maria