PySpark and Snowflake Column Mapping


Hi Team,

While working with JSON data, we flattened the unstructured data into a
structured form. As a result, the Databricks Delta table has columns with
Spark data types such as ARRAY<STRUCT<key:value, ...>> and ARRAY<STRING>.

While loading the data from Databricks to Snowflake via the Spark
connector, we noticed that the ARRAY<STRUCT<...>> and ARRAY<STRING> columns
are mapped to the VARIANT type in Snowflake. We expected them to remain
ARRAY columns in Snowflake.

How do we handle this case while loading into Snowflake?

Please share your ideas.
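One possible workaround (a sketch only, not confirmed against your setup): pre-create the Snowflake table with explicit ARRAY columns, so the connector appends into an existing table instead of auto-creating one with VARIANT columns. The helper below just renders such DDL; the table name, column names, and the type mapping are hypothetical examples.

```python
# Sketch: build a CREATE TABLE statement with explicit Snowflake ARRAY
# columns, so the target table exists before the connector writes and the
# connector's default VARIANT mapping is not used for table creation.
# Table/column names and the type mapping are hypothetical examples.

# Minimal mapping from flattened Spark types to Snowflake types
# (assumption: only the types in this example are covered).
SPARK_TO_SNOWFLAKE = {
    "bigint": "NUMBER",
    "string": "VARCHAR",
    "array<string>": "ARRAY",
    "array<struct>": "ARRAY",
}

def build_create_table(table, columns):
    """Render DDL that keeps array columns as Snowflake ARRAY."""
    cols = ", ".join(
        f"{name} {SPARK_TO_SNOWFLAKE[dtype]}"
        for name, dtype in columns.items()
    )
    return f"CREATE TABLE IF NOT EXISTS {table} ({cols})"

ddl = build_create_table(
    "EVENTS",
    {"id": "bigint", "tags": "array<string>", "items": "array<struct>"},
)
print(ddl)
# CREATE TABLE IF NOT EXISTS EVENTS (id NUMBER, tags ARRAY, items ARRAY)
```

The generated DDL could then be passed through the connector's `preactions` option together with `.mode("append")` on the DataFrame write, so the connector does not recreate the table itself. Whether the written values then land as proper ARRAY elements should still be verified in Snowflake; if not, a post-load cast or view over the VARIANT column may be needed.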

