Null array of cols

Mohit Anchlia
I am trying to understand the best way to handle the scenario where a null array "[]" is passed. Can somebody suggest a way to filter out such records? I've tried numerous things, including dataframe.head().isEmpty, but pyspark doesn't recognize isEmpty even though I see it in the API docs.

pyspark.sql.utils.AnalysisException: u"cannot resolve '`timestamp`' given input columns: []; line 1 pos 0;\n'Filter isnotnull('timestamp)\n+- LogicalRDD\n"
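
For reference, a minimal sketch of two guards that apply here (the input path, column name, and messages are assumptions, not taken from the original job). An input of "[]" comes back as a DataFrame with no columns at all, which is why the filter on timestamp cannot be resolved; checking df.columns first avoids that. In the Python API, isEmpty lives on the underlying RDD (DataFrame.isEmpty() only appeared in later Spark releases), and len(df.head(1)) == 0 is another common zero-row check.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("empty-input-guard").getOrCreate()

# Hypothetical source; in this scenario an input of "[]" yields a
# DataFrame with no columns and no rows.
df = spark.read.json("records.json")

# Guard 1: skip the batch when the expected column is missing entirely,
# as it is for a "[]" input.
if "timestamp" in df.columns:
    df = df.filter(df["timestamp"].isNotNull())
else:
    print("skipping batch: no columns in input")

# Guard 2: test for zero rows via the RDD; head(1) returning an empty
# list is an equivalent check.
if df.rdd.isEmpty():          # or: len(df.head(1)) == 0
    print("skipping batch: no rows in input")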