Scala vs PySpark Inconsistency: SQLContext/SparkSession access from DataFrame/DataSet

Ben Roling
I've noticed that Dataset.sqlContext is public in Scala, but the equivalent in PySpark (DataFrame._sc) is named as if it should be treated as private.

Is this intentional?  If so, what's the rationale?  If not, then it feels like a bug and DataFrame should have some form of public access back to the context/session.  I'm happy to log the bug but thought I would ask here first.  Thanks!
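To illustrate what I mean, here is a minimal sketch (assuming Spark 2.x; spark.range is only used to get a DataFrame, and df.sql_ctx is the other internal attribute I'm assuming exists alongside _sc):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.master("local[*]").appName("ctx-access").getOrCreate()
    df = spark.range(10)

    # In Scala, Dataset.sqlContext (and Dataset.sparkSession) are public, so any
    # Dataset/DataFrame offers a public path back to the context/session.

    # In PySpark, the corresponding attributes look private or undocumented:
    sc = df._sc        # SparkContext, underscore-prefixed
    ctx = df.sql_ctx   # SQLContext; no underscore, but not part of the documented API

    print(type(sc), type(ctx))
    spark.stop()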