Use Spark extension points to implement row-level security


Use Spark extension points to implement row-level security

Richard Siebeling
Hi,

I'd like to implement some kind of row-level security and am thinking of adding additional filters to the logical plan, possibly using the Spark extension points.
Would this be feasible, for example with injectResolutionRule?

thanks in advance,
Richard 
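[To make the idea concrete, here is a rough sketch of what an injectResolutionRule-based row filter could look like. Everything below is illustrative, not an established API: the `tenant_id` column, the `RowLevelFilterRule` / `RowSecurityExtensions` names, and the environment-variable lookup are assumptions, and the duplicate-filter guard is deliberately naive — a production rule would need a more robust marker so the analyzer does not re-wrap the relation on every pass.]

```scala
import org.apache.spark.sql.SparkSessionExtensions
import org.apache.spark.sql.catalyst.expressions.{AttributeReference, EqualTo, Literal}
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan}
import org.apache.spark.sql.catalyst.rules.Rule
import org.apache.spark.sql.execution.datasources.LogicalRelation

// Hypothetical rule: wrap any relation that exposes a "tenant_id" column
// in a Filter restricting it to the current tenant.
case class RowLevelFilterRule(currentTenant: String) extends Rule[LogicalPlan] {

  // Naive guard so the analyzer reaches a fixpoint instead of wrapping
  // the same relation again on every analysis pass.
  private def alreadySecured(plan: LogicalPlan): Boolean =
    plan.collectFirst {
      case Filter(EqualTo(a: AttributeReference, _), _) if a.name == "tenant_id" => ()
    }.isDefined

  override def apply(plan: LogicalPlan): LogicalPlan =
    if (alreadySecured(plan)) plan
    else plan resolveOperators {
      case rel: LogicalRelation =>
        rel.output.find(_.name == "tenant_id") match {
          case Some(col) => Filter(EqualTo(col, Literal(currentTenant)), rel)
          case None      => rel
        }
    }
}

// Registered through the extension point; activated for a session via
// spark.sql.extensions=com.example.RowSecurityExtensions (or withExtensions).
class RowSecurityExtensions extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit =
    extensions.injectResolutionRule { _ =>
      RowLevelFilterRule(sys.env.getOrElse("TENANT_ID", "default"))
    }
}
```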

Re: Use Spark extension points to implement row-level security

Maximiliano Patricio Méndez
Hi,

I've added table-level security using Spark extensions, based on the ongoing work proposed for Ranger in RANGER-2128. Following the same logic, you could mask columns and work on the logical plan, but not filter or skip rows, as those operations are not present in these hooks.

The only difficulty I found was integrating the extensions with PySpark, since in Python the SparkContext is always created through the constructor rather than the Scala getOrCreate() method (I've sent a separate email regarding this). But other than that, it works.
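[For what it's worth, one workaround on the PySpark side, depending on the Spark version, is to register the extensions through the spark.sql.extensions static configuration before the JVM-side session is created, so Python never touches the Scala builder directly. This is a configuration sketch; the jar path and extension class name are placeholders for whatever implements the Scala-side hook.]

```python
from pyspark.sql import SparkSession

# The jar and the extension class name below are placeholders for whatever
# implements the (SparkSessionExtensions => Unit) function on the Scala side.
spark = (
    SparkSession.builder
    .appName("row-level-security-demo")
    .config("spark.jars", "/path/to/row-security-extensions.jar")
    .config("spark.sql.extensions", "com.example.RowSecurityExtensions")
    .getOrCreate()
)
```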


On Fri, Aug 17, 2018, 03:56 Richard Siebeling <[hidden email]> wrote:

Re: Use Spark extension points to implement row-level security

Richard Siebeling
Thanks, this looks promising. I am trying to do it without a dependency on Hive and was hoping that the extension hooks could be used to add a filter transformation to the logical plan. I've seen another email saying that in the optimisation hook the logical plan is expected to stay the same.

But I'm still hoping that some other extension hook can be used to add the filter operation. Does anyone know whether that is possible?

There is not much documentation on the extension hooks; I could not work it out from the existing documentation.

Regards,
Richard
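[For a quick local experiment, the same hook can also be wired directly on the session builder via withExtensions, with no Hive dependency at all. A minimal sketch — the rule body here is a deliberate no-op placeholder marking where the filter transformation would go:]

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.catalyst.rules.Rule

// withExtensions receives the same SparkSessionExtensions object that the
// spark.sql.extensions configuration would populate, so it is convenient
// for trying out a rule locally before packaging it into a jar.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("extension-hook-demo")
  .withExtensions { extensions =>
    extensions.injectResolutionRule { _ =>
      new Rule[LogicalPlan] {
        // Placeholder: a real rule would return the plan with an added
        // Filter node carrying the row-level-security predicate.
        override def apply(plan: LogicalPlan): LogicalPlan = plan
      }
    }
  }
  .getOrCreate()
```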


On Fri, 17 Aug 2018 at 15:33, Maximiliano Patricio Méndez <[hidden email]> wrote: