Hi,

I've added table-level security using Spark extensions, based on the ongoing
work proposed for Ranger in RANGER-2128. Following the same logic, you
could mask columns by working on the logical plan, but not filter or
skip rows, as those operations are not exposed in these hooks.
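To give an idea of the shape of such a rule, here is a minimal sketch of a
resolution rule injected through SparkSessionExtensions, written against the
Spark 2.x API. The names TableAccessRule, SecurityExtensions and isAllowed()
are placeholders of mine, not part of RANGER-2128; the policy check would be
whatever store you actually use.

    // Sketch only: denies access to a hard-coded table as a stand-in
    // for a real policy lookup.
    import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
    import org.apache.spark.sql.catalyst.analysis.UnresolvedRelation
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
    import org.apache.spark.sql.catalyst.rules.Rule

    case class TableAccessRule(spark: SparkSession) extends Rule[LogicalPlan] {
      // Hypothetical policy check; replace with a call to Ranger or your own store.
      private def isAllowed(db: String, table: String): Boolean =
        !(db == "secure" && table == "salaries")

      override def apply(plan: LogicalPlan): LogicalPlan = plan transformUp {
        case r @ UnresolvedRelation(ident) =>
          val db = ident.database.getOrElse(spark.catalog.currentDatabase)
          if (!isAllowed(db, ident.table)) {
            throw new SecurityException(s"Access denied to $db.${ident.table}")
          }
          r
      }
    }

    // Entry point loadable via the spark.sql.extensions configuration.
    class SecurityExtensions extends (SparkSessionExtensions => Unit) {
      override def apply(ext: SparkSessionExtensions): Unit =
        ext.injectResolutionRule(TableAccessRule)
    }

Column masking would follow the same pattern, rewriting the matched relation's
output (e.g. wrapping sensitive attributes in masking expressions) instead of
throwing.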

The only difficulty I found was integrating the extensions with PySpark, since
in Python the SparkContext is always created through the constructor rather
than the Scala getOrCreate() method (I've sent a separate email regarding
this). Other than that, it works.
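For reference, these are the two usual ways to wire an extension in on the JVM
side (assuming a class like the hypothetical SecurityExtensions / TableAccessRule
above); the Python session path is the part that doesn't pick this up, as
described:

    import org.apache.spark.sql.SparkSession

    // Option 1: register programmatically when building the session.
    val spark = SparkSession.builder()
      .appName("secured-app")
      .withExtensions(ext => ext.injectResolutionRule(TableAccessRule))
      .getOrCreate()

    // Option 2: register via configuration, e.g. with spark-submit:
    //   --conf spark.sql.extensions=com.example.SecurityExtensions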


On Fri, Aug 17, 2018, 03:56 Richard Siebeling <rsiebel...@gmail.com> wrote:

> Hi,
>
> I'd like to implement some kind of row-level security and am thinking of
> adding additional filters to the logical plan possibly using the Spark
> extensions.
> Would this be feasible, for example using the injectResolutionRule?
>
> thanks in advance,
> Richard
>
