You could define access in Sentry and enable its permissions sync with HDFS, so you can grant access in Hive on a per-database or per-table basis. This should work for Spark too, as Sentry will propagate the grants to HDFS ACLs.
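For example, a per-database grant and the HDFS-side ACL it produces could look like this. This is only a sketch: the role name, group, database, and warehouse path are made-up placeholders, and it assumes Sentry's HDFS sync is already enabled.

```shell
# Define a Sentry role and grant it read access to one database
# (run as a user with Sentry admin privileges).
beeline -u "jdbc:hive2://hiveserver2:10000" -e "
  CREATE ROLE analysts;
  GRANT ROLE analysts TO GROUP analysts;
  GRANT SELECT ON DATABASE sales TO ROLE analysts;
"

# With Sentry/HDFS sync enabled, the grant is reflected as an HDFS ACL
# on the database's warehouse directory, so Spark jobs that read the
# files directly are subject to the same restriction:
hdfs dfs -getfacl /user/hive/warehouse/sales.db
```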
http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/sg_hdfs_sentry_sync.html

--
Ruslan Dautkhanov

On Thu, Sep 3, 2015 at 1:46 PM, Daniel Schulz <danielschulz2...@hotmail.com> wrote:

> Hi Matei,
>
> Thanks for your answer.
>
> My question is regarding simply authenticated Spark-on-YARN only, without
> Kerberos. So when I run Spark on YARN and HDFS, will Spark pass through my
> HDFS user and only be able to access files I am entitled to read/write?
> Will it enforce HDFS ACLs and Ranger policies as well?
>
> Best regards, Daniel.
>
>> On 03 Sep 2015, at 21:16, Matei Zaharia <matei.zaha...@gmail.com> wrote:
>>
>> If you run on YARN, you can use Kerberos and be authenticated as the right
>> user, etc., in the same way as MapReduce jobs.
>>
>> Matei
>>
>>> On Sep 3, 2015, at 1:37 PM, Daniel Schulz <danielschulz2...@hotmail.com> wrote:
>>>
>>> Hi,
>>>
>>> I really enjoy using Spark. An obstacle to selling it to our clients
>>> currently is the missing Kerberos-like security on a Hadoop cluster with
>>> simple authentication. Are there plans, a proposal, or a project to
>>> deliver a Ranger plugin or something similar for Spark? The goal is to
>>> differentiate users and their privileges when reading and writing data
>>> to HDFS. Is Kerberos my only option then?
>>>
>>> Kind regards, Daniel.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
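To illustrate Matei's point about Kerberos on YARN: a long-running Spark job can authenticate with a keytab, and all HDFS access inside the job is then performed as that principal. The principal, keytab path, and application file below are placeholders, not values from the thread.

```shell
# Obtain a Kerberos ticket for the submitting user (hypothetical principal).
kinit -kt /etc/security/keytabs/daniel.keytab daniel@EXAMPLE.COM

# Submit to YARN; --principal/--keytab let Spark renew delegation tokens
# for long-running jobs. HDFS reads and writes in the job run as "daniel",
# so HDFS permissions and ACLs are enforced for that user.
spark-submit \
  --master yarn \
  --principal daniel@EXAMPLE.COM \
  --keytab /etc/security/keytabs/daniel.keytab \
  my_job.py
```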