Re: Access_Remote_Kerberized_Cluster_Through_Spark

2016-11-10 Thread KhajaAsmath Mohammed
Hi Ajay, I was able to resolve it by adding the YARN user principal. Here is the complete code: def main(args: Array[String]) { // create Spark context with Spark configuration val cmdLine = Parse.commandLine(args) val configFile = cmdLine.getOptionValue("c") val propertyConfiguration …
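
A minimal end-to-end sketch of the fix described above, assuming local-mode Spark; the principal names, keytab path, and HDFS URL are placeholders, not values from the thread:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.security.UserGroupInformation
import org.apache.spark.{SparkConf, SparkContext}

object SecureClusterRead {
  def main(args: Array[String]): Unit = {
    val hadoopConf = new Configuration()
    hadoopConf.set("hadoop.security.authentication", "kerberos")
    // The fix from the thread: name the YARN ResourceManager's principal
    // so the client has a valid delegation-token renewer.
    hadoopConf.set("yarn.resourcemanager.principal", "yarn/_HOST@EXAMPLE.COM")

    // Log in before touching HDFS; principal and keytab are placeholders.
    UserGroupInformation.setConfiguration(hadoopConf)
    UserGroupInformation.loginUserFromKeytab("user@EXAMPLE.COM", "/path/to/user.keytab")

    val sc = new SparkContext(
      new SparkConf().setAppName("SecureClusterRead").setMaster("local[*]"))
    // Mirror the security settings on the configuration Spark actually uses
    // for HDFS access (or put the cluster's core-site.xml on the classpath).
    sc.hadoopConfiguration.set("hadoop.security.authentication", "kerberos")
    sc.hadoopConfiguration.set("yarn.resourcemanager.principal", "yarn/_HOST@EXAMPLE.COM")

    // Hypothetical sample file on the remote cluster.
    val data = sc.textFile("hdfs://namenode.example.com:8020/tmp/sample.txt")
    println(s"Line count: ${data.count()}")
  }
}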

Re: Access_Remote_Kerberized_Cluster_Through_Spark

2016-11-09 Thread Ajay Chander
Hi Everyone, I am still trying to figure this one out. I am stuck with this error: "java.io.IOException: Can't get Master Kerberos principal for use as renewer". Below is my code. Can any of you please provide any insights on this? Thanks for your time. import java.io.{BufferedInputStream, …
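
That exception usually means the client Configuration contains no yarn.resourcemanager.principal, most often because the cluster's yarn-site.xml is not on the classpath, so Hadoop cannot name a token renewer. A quick diagnostic sketch, with hypothetical config paths:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path

val conf = new Configuration()
// Point at the cluster's client configs (paths are placeholders).
conf.addResource(new Path("/path/to/cluster-conf/core-site.xml"))
conf.addResource(new Path("/path/to/cluster-conf/hdfs-site.xml"))
conf.addResource(new Path("/path/to/cluster-conf/yarn-site.xml"))

// If this prints null, token acquisition fails with the
// "Can't get Master Kerberos principal for use as renewer" error.
println(conf.get("yarn.resourcemanager.principal"))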

Re: Access_Remote_Kerberized_Cluster_Through_Spark

2016-11-07 Thread Ajay Chander
Did anyone use https://www.codatlas.com/github.com/apache/spark/HEAD/core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala to interact with secured Hadoop from Spark? Thanks, Ajay
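
For context, SparkHadoopUtil largely delegates to Hadoop's UserGroupInformation (UGI), so the same login-and-act pattern can be used directly; a sketch with placeholder principal, keytab, and paths:

import java.security.PrivilegedExceptionAction
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation

val conf = new Configuration()
conf.set("hadoop.security.authentication", "kerberos")
UserGroupInformation.setConfiguration(conf)

// Principal and keytab are placeholders for illustration.
val ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
  "user@EXAMPLE.COM", "/path/to/user.keytab")

// Run filesystem calls as the authenticated user.
ugi.doAs(new PrivilegedExceptionAction[Unit] {
  override def run(): Unit = {
    val fs = FileSystem.get(conf)
    fs.listStatus(new Path("/tmp")).foreach(s => println(s.getPath))
  }
})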

Access_Remote_Kerberized_Cluster_Through_Spark

2016-11-07 Thread Ajay Chander
Hi Everyone, I am trying to develop a simple codebase on my machine to read data from a secured Hadoop cluster. We have a development cluster which is secured through Kerberos, and I want to run a Spark job from my IntelliJ to read some sample data from the cluster. Has anyone done this before? Can …
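
One common way to do this from an IDE is to point the JVM at the cluster's krb5.conf and rely on a ticket obtained beforehand with kinit; a minimal sketch, with every hostname and path a placeholder (the cluster's core-site.xml and hdfs-site.xml should also be on the classpath):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.hadoop.security.UserGroupInformation

object IdeKerberosCheck {
  def main(args: Array[String]): Unit = {
    // Tell the JVM where the cluster's Kerberos realm config lives.
    System.setProperty("java.security.krb5.conf", "/path/to/krb5.conf")

    val conf = new Configuration()
    conf.set("hadoop.security.authentication", "kerberos")
    conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020")
    UserGroupInformation.setConfiguration(conf)

    // Assumes a ticket obtained earlier with: kinit user@EXAMPLE.COM
    println(s"Logged in as: ${UserGroupInformation.getLoginUser}")

    // Smoke test: list a directory on the secured cluster.
    val fs = FileSystem.get(conf)
    fs.listStatus(new Path("/tmp")).foreach(s => println(s.getPath))
  }
}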