Hi,
When I try to run a program against a remote Spark machine, I get the exception below:
15/01/16 11:14:39 ERROR UserGroupInformation: PriviledgedActionException as:user1 (auth:SIMPLE) cause:java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException: Unknown exception in doAs
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
    at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:59)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:115)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:163)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    ... 4 more
Caused by: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
    at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
    at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
    at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
    at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
    at scala.concurrent.Await$.result(package.scala:107)
    at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:127)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:60)
    at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:59)
I am executing this driver class on one machine (abc), while Spark is installed on another machine (xyz:7077). When I execute the same driver class pointing to "local", it works fine:
import java.io.IOException;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;

public class TestSpark implements Serializable {

    public static void main(String[] args) throws IOException {
        TestSpark test = new TestSpark();
        test.testReduce();
    }

    public void testReduce() throws IOException {
        // Remote standalone master -- this is the configuration that fails:
        SparkConf conf = new SparkConf()
                .setMaster("spark://xyz:7077")
                .setAppName("Sample App");
        String[] pathToJar = {"/home/user1/Desktop/Jars/TestSpark.jar"};

        // Remote master with the application jar shipped to the executors:
        //SparkConf conf = new SparkConf()
        //        .setMaster("spark://abc:7077")
        //        .setAppName("Sample App")
        //        .setJars(pathToJar);

        // Local master -- this configuration works fine:
        //SparkConf conf = new SparkConf()
        //        .setMaster("local")
        //        .setAppName("Sample App");

        JavaSparkContext jsc = new JavaSparkContext(conf);

        List<Integer> data = new ArrayList<Integer>();
        for (int i = 1; i < 500; i++) {
            data.add(i);
        }
        System.out.println("Size : " + data.size());

        // Distribute the list and sum it with a reduce on the cluster.
        JavaRDD<Integer> distData = jsc.parallelize(data);
        Integer total = distData.reduce(new Function2<Integer, Integer, Integer>() {
            @Override
            public Integer call(Integer v1, Integer v2) throws Exception {
                String s1 = "v1 : " + v1 + " v2 : " + v2 + " -->" + this;
                return v1 + v2;
            }
        });
        System.out.println("-->" + total);
    }
}
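When targeting the remote master I also tried the setJars variant shown commented out above. My understanding is that setJars ships the application jar to the executors so the anonymous Function2 class can be deserialized there; a minimal sketch of that configuration, assuming the jar path on abc is correct:

    // Ship the application jar to the executors so the anonymous
    // Function2 defined in testReduce() can be loaded remotely.
    String[] pathToJar = {"/home/user1/Desktop/Jars/TestSpark.jar"};
    SparkConf conf = new SparkConf()
            .setMaster("spark://xyz:7077")
            .setAppName("Sample App")
            .setJars(pathToJar);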
I have also tried setting spark.driver.host to xyz and spark.driver.port to 7077 while creating the Spark context, but it did not help.
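Concretely, the properties were set like this (a minimal sketch of what I tried; per the Spark configuration docs, spark.driver.host is normally the address of the driver machine itself, abc in this setup, rather than the master):

    // Values that were tried -- pointing the driver settings at the master:
    SparkConf conf = new SparkConf()
            .setMaster("spark://xyz:7077")
            .setAppName("Sample App")
            .set("spark.driver.host", "xyz")    // tried; normally the driver's own host (abc)
            .set("spark.driver.port", "7077");  // tried; normally a free port on the driver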
Please advise.