Sean,

Thanks for your response. My MapReduce and Spark 1.0 (prepackaged in CDH5) jobs are running fine. It's only Spark 1.2 jobs that I'm unable to run.
NR

On Dec 19, 2014 5:03 AM, "Sean Owen" <so...@cloudera.com> wrote:
> You've got Kerberos enabled, and it's complaining that YARN doesn't
> like the Kerberos config. Have you verified this should be otherwise
> working, sans Spark?
>
> On Fri, Dec 19, 2014 at 3:50 AM, maven <niranja...@gmail.com> wrote:
> > All,
> >
> > I just built Spark 1.2 on my enterprise server (which has Hadoop 2.3 with
> > YARN). Here are the steps I followed for the build:
> >
> > $ mvn -Pyarn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package
> > $ export SPARK_HOME=/path/to/spark/folder
> > $ export HADOOP_CONF_DIR=/etc/hadoop/conf
> >
> > However, when I try to work with this installation either locally or on
> > YARN, I get the following error:
> >
> > Exception in thread "main" java.lang.ExceptionInInitializerError
> >     at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
> >     at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
> >     at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
> >     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
> >     at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
> >     at org.apache.spark.SparkContext.<init>(SparkContext.scala:232)
> >     at water.MyDriver$.main(MyDriver.scala:19)
> >     at water.MyDriver.main(MyDriver.scala)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >     at java.lang.reflect.Method.invoke(Method.java:606)
> >     at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:360)
> >     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
> >     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> > Caused by: org.apache.spark.SparkException: Unable to load YARN support
> >     at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:199)
> >     at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:194)
> >     at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
> >     ... 15 more
> > Caused by: java.lang.IllegalArgumentException: Invalid rule: L
> > RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/L
> > DEFAULT
> >     at org.apache.hadoop.security.authentication.util.KerberosName.parseRules(KerberosName.java:321)
> >     at org.apache.hadoop.security.authentication.util.KerberosName.setRules(KerberosName.java:386)
> >     at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:75)
> >     at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:247)
> >     at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:283)
> >     at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:43)
> >     at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil.<init>(YarnSparkHadoopUtil.scala:45)
> >     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> >     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> >     at java.lang.Class.newInstance(Class.java:374)
> >     at org.apache.spark.deploy.SparkHadoopUtil$.liftedTree1$1(SparkHadoopUtil.scala:196)
> >     ... 17 more
> >
> > I noticed that when I unset HADOOP_CONF_DIR, I'm able to work in the local
> > mode without any errors. I'm able to work with pre-installed Spark 1.0,
> > locally and on YARN, without any issues. It looks like I may be missing a
> > configuration step somewhere. Any thoughts on what may be causing this?
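For comparison, the rules string from the trace, minus the stray "L" tokens that the "Invalid rule: L" message points at, would look something like this in core-site.xml. This is only a sketch: hadoop.security.auth_to_local is the standard Hadoop property that ends up in KerberosName.setRules, and XXXCOMPANY.COM is the placeholder realm already shown in the trace, not a real value.

```xml
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
    RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/
    DEFAULT
  </value>
</property>
```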
> >
> > NR
> >
> > --
> > View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-ExceptionInInitializerError-Unable-to-load-YARN-support-tp20775.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> > For additional commands, e-mail: user-h...@spark.apache.org
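The "Invalid rule: L" failure in the trace can be illustrated outside of Hadoop. The sketch below is a rough Python approximation of the rule grammar that KerberosName accepts (DEFAULT, or RULE:[n:format] with an optional filter and sed-style substitution); it is not Hadoop's actual parser, but it shows that the stray "L" tokens in the configured rules string are what fail to parse.

```python
import re

# Rough approximation of the auth_to_local rule grammar accepted by
# Hadoop's KerberosName around the 2.x era. This is a sanity-check
# sketch, NOT the real parser.
RULE_RE = re.compile(
    r"^(DEFAULT|"
    r"RULE:\[\d+:[^\]]*\]"   # RULE:[n:format]
    r"(\([^)]*\))?"          # optional (regex) principal filter
    r"(s/[^/]*/[^/]*/g?)?"   # optional sed-style substitution
    r")$"
)

def check_auth_to_local(rules: str) -> list:
    """Return the whitespace-separated rule tokens that don't match the grammar."""
    return [tok for tok in rules.split() if not RULE_RE.match(tok)]

# The value as reconstructed from the stack trace, including the stray
# "L" characters around the RULE line.
broken = """L
RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/L
DEFAULT"""

# The same rules with the stray "L" tokens removed.
clean = """RULE:[2:$1@$0](.*@XXXCOMPANY.COM)s/(.*)@XXXCOMPANY.COM/$1/
DEFAULT"""

print(check_auth_to_local(broken))  # the bare "L" and the "/L"-suffixed rule fail
print(check_auth_to_local(clean))   # empty list: all tokens parse
```

Under this approximation, the bare "L" token and the rule ending in "/L" are rejected while the cleaned-up string parses, which matches the rule text the exception prints.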