No, Spark isn't missing a dependency. It means a conflicting version of
Xerces is somewhere on your classpath: an old
org.apache.xerces.jaxp.DocumentBuilderFactoryImpl that predates the
setXIncludeAware method is shadowing the JDK's built-in JAXP
implementation. Maybe your app is bundling an old version?
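
One quick way to confirm that (just a sketch; WhichXerces is a throwaway
name I made up, and the class it looks up is taken straight from your
stack trace) is to print where the Xerces implementation is loaded from,
and which factory JAXP actually resolves on your classpath:

    public class WhichXerces {
        public static void main(String[] args) throws Exception {
            Class<?> impl =
                Class.forName("org.apache.xerces.jaxp.DocumentBuilderFactoryImpl");
            // null means the bootstrap classloader (the JDK itself);
            // anything else prints the offending jar on your classpath.
            System.out.println(impl.getProtectionDomain().getCodeSource());
            // The factory JAXP picks by default in this JVM:
            System.out.println(
                javax.xml.parsers.DocumentBuilderFactory.newInstance().getClass());
        }
    }

If that points at an old xercesImpl jar, your build tool can show where it
comes from transitively (e.g. mvn dependency:tree) so you can exclude or
upgrade it.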

On Wed, Apr 1, 2015 at 4:46 PM, ARose <ashley.r...@telarix.com> wrote:
> Upon executing these two lines of code:
>
>         conf = new SparkConf().setAppName(appName).setMaster(master);
>         sc = new JavaSparkContext(conf);
>
> I get the following error message:
>
> ERROR Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@32b260fa: java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
> java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
>         at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:584)
>         at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2032)
>         at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2001)
>         at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1918)
>         at org.apache.hadoop.conf.Configuration.get(Configuration.java:893)
>         at org.apache.hadoop.security.SecurityUtil.getAuthenticationMethod(SecurityUtil.java:673)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:224)
>         at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:214)
>         at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:669)
>         at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:571)
>         at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:1996)
>         at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:1996)
>         at scala.Option.getOrElse(Option.scala:120)
>         at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:1996)
>         at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:207)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:218)
>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:163)
>         at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:270)
>         at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61)
>
>
> Is Spark missing a dependency? What's going on here?
>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
