[ https://issues.apache.org/jira/browse/SPARK-1875?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14001336#comment-14001336 ]

Patrick Wendell commented on SPARK-1875:
----------------------------------------

The issue was caused by this patch. I need to look further to figure out what was going on.
https://github.com/apache/spark/pull/754

> NoClassDefFoundError: StringUtils when building against Hadoop 1
> ----------------------------------------------------------------
>
>                 Key: SPARK-1875
>                 URL: https://issues.apache.org/jira/browse/SPARK-1875
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Matei Zaharia
>            Priority: Blocker
>             Fix For: 1.0.0
>
>
> Maybe I missed something, but after building an assembly with Hadoop 1.2.1
> and Hive enabled, if I go into it and run spark-shell, I get this:
> {code}
> java.lang.NoClassDefFoundError: org/apache/commons/lang/StringUtils
> 	at org.apache.hadoop.metrics2.lib.MetricMutableStat.<init>(MetricMutableStat.java:59)
> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:75)
> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:120)
> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
> 	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
> 	at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
> 	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:216)
> 	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:184)
> 	at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:236)
> 	at org.apache.hadoop.security.KerberosName.<clinit>(KerberosName.java:79)
> 	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:209)
> 	at org.apache.hadoop.security.UserGroupInformation.setConfiguration(UserGroupInformation.java:226)
> 	at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:36)
> 	at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:109)
> 	at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:228)
> {code}

--
This message was sent by Atlassian JIRA
(v6.2#6252)
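A NoClassDefFoundError like the one above typically means the class that Hadoop's metrics code links against (org.apache.commons.lang.StringUtils) is not on the runtime classpath, e.g. because the assembly was built without bundling commons-lang. A minimal diagnostic sketch, assuming only a standard JDK (the class name comes from the stack trace above; CheckCommonsLang is a hypothetical helper, not part of Spark):

```java
// Probe the current classpath for the class named in the stack trace.
// Class.forName throws ClassNotFoundException when the class is absent,
// which is the same missing-dependency condition that surfaces at link
// time as NoClassDefFoundError in the Hadoop metrics code.
public class CheckCommonsLang {
    static boolean isPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = "org.apache.commons.lang.StringUtils";
        System.out.println(cls + (isPresent(cls) ? " is on the classpath" : " is MISSING"));
    }
}
```

Running this inside the same environment as spark-shell (or simply listing the assembly jar's contents and grepping for org/apache/commons/lang/StringUtils.class) would confirm whether the dependency was dropped from the Hadoop 1 build.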