I have installed the Spark2 parcel on Cloudera CDH 5.12.0, and I am seeing an
issue there. It looks like it did not get configured properly.

$ spark2-shell
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:118)
    at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:104)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
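
If I understand the error correctly, spark2-shell is not finding any Hadoop
classes on its classpath. On CDH these are normally pulled in via
SPARK_DIST_CLASSPATH, which the Spark 2 client configuration's spark-env.sh is
supposed to set, so a minimal check (assuming that layout) would be:

$ source /etc/spark2/conf/spark-env.sh 2>/dev/null  # should define SPARK_DIST_CLASSPATH
$ echo "$SPARK_DIST_CLASSPATH"

but as noted below, /etc/spark2/conf is empty on this host, so there is
nothing to source.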

My Hadoop version is:

$ hadoop version
Hadoop 2.6.0-cdh5.12.0
Subversion http://github.com/cloudera/hadoop -r dba647c5a8bc5e09b572d76a8d29481c78d1a0dd
Compiled by jenkins on 2017-06-29T11:31Z
Compiled with protoc 2.5.0
From source with checksum 7c45ae7a4592ce5af86bc4598c5b4
This command was run using /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/jars/hadoop-common-2.6.0-cdh5.12.0.jar
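
So Hadoop itself resolves its jars from the parcel without trouble; I expect
something like

$ hadoop classpath | tr ':' '\n' | grep hadoop-common

would confirm that hadoop-common (which contains
org.apache.hadoop.fs.FSDataInputStream) is on Hadoop's own classpath, just not
visible to spark2-shell.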

Also, ls /etc/spark/conf shows:

classpath.txt
__cloudera_generation__
__cloudera_metadata__
log4j.properties
navigator.lineage.client.properties
spark-defaults.conf
spark-env.sh
yarn-conf


while /etc/spark2/conf is empty.
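
My understanding (which may be wrong) is that /etc/spark2/conf is managed
through the alternatives system and should point at a client configuration
deployed from Cloudera Manager, so I was planning to inspect it with:

$ ls -ld /etc/spark2/conf             # should be a symlink into /etc/alternatives
$ alternatives --display spark2-conf  # assuming the alternative is named spark2-conf,
                                      # mirroring the spark-conf alternative for Spark 1

(on Debian-based systems the command would be update-alternatives instead).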


How should I fix this? Do I need to do any manual configuration?



Regards,
Vikash
