Re: Spark2.1 installation issue

2017-07-27 Thread Vikash Kumar
Hi,

 I have posted to the Cloudera community as well, but since this is a Spark 2
installation issue, I thought I might get some pointers here too.

Thank you

On Thu, Jul 27, 2017, 11:29 PM Marcelo Vanzin  wrote:

> Hello,
>
> This is a CDH-specific issue, please use the Cloudera forums / support
> line instead of the Apache group.
>
> On Thu, Jul 27, 2017 at 10:54 AM, Vikash Kumar
>  wrote:
> > I have installed the Spark 2 parcel through Cloudera CDH 5.12.0. I see an
> > issue there: it looks like it didn't get configured properly.
> >
> > $ spark2-shell
> > Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/apache/hadoop/fs/FSDataInputStream
> > at
> >
> org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
> > at
> >
> org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
> > at scala.Option.getOrElse(Option.scala:121)
> > at
> >
> org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:118)
> > at
> >
> org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:104)
> > at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
> > at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> > Caused by: java.lang.ClassNotFoundException:
> > org.apache.hadoop.fs.FSDataInputStream
> > at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> > at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> > at java.security.AccessController.doPrivileged(Native Method)
> > at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> > at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> >
> > My Hadoop version:
> >
> > $ hadoop version
> > Hadoop 2.6.0-cdh5.12.0
> > Subversion http://github.com/cloudera/hadoop -r
> > dba647c5a8bc5e09b572d76a8d29481c78d1a0dd
> > Compiled by jenkins on 2017-06-29T11:31Z
> > Compiled with protoc 2.5.0
> > From source with checksum 7c45ae7a4592ce5af86bc4598c5b4
> > This command was run using
> >
> /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/jars/hadoop-common-2.6.0-cdh5.12.0.jar
> >
> > Also,
> >
> > $ ls /etc/spark/conf shows:
> >
> > classpath.txt            __cloudera_generation__  __cloudera_metadata__
> > log4j.properties         navigator.lineage.client.properties
> > spark-defaults.conf      spark-env.sh             yarn-conf
> >
> >
> > while /etc/spark2/conf is empty.
> >
> >
> > How should I fix this? Do I need to do any manual configuration?
> >
> >
> >
> > Regards,
> > Vikash
>
>
>
> --
> Marcelo
>
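For anyone landing on this thread with the same symptom: a NoClassDefFoundError
for org.apache.hadoop.fs.FSDataInputStream generally means the Hadoop jars are
missing from Spark's classpath, which fits the empty /etc/spark2/conf above.
The supported fix is to deploy the Spark 2 client configuration from Cloudera
Manager; as a manual stopgap, a spark-env.sh sketch like the following may
help. Note the paths here are assumptions based on the CDH 5.12.0 output
quoted in the thread, not anything verified against this cluster:

```shell
# Sketch of /etc/spark2/conf/spark-env.sh -- a manual stopgap, not the
# Cloudera-managed fix. HADOOP_CONF_DIR is the conventional CDH location;
# adjust both paths to your installation.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# hadoop-common (which contains FSDataInputStream) is included in the
# output of `hadoop classpath`, so exporting it lets spark2-shell locate
# the Hadoop classes at startup.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)
```

The cleaner route is still Cloudera Manager: Spark2 service -> Actions ->
Deploy Client Configuration, which should populate /etc/spark2/conf.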


Re: Spark2.1 installation issue

2017-07-27 Thread Marcelo Vanzin
Hello,

This is a CDH-specific issue, please use the Cloudera forums / support
line instead of the Apache group.

On Thu, Jul 27, 2017 at 10:54 AM, Vikash Kumar
 wrote:
> I have installed the Spark 2 parcel through Cloudera CDH 5.12.0. I see an
> issue there: it looks like it didn't get configured properly.
>
> $ spark2-shell
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/fs/FSDataInputStream
> at
> org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
> at
> org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:118)
> at scala.Option.getOrElse(Option.scala:121)
> at
> org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:118)
> at
> org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:104)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.fs.FSDataInputStream
> at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>
> My Hadoop version:
>
> $ hadoop version
> Hadoop 2.6.0-cdh5.12.0
> Subversion http://github.com/cloudera/hadoop -r
> dba647c5a8bc5e09b572d76a8d29481c78d1a0dd
> Compiled by jenkins on 2017-06-29T11:31Z
> Compiled with protoc 2.5.0
> From source with checksum 7c45ae7a4592ce5af86bc4598c5b4
> This command was run using
> /opt/cloudera/parcels/CDH-5.12.0-1.cdh5.12.0.p0.29/jars/hadoop-common-2.6.0-cdh5.12.0.jar
>
> Also,
>
> $ ls /etc/spark/conf shows:
>
> classpath.txt            __cloudera_generation__  __cloudera_metadata__
> log4j.properties         navigator.lineage.client.properties
> spark-defaults.conf      spark-env.sh             yarn-conf
>
>
> while /etc/spark2/conf is empty.
>
>
> How should I fix this? Do I need to do any manual configuration?
>
>
>
> Regards,
> Vikash



-- 
Marcelo

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org