The issue is that you don't have Hadoop classes in your compiler
classpath. In the first example, you are getting Hadoop classes from
the Spark assembly, which packages everything together.
In the second example, you are referencing the Spark jars as deployed in
a Hadoop cluster. They no longer bundle the Hadoop classes, so those
have to be on your compile classpath separately.
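A minimal sketch of what that compile classpath fix can look like in an sbt build. The version numbers and the `provided` scope here are assumptions; match them to whatever your cluster actually runs:

```scala
// build.sbt — hypothetical versions; align spark-core and hadoop-client
// with the versions deployed on your cluster.
libraryDependencies ++= Seq(
  // "provided": available at compile time, supplied by the cluster at runtime
  "org.apache.spark"  %% "spark-core"    % "1.2.0" % "provided",
  "org.apache.hadoop" %  "hadoop-client" % "2.4.0" % "provided"
)
```

With `hadoop-client` on the compile classpath, the compiler can resolve the `org.apache.hadoop.io` types that `SparkContext.class` refers to.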
Hi Sean,

Thanks for the quick reply. I moved to an sbt-based build and was able
to build the project successfully.
In my /apps/sameert/software/approxstrmatch I see the following:
jar -tf
I was able to resolve this. In my spark-shell command I had forgotten a
comma between two of the jar files.
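For anyone hitting the same thing: `--jars` takes one comma-separated list with no spaces; a missing comma makes spark-shell treat the second path as a separate argument. A sketch with hypothetical jar paths:

```shell
# Hypothetical paths — --jars wants a single comma-separated list, no spaces
spark-shell --jars /apps/sameert/software/approxstrmatch/approxstrmatch.jar,/apps/sameert/software/secondstring/secondstring.jar
```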
From: ssti...@live.com
To: user@spark.apache.org
Subject: RE: error: bad symbolic reference. A signature in SparkContext.class
refers to term io in package org.apache.hadoop which is not