Ted,
spark-catalyst_2.11-1.2.1.jar is present in the classpath. BTW, I am running 
the code locally in an Eclipse workspace.
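For context, the program triggering this is essentially along these lines (a 
minimal sketch only; the case class, field names, and table name below are 
placeholders for my actual types):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// Placeholder case class standing in for my real record type
case class Record(name: String, value: Int)

object SchemaRDDExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("SchemaRDDExample").setMaster("local[*]"))
    val sqlContext = new SQLContext(sc)

    val rdd = sc.parallelize(Seq(Record("a", 1), Record("b", 2)))
    // createSchemaRDD reflects on Record to derive the schema;
    // this is the call where the exception below is thrown
    val schemaRDD = sqlContext.createSchemaRDD(rdd)
    schemaRDD.registerTempTable("records")
    sqlContext.sql("SELECT name FROM records WHERE value > 1")
      .collect().foreach(println)

    sc.stop()
  }
}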

Here’s the complete exception stack trace - 

Exception in thread "main" scala.ScalaReflectionException: class 
org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial 
classloader with boot classpath 
[/Applications/eclipse/plugins/org.scala-lang.scala-library_2.11.5.v20150101-184742-3fafbc204f.jar:/Applications/eclipse/plugins/org.scala-lang.scala-reflect_2.11.5.v20150101-184742-3fafbc204f.jar:/Applications/eclipse/plugins/org.scala-lang.scala-actors_2.11.5.v20150101-184742-3fafbc204f.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/sunrsasign.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_31.jdk/Contents/Home/jre/classes]
 not found.
        at 
scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:123)
        at 
scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:22)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$$typecreator1$1.apply(ScalaReflection.scala:115)
        at 
scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
        at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
        at scala.reflect.api.TypeTags$class.typeOf(TypeTags.scala:341)
        at scala.reflect.api.Universe.typeOf(Universe.scala:61)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:115)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:100)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$class.attributesFor(ScalaReflection.scala:94)
        at 
org.apache.spark.sql.catalyst.ScalaReflection$.attributesFor(ScalaReflection.scala:33)
        at org.apache.spark.sql.SQLContext.createSchemaRDD(SQLContext.scala:111)
        ——————





> On Feb 28, 2015, at 9:31 AM, Ted Yu <yuzhih...@gmail.com> wrote:
> 
> Have you verified that the spark-catalyst_2.10 jar was in the classpath?
> 
> Cheers
> 
> On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam <ashnigamt...@gmail.com> wrote:
> Hi,
> I wrote a very simple program in Scala to convert an existing RDD to a 
> SchemaRDD, but the createSchemaRDD function is throwing an exception - 
> 
> Exception in thread "main" scala.ScalaReflectionException: class 
> org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial 
> classloader with boot classpath [.....] not found
> 
> 
> Here's more info on the versions I am using - 
> 
> <scala.binary.version>2.11</scala.binary.version>
> <spark.version>1.2.1</spark.version>
> <scala.version>2.11.5</scala.version>
> 
> Please let me know how I can resolve this problem.
> 
> Thanks
> Ashish
> 
