Hi

I have written a few extensions for Spark SQL (version 1.1.0), and I am 
trying to deploy my new jar files (one for catalyst and one for sql/core) on 
EC2.

My approach was to create a new spark/lib/spark-assembly-1.1.0-hadoop1.0.4.jar 
that merges the contents of the old one with the contents of my new jar files, 
and then to propagate the change to the workers.
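For concreteness, the merge I mean is roughly the following (a minimal sketch, assuming the jars are ordinary zip archives and that entries from later jars should win on name conflicts; all file names are illustrative):

```python
import zipfile

def merge_jars(base_jar, extension_jars, out_jar):
    """Merge jar (zip) archives into a single jar.

    Entries from later jars override same-named entries from earlier
    ones, which mimics putting the extension classes "on top of" the
    original assembly.
    """
    entries = {}
    for jar in [base_jar] + list(extension_jars):
        with zipfile.ZipFile(jar) as zf:
            for name in zf.namelist():
                entries[name] = zf.read(name)
    with zipfile.ZipFile(out_jar, "w", zipfile.ZIP_DEFLATED) as out:
        for name, data in entries.items():
            out.writestr(name, data)

# Illustrative usage (paths made up):
# merge_jars("spark-assembly-1.1.0-hadoop1.0.4.jar",
#            ["my-catalyst-ext.jar", "my-sql-core-ext.jar"],
#            "spark-assembly-merged.jar")
```

Note that a naive merge like this just overwrites duplicate entries; it does not, for example, concatenate service files under META-INF/services the way a real assembly build would.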

However, when I tried the code snippet below, I received the error message 
pasted at the end of this email. Do you have any suggestions on how to fix 
this?

Thanks,
Christos

The code is (run inside spark-shell, where `sc` is already defined):

import org.apache.spark.{SparkContext, SparkConf}
import org.apache.spark.sql.SQLContext
val sqlContext = new SQLContext(sc)

The error message is:

error: bad symbolic reference. A signature in package.class refers to term 
scalalogging
in package com.typesafe which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
package.class.
<console>:14: error: bad symbolic reference. A signature in package.class 
refers to term slf4j
in value com.typesafe.scalalogging which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling 
package.class.
       val sqlContext = new SQLContext(sc)
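In case it helps the diagnosis: since a jar is just a zip archive, one can scan its entry list to see whether the com.typesafe.scalalogging classes actually made it into the merged assembly (a small sketch; the jar path in the usage comment is illustrative):

```python
import zipfile

def find_entries(jar_path, needle):
    """Return entries in a jar (zip) whose path contains `needle`,
    case-insensitively -- e.g. needle="scalalogging"."""
    with zipfile.ZipFile(jar_path) as zf:
        return [n for n in zf.namelist() if needle.lower() in n.lower()]

# Illustrative usage:
# find_entries("lib/spark-assembly-1.1.0-hadoop1.0.4.jar", "scalalogging")
```

An empty result for "scalalogging" would mean the classes never made it into the merged assembly; a non-empty result would point instead at a version mismatch between what catalyst was compiled against and what is on the classpath.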
