Have you upgraded the cluster where you are running this to 1.0.1 as
well? A NoSuchMethodError
almost always means that the class files available at runtime differ
from those that were present when you compiled your program.
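One way to guard against this kind of compile-time/runtime mismatch in sbt is to pin every Spark module to the same version and mark them "provided", so the cluster's own jars are the ones used at runtime. A minimal sketch of a build.sbt, assuming an sbt build (the coordinates are the standard Spark artifact names):

```scala
// build.sbt — pin all Spark modules to a single version and let the
// cluster supply them at runtime ("provided" keeps them out of the
// application jar, so only one copy of Spark is ever on the classpath).
val sparkVersion = "1.0.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)
```

If the cluster is on a different release, changing sparkVersion to match it (rather than upgrading the cluster) is the other way to resolve the mismatch.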
On Mon, Jul 14, 2014 at 7:06 PM, SK skrishna...@gmail.com wrote:
Hi,
I am using Spark 1.0.1. I am using the following piece of code to parse a
json file. It is based on the code snippet in the SparkSQL programming
guide. However, at runtime I get the following error:
java.lang.NoSuchMethodError:
org.apache.spark.sql.SQLContext.jsonRDD(Lorg/apache/spark/rdd/RDD;)Lorg/apache/spark/sql/SchemaRDD;
I get a similar error for jsonFile() as well. I have included the spark-sql
1.0.1 jar when building my program using sbt. What is the right library to
import for jsonRDD and jsonFile?
thanks
import org.apache.spark._
import org.apache.spark.sql._

object SQLExample {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("JsonExample")
    val sc = new SparkContext(sparkConf)
    val sqlc = new org.apache.spark.sql.SQLContext(sc)
    // Drop blank lines so every remaining record is a JSON document
    val jrdd = sc.textFile(args(0)).filter(r => r.trim != "")
    // jsonRDD is a method on SQLContext itself; beyond spark-sql on the
    // classpath, no additional import is needed
    val data = sqlc.jsonRDD(jrdd)
    data.printSchema()
  }
}
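A quick way to see which Spark the program is actually running against is to print the runtime version and the jar that SQLContext was loaded from. A diagnostic sketch, meant to be run inside the driver program above (it assumes `sc` is the SparkContext already created there):

```scala
// Confirm which Spark build is on the runtime classpath.
println(s"Runtime Spark version: ${sc.version}")

// Locate the jar that actually supplied SQLContext at runtime.
val src = classOf[org.apache.spark.sql.SQLContext]
  .getProtectionDomain.getCodeSource
println(s"SQLContext loaded from: " +
  (if (src != null) src.getLocation else "bootstrap classpath"))
```

If the printed version or jar path differs from the 1.0.1 spark-sql jar used at build time, that mismatch is the source of the NoSuchMethodError.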
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/jsonRDD-NoSuchMethodError-tp9688.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.