You need to add the spark-sql dependency
<http://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10/1.3.1>
to your project's build file. A NoSuchMethodError like this usually means the
Spark version on the classpath at runtime does not match the one you compiled
against, so keep spark-sql at the same version as spark-core.
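
Since you mention Maven below, the pom.xml entry would look roughly like this
(a minimal sketch; the _2.10 Scala suffix and the 1.3.1 version are taken from
the link above and should match your spark-core dependency):

    <!-- spark-sql must use the same Scala suffix and version as spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.10</artifactId>
        <version>1.3.1</version>
    </dependency>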

Thanks
Best Regards

On Wed, Sep 2, 2015 at 6:41 PM, rakesh sharma <rakeshsharm...@hotmail.com>
wrote:

> Error: application failed with exception
> java.lang.NoSuchMethodError:
> org.apache.spark.sql.SQLContext.<init>(Lorg/apache/spark/api/java/JavaSparkContext;)V
>         at
> examples.PersonRecordReader.getPersonRecords(PersonRecordReader.java:35)
>         at examples.PersonRecordReader.main(PersonRecordReader.java:17)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at
> org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:367)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:77)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
>
>
> Hi All
>
> I am getting the above exception when using SQLContext in Spark jobs.
> The error occurs only when these statements are added; the RDD itself is
> fine and prints everything correctly. The failure happens when creating
> DataFrames. I am using Maven dependencies at version 1.3.1.
>
> public static void getPersonRecords(String... args) {
>     SparkConf sparkConf = new SparkConf().setAppName("SQLContext");
>     JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
>     JavaRDD<String> lines = javaSparkContext.textFile(args[0], 1);
>     JavaRDD<Person> personRecords = lines.map(new Function<String, Person>() {
>         public Person call(String line) throws Exception {
>             System.out.println(line);
>             String[] rec = line.split(",");
>             return new Person(Integer.parseInt(rec[1].trim()), rec[0]);
>         }
>     });
>     for (Person p : personRecords.collect()) {
>         System.out.println(p.getName());
>     }
>     SQLContext sqlContext = new SQLContext(javaSparkContext);
>     DataFrame dataFrame = sqlContext.createDataFrame(personRecords, Person.class);
> }
>
>
> Please help; I have been stuck on this since morning.
>
> thanks
> rakesh
>
