Also, make sure the Scala version for your build is 2.11.
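The Scala suffix of your compile-time artifacts and the Spark runtime that launches the job must match; note that the prebuilt Spark 1.5.1 downloads are compiled against Scala 2.10 by default, so mixing _2.11 dependencies with a 2.10 runtime produces exactly this kind of NoSuchMethodError. A minimal sketch of a consistent pom.xml fragment (the property names are just illustrative, and the `provided` scope assumes you submit with spark-submit rather than bundling Spark in your jar):

```xml
<properties>
  <spark.version>1.5.1</spark.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>

<dependencies>
  <!-- All Spark modules share the same version and Scala suffix -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
```

If your cluster runs the stock 2.10 build of Spark 1.5.1, either switch the suffix to _2.10 here or rebuild Spark for 2.11; the key point is that both sides agree.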


> On Nov 16, 2015, at 3:43 PM, Fengdong Yu <fengdo...@everstring.com> wrote:
> 
> Ignore my earlier reply; I see HiveSpark.java is where your main method is located.
> 
> can you paste the whole pom.xml and your code?
> 
> 
> 
> 
>> On Nov 16, 2015, at 3:39 PM, Fengdong Yu <fengdo...@everstring.com> wrote:
>> 
>> The code looks good. Can you check the ‘import’ statements in your code, 
>> since the stack trace references ‘com.honeywell.test’?
>> 
>> 
>> 
>> 
>> 
>>> On Nov 16, 2015, at 3:02 PM, Yogesh Vyas <informy...@gmail.com> wrote:
>>> 
>>> Hi,
>>> 
>>> While I am trying to read a JSON file using SQLContext, I get the
>>> following error:
>>> 
>>> Exception in thread "main" java.lang.NoSuchMethodError:
>>> org.apache.spark.sql.SQLContext.<init>(Lorg/apache/spark/api/java/JavaSparkContext;)V
>>>      at com.honeywell.test.testhive.HiveSpark.main(HiveSpark.java:15)
>>>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>      at 
>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>      at 
>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>      at java.lang.reflect.Method.invoke(Method.java:597)
>>>      at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
>>>      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
>>>      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> 
>>> 
>>> My pom.xml uses the following dependencies and versions:
>>> spark-core_2.11 with version 1.5.1
>>> spark-streaming_2.11 with version 1.5.1
>>> spark-sql_2.11 with version 1.5.1
>>> 
>>> Can anyone please help me resolve this?
>>> 
>>> Regards,
>>> Yogesh
>>> 
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: user-h...@spark.apache.org
>>> 
>> 
> 

