Spark version 1.6.2
Scala version 2.10.5
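
In case it helps, a quick way to double-check both from a running spark-shell (standard calls; the values in the comments are just what I expect given the versions above):

scala> sc.version                             // should print 1.6.2
scala> scala.util.Properties.versionString    // should print something like "version 2.10.5"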

> On 06-Apr-2017, at 8:05 PM, Jörn Franke <jornfra...@gmail.com> wrote:
> 
> And which version does your Spark cluster use?
> 
> On 6. Apr 2017, at 16:11, nayan sharma <nayansharm...@gmail.com> wrote:
> 
>> scalaVersion := "2.10.5"
>> 
>> 
>> 
>> 
>>> On 06-Apr-2017, at 7:35 PM, Jörn Franke <jornfra...@gmail.com> wrote:
>>> 
>>> Maybe your Spark is built against Scala 2.11 but you compile for 2.10, or the other way around?
>>> 
>>> On 6. Apr 2017, at 15:54, nayan sharma <nayansharm...@gmail.com> wrote:
>>> 
>>>> In addition, I am using Spark version 1.6.2.
>>>> Is there any chance the error is due to a Scala version mismatch or mismatched dependencies? I am just guessing.
>>>> 
>>>> Thanks,
>>>> Nayan
>>>> 
>>>>  
>>>>> On 06-Apr-2017, at 7:16 PM, nayan sharma <nayansharm...@gmail.com> wrote:
>>>>> 
>>>>> Hi Jörn,
>>>>> Thanks for replying.
>>>>> 
>>>>> jar -tf catalyst-data-prepration-assembly-1.0.jar | grep csv
>>>>> 
>>>>> After doing this, I found a lot of classes under com/databricks/spark/csv/
>>>>> 
>>>>> Do I need to check for any specific class?
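>>>>> 
>>>>> Since the NoSuchMethodError points at org.apache.commons.csv.CSVFormat, I guess the more telling check is whether (and which) commons-csv classes ended up in the assembly, e.g. (just my assumption about what to grep for):
>>>>> 
>>>>> jar -tf catalyst-data-prepration-assembly-1.0.jar | grep org/apache/commons/csv/CSVFormat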
>>>>> 
>>>>> Regards,
>>>>> Nayan
>>>>>> On 06-Apr-2017, at 6:42 PM, Jörn Franke <jornfra...@gmail.com> wrote:
>>>>>> 
>>>>>> Is the library in your assembly jar?
>>>>>> 
>>>>>> On 6. Apr 2017, at 15:06, nayan sharma <nayansharm...@gmail.com> wrote:
>>>>>> 
>>>>>>> Hi All,
>>>>>>> I am getting an error while loading a CSV file.
>>>>>>> 
>>>>>>> val datacsv = sqlContext.read.format("com.databricks.spark.csv").option("header", "true").load("timeline.csv")
>>>>>>> java.lang.NoSuchMethodError: 
>>>>>>> org.apache.commons.csv.CSVFormat.withQuote(Ljava/lang/Character;)Lorg/apache/commons/csv/CSVFormat;
>>>>>>> 
>>>>>>> 
>>>>>>> I have added the dependency in the sbt file:
>>>>>>> // Spark Additional Library - CSV Read as DF
>>>>>>> libraryDependencies += "com.databricks" %% "spark-csv" % "1.5.0"
>>>>>>> and I am starting spark-shell with this command:
>>>>>>> 
>>>>>>> spark-shell --master yarn-client --jars /opt/packages/xxxx-data-prepration/target/scala-2.10/xxxx-data-prepration-assembly-1.0.jar --name nayan
>>>>>>> 
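>>>>>>> One thought (unverified): spark-csv pulls in Apache commons-csv, and if an older commons-csv on the cluster classpath wins, CSVFormat.withQuote might simply not exist there. A sketch of what I may add to build.sbt to pin it explicitly -- the 1.1 version is my guess, not something I have confirmed against spark-csv 1.5.0:
>>>>>>> 
>>>>>>> // pin commons-csv so the assembly ships a CSVFormat that has withQuote
>>>>>>> libraryDependencies += "org.apache.commons" % "commons-csv" % "1.1"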
>>>>>>> 
>>>>>>> 
>>>>>>> Thanks for any help!!
>>>>>>> 
>>>>>>> 
>>>>>>> Thanks,
>>>>>>> Nayan
>>>>> 
>>>> 
>> 
