RE: Use of nscala-time within spark-shell

2015-02-17 Thread Hammam CHAMSI
Great, or you can just use nscala-time built for Scala 2.10! On Tue, Feb 17, 2015 at 5:41:53 PM Hammam CHAMSI hscha...@hotmail.com wrote: Thanks Kevin for your reply. I downloaded the pre-built version and, as you said, the default Spark Scala version is 2.10. I'm now building ...
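A minimal sbt sketch of how the Scala 2.10 artifact gets picked up (sbt itself and the exact scalaVersion are assumptions, not from the thread; the %% operator appends the Scala binary suffix, so this resolves to nscala-time_2.10-1.8.0.jar, matching the pre-built Spark):

  // build.sbt (hypothetical): %% adds the Scala binary suffix to the artifact name,
  // so with a 2.10.x scalaVersion this pulls nscala-time_2.10-1.8.0.jar
  scalaVersion := "2.10.4"
  libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"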

RE: Use of nscala-time within spark-shell

2015-02-17 Thread Hammam CHAMSI
What Scala version was used to build your Spark? It seems your nscala-time library is built for Scala 2.11, while the default Spark Scala version is 2.10. On Tue, Feb 17, 2015 at 1:51:47 AM Hammam CHAMSI hscha...@hotmail.com wrote ...
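A quick check from inside spark-shell confirms which Scala version Spark was built with (a sketch; the printed version number is illustrative):

  // run inside spark-shell: if this reports 2.10.x, a _2.11 jar cannot be loaded
  scala> scala.util.Properties.versionString
  res0: String = version 2.10.4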

RE: Use of nscala-time within spark-shell

2015-02-17 Thread Hammam CHAMSI
Then why don't you use nscala-time_2.10-1.8.0.jar instead of nscala-time_2.11-1.8.0.jar? On Tue, Feb 17, 2015 at 5:55:50 PM Hammam CHAMSI hscha...@hotmail.com wrote: I can use nscala-time with plain Scala, but my issue is that I can't use it within the spark-shell console! It gives me the error below ...
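A minimal sketch of launching spark-shell with the matching Scala 2.10 jars (the assumption that the jars sit in the current directory is mine, not from the thread):

  bin/spark-shell --jars joda-time-2.4.jar,joda-convert-1.5.jar,nscala-time_2.10-1.8.0.jar

  // inside the shell the import should then resolve
  scala> import com.github.nscala_time.time.Imports._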

Use of nscala-time within spark-shell

2015-02-16 Thread Hammam CHAMSI
Hi All, Thanks in advance for your help. I have timestamps which I need to convert to datetime using Scala. A folder contains the three needed jar files: joda-convert-1.5.jar, joda-time-2.4.jar and nscala-time_2.11-1.8.0.jar. Using the Scala REPL and adding the jars with scala -classpath *.jar, I can use ...
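A sketch of the conversion itself, assuming the timestamps are epoch milliseconds (the original message is truncated here, so the input format and variable names are assumptions):

  import com.github.nscala_time.time.Imports._   // nscala-time wraps joda-time

  val ts = 1424131200000L            // hypothetical epoch-millis timestamp
  val dt = new DateTime(ts)          // joda-time DateTime built from millis
  println(dt.toString("yyyy-MM-dd HH:mm:ss"))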