I can use nscala-time with Scala, but my issue is that I can't use it within the
spark-shell console! It gives me the error below.
Thanks
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 08:50:04 +
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; kevin...@apache.org
What is your Scala version used to build Spark?
It seems your nscala-time library's Scala version is 2.11,
and the default Spark Scala version is 2.10.
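A quick way to confirm which Scala version the shell itself was built with is to print the runtime Scala version (this is plain Scala, nothing Spark-specific; at the spark-shell prompt the single `println` line is enough, the wrapping object is just to make the sketch self-contained):

```scala
// Print the Scala version of the running shell/REPL.
// The first two components (e.g. 2.10) must match the suffix of any
// cross-built library jar you load (nscala-time_2.10 vs nscala-time_2.11).
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    println(scala.util.Properties.versionNumberString) // e.g. "2.10.4"
  }
}
```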
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI hscha...@hotmail.com wrote:
2.11. I'll share
the results here.
Regards,
--
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 01:10:09 +
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; user@spark.apache.org
What is your Scala version used to build Spark?
Thanks
--
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 08:50:04 +
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; kevin...@apache.org; user@spark.apache.org
Great, or you can just use nscala-time with Scala 2.10!
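For reference, the usual way to put a matching jar on the shell's classpath is spark-shell's `--jars` flag; a minimal invocation sketch (the local path is hypothetical, adjust it to wherever the jar was downloaded):

```shell
# Start spark-shell with the Scala-2.10 build of nscala-time on the
# classpath, matching a Spark build done against Scala 2.10.
spark-shell --jars /path/to/nscala-time_2.10-1.8.0.jar
```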
On Tue Feb 17 2015 Hammam CHAMSI hscha...@hotmail.com wrote:
My fault, I didn't notice the 2.11 suffix in the jar name. It is working now with
nscala-time_2.10-1.8.0.jar.
Thanks Kevin
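The naming convention that tripped things up here is standard for cross-built Scala artifacts: the Scala binary version follows an underscore in the file name. A small illustrative helper (not from the thread, purely for clarity) that pulls that suffix out of a jar name:

```scala
// Cross-built Scala jars carry the Scala binary version after an
// underscore: nscala-time_2.10-1.8.0.jar targets Scala 2.10.
object JarSuffix {
  // Extract the Scala binary version from an artifact file name,
  // or None if the name carries no such suffix.
  def scalaBinaryVersion(jar: String): Option[String] = {
    val Pattern = """.*_(\d+\.\d+)-.*\.jar""".r
    jar match {
      case Pattern(v) => Some(v)
      case _          => None
    }
  }

  def main(args: Array[String]): Unit = {
    println(scalaBinaryVersion("nscala-time_2.11-1.8.0.jar")) // Some(2.11)
    println(scalaBinaryVersion("nscala-time_2.10-1.8.0.jar")) // Some(2.10)
  }
}
```

Comparing that suffix against the shell's own Scala version catches this class of mismatch before any `NoClassDefFoundError` surfaces at runtime.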
From: kevin...@apache.org
Date: Tue, 17 Feb 2015 08:58:13 +
Subject: Re: Use of nscala-time within spark-shell
To: hscha...@hotmail.com; kevin...@apache.org; user@spark.apache.org
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI hscha...@hotmail.com wrote:
Hi All,
Thanks in advance for your help. I have a timestamp which I need