To: user@spark.apache.org
Great, or you can just use nscala-time with Scala 2.10!
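For instance, in an sbt build the `%%` operator appends the Scala binary version to the artifact name for you, which avoids exactly this kind of mismatch. A minimal sketch (the Scala patch version is illustrative):

```scala
// build.sbt -- minimal sketch: pin Scala 2.10 to match a pre-built Spark
scalaVersion := "2.10.4"

// %% resolves this to nscala-time_2.10, not nscala-time_2.11
libraryDependencies += "com.github.nscala-time" %% "nscala-time" % "1.8.0"
```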
On Tue Feb 17 2015 at 5:41:53 PM Hammam CHAMSI hscha...@hotmail.com wrote:
Thanks Kevin for your reply,
I downloaded the pre-built version and, as you said, the default Spark Scala
version is 2.10. I'm now building Spark myself so that I can make use
of nscala-time within spark-shell.
To: hscha...@hotmail.com; user@spark.apache.org
Which Scala version did you use to build Spark?
It seems your nscala-time library's Scala version is 2.11,
while the default Spark Scala version is 2.10.
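You can check which Scala version your Spark build uses from inside spark-shell itself, via `scala.util.Properties` (the output shown is illustrative):

```
scala> util.Properties.versionString
res0: String = version 2.10.4
```

If this prints a 2.10.x version, any library jar on the classpath must be a `_2.10` artifact.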
On Tue Feb 17 2015 at 1:51:47 AM Hammam CHAMSI hscha...@hotmail.com wrote:
Then why don't you use nscala-time_2.10-1.8.0.jar instead of
nscala-time_2.11-1.8.0.jar?
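One way to do that is to download the Scala 2.10 artifact and pass it to spark-shell with `--jars`. The URL and paths below are a sketch; note that nscala-time also depends on joda-time and joda-convert, which may need to be added the same way if Spark does not already bundle them:

```shell
# Grab the Scala 2.10 build of nscala-time 1.8.0 from Maven Central
wget https://repo1.maven.org/maven2/com/github/nscala-time/nscala-time_2.10/1.8.0/nscala-time_2.10-1.8.0.jar

# Put the jar on the spark-shell classpath
./bin/spark-shell --jars nscala-time_2.10-1.8.0.jar
```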
On Tue Feb 17 2015 at 5:55:50 PM Hammam CHAMSI hscha...@hotmail.com wrote:
I can use nscala-time with Scala, but my issue is that I can't use it within
the spark-shell console! It gives me the error below:
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Your help is very appreciated,
Regards,
Hammam
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Use-of-nscala-time-within-spark-shell-tp21624.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.