We do this in SparkILoop (
https://github.com/apache/spark/blob/master/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L1023-L1037).
What is the version of Spark you are using? How did you add the spark-csv
jar?
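The SparkILoop code referenced above is about making user-added jars visible to the interpreter's class loader. As a general illustration of that technique only (not Spark's actual implementation; all names below are made up), exposing extra jars through a child URLClassLoader looks roughly like this:

```scala
import java.net.{URL, URLClassLoader}

// Illustrative sketch: expose extra jars to a child class loader, the
// general technique behind adding jars to a REPL session. The names
// here are hypothetical, not Spark's real code.
object JarLoaderSketch {
  // Build a loader that searches `extraJars` first, then delegates to
  // the parent loader for everything else.
  def loaderFor(extraJars: Seq[URL]): ClassLoader =
    new URLClassLoader(extraJars.toArray, getClass.getClassLoader)

  def main(args: Array[String]): Unit = {
    // With no extra jars the child simply delegates to its parent, so
    // ordinary JDK classes still resolve through it.
    val cl = loaderFor(Seq.empty)
    println(cl.loadClass("java.lang.String").getName) // prints java.lang.String
  }
}
```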
On Thu, Jul 16, 2015 at 1:21 PM, Koert Kuipers ko...@tresata.com wrote:
i am using scala 2.11
spark jars are not in my assembly jar (they are provided), since i launch
with spark-submit
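For context on the provided-scope setup described above, a typical sbt configuration looks roughly like the following sketch; the exact artifact names and version strings are assumptions for illustration:

```scala
// build.sbt (sketch): Spark itself is "provided", so the assembly
// plugin leaves it out of the fat jar and spark-submit supplies it at
// runtime. spark-csv is a normal dependency, so it lands in the
// assembly jar. Versions here are illustrative.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.4.0" % "provided",
  "com.databricks"   %% "spark-csv"  % "1.0.3"
)
```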
On Thu, Jul 16, 2015 at 4:34 PM, Koert Kuipers ko...@tresata.com wrote:
spark 1.4.0
spark-csv is a normal dependency of my project and in the assembly jar that
i use
but i also tried adding spark-csv with --packages for spark-submit, and got
the same error
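For reference, the --packages route mentioned here looks roughly like the following; the Maven coordinate, version, main class, and jar name are illustrative assumptions:

```shell
# Sketch: pull spark-csv from Maven Central at submit time instead of
# bundling it in the assembly. Coordinate and version are illustrative.
spark-submit \
  --packages com.databricks:spark-csv_2.11:1.0.3 \
  --class com.example.Main \
  my-assembly.jar
```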
On Thu, Jul 16, 2015 at 4:31 PM, Yin Huai yh...@databricks.com wrote:
We do this in SparkILoop (…). What is the version of Spark you are using? How did you add the spark-csv jar?
that solved it, thanks!
On Thu, Jul 16, 2015 at 6:22 PM, Koert Kuipers ko...@tresata.com wrote:
thanks i will try 1.4.1
On Thu, Jul 16, 2015 at 5:24 PM, Yin Huai yh...@databricks.com wrote:
Hi Koert,
For the classloader issue, you probably hit
https://issues.apache.org/jira/browse/SPARK-8365, which has been fixed in
Spark 1.4.1. Can you try 1.4.1 and see if the exception disappears?
No problem :) Glad to hear that!
On Thu, Jul 16, 2015 at 8:22 PM, Koert Kuipers ko...@tresata.com wrote:
On Thu, Jul 16, 2015 at 5:24 PM, Yin Huai yh...@databricks.com wrote:
Hi Koert,
For the classloader issue, you probably hit
https://issues.apache.org/jira/browse/SPARK-8365, which has been fixed in
Spark 1.4.1. Can you try 1.4.1 and see if the exception disappears?
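Since the thread attributes the error to a class loader issue, a quick way to check from the driver whether a given class resolves through the current thread's context class loader is sketched below; the class names used are just examples:

```scala
// Diagnostic sketch: report whether `className` is visible to the
// current thread's context class loader, falling back to this class's
// own loader if no context loader is set.
object ClassLoaderCheck {
  def isVisible(className: String): Boolean = {
    val loader = Option(Thread.currentThread().getContextClassLoader)
      .getOrElse(getClass.getClassLoader)
    try {
      // `initialize = false` avoids running static initializers.
      Class.forName(className, false, loader)
      true
    } catch {
      case _: ClassNotFoundException => false
    }
  }

  def main(args: Array[String]): Unit = {
    println(isVisible("java.util.ArrayList"))     // prints true
    println(isVisible("com.example.NoSuchClass")) // prints false
  }
}
```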