Hm, no I don't have that in my path.
However, someone on the spark-csv project suggested that, since I could not
get another package/example to work either, this might be a Spark / YARN
issue: https://github.com/databricks/spark-csv/issues/54
Thoughts? I'll open a ticket later this afternoon if the
Do you have commons-csv-1.1-bin.jar in your path somewhere? I had to
download and add it.
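For what it's worth, here is a sketch of how I pulled the jar in; the Maven Central URL and local path are from my setup and may differ on yours:

```shell
# Assumption: commons-csv 1.1 as published on Maven Central.
# spark-csv uses commons-csv at runtime, so passing the jar explicitly
# to spark-shell sidesteps any --packages resolution problems.
wget https://repo1.maven.org/maven2/org/apache/commons/commons-csv/1.1/commons-csv-1.1.jar
spark-shell --jars commons-csv-1.1.jar
```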
Cheers
k/
On Wed, Apr 22, 2015 at 11:01 AM, Mohammed Omer beancinemat...@gmail.com
wrote:
Afternoon all,
I'm working with Scala 2.11.6, and Spark 1.3.1 built from source via:
`mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package`
The error is encountered when running spark shell via:
`spark-shell --packages com.databricks:spark-csv_2.11:1.0.3`
The full trace of the
You can try pulling the jar with wget and passing it to spark-shell with --jars.
I used 1.0.3 with Spark 1.3.0, but with a different version of Scala. From the
stack trace it looks like spark-shell is just not seeing the csv jar...
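Concretely, something like the following, using the same coordinates as your --packages flag (com.databricks:spark-csv_2.11:1.0.3); the exact Maven Central path is my assumption and worth double-checking:

```shell
# Assumption: spark-csv_2.11 1.0.3 as published on Maven Central.
# Downloading the jar directly and passing it with --jars avoids the
# dependency-resolution step that --packages performs at startup.
wget https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.0.3/spark-csv_2.11-1.0.3.jar
spark-shell --jars spark-csv_2.11-1.0.3.jar
```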