I have code that works under 1.2.1, but after upgrading to 1.3.0 it fails to
find the S3 Hadoop file system.

When I try to save a parquet file, I get:

java.lang.IllegalArgumentException: Wrong FS: s3://[path to my file], expected: file:///

This worked in 1.2.1.

Has anyone else seen this?

I'm running Spark with master "local[8]", so everything is in-process. These
are actually unit tests in our app that are failing now.
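
For reference, here is a minimal sketch of what the failing path looks like.
The object name, the toy schema, and the s3://my-bucket/some/path target are
placeholders, not our actual test code:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object S3WriteRepro {
  def main(args: Array[String]): Unit = {
    // Same local-mode setup the unit tests use.
    val sc = new SparkContext(
      new SparkConf().setMaster("local[8]").setAppName("s3-parquet-repro"))
    val sqlContext = new SQLContext(sc)

    // Tiny DataFrame just to have something to write.
    val df = sqlContext.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")

    // Fine on 1.2.1; on 1.3.0 this throws
    // java.lang.IllegalArgumentException: Wrong FS: ..., expected: file:///
    df.saveAsParquetFile("s3://my-bucket/some/path")

    sc.stop()
  }
}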

Thanks.
Jim



