Hi,
I’m stuck with the same issue, but I see
org.apache.hadoop.fs.s3native.NativeS3FileSystem in hadoop-core:1.0.4
(that’s the current hadoop-client I use), and so far it is a transitive
dependency that comes from Spark itself. I’m using a custom build of Spark
1.3.1 with hadoop-client 1.0.4.
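For what it’s worth, in a custom build you can make the Hadoop client explicit instead of relying on the transitive dependency; a minimal build.sbt sketch (coordinates and versions here are assumptions, adjust them to your build):

```scala
// build.sbt sketch -- versions and coordinates are assumptions,
// adjust them to match your custom Spark build.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.1",
  // In Hadoop 1.x, NativeS3FileSystem ships inside hadoop-core,
  // which hadoop-client 1.0.4 pulls in transitively.
  "org.apache.hadoop" % "hadoop-client" % "1.0.4"
)
```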
Thanks all...
btw, s3n load works without any issues with spark-1.3.1-built-for-hadoop
2.4.
I tried this on 1.3.1-hadoop26
sc.hadoopConfiguration.set("fs.s3n.impl",
  "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
val f = sc.textFile("s3n://bucket/file")
f.count
No, it can't find the NativeS3FileSystem class.
The NativeS3FileSystem class is in the hadoop-aws jar.
Looks like it was not on the classpath.
Cheers
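If the class really is missing from the classpath, one way to fix it is to put the jars on the driver and executor classpaths; a sketch of a spark-defaults.conf fragment follows. The jar paths and versions are assumptions and need to match your Hadoop installation (on Hadoop 2.6+, NativeS3FileSystem lives in hadoop-aws, which in turn needs the AWS SDK jar):

```
# conf/spark-defaults.conf sketch -- jar paths and versions are assumptions,
# point them at your Hadoop 2.6 installation.
spark.driver.extraClassPath   /opt/hadoop/share/hadoop/tools/lib/hadoop-aws-2.6.0.jar:/opt/hadoop/share/hadoop/tools/lib/aws-java-sdk-1.7.4.jar
spark.executor.extraClassPath /opt/hadoop/share/hadoop/tools/lib/hadoop-aws-2.6.0.jar:/opt/hadoop/share/hadoop/tools/lib/aws-java-sdk-1.7.4.jar
```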
On Thu, Apr 23, 2015 at 7:30 AM, Sujee Maniyam su...@sujee.net wrote:
Thanks all...
btw, s3n load works without any issues with spark-1.3.1-built-for-hadoop
2.4.
I tried this on 1.3.1-hadoop26
Below is my code to access s3n without problems (only for 1.3.1; there is a
bug in 1.3.0).
Configuration hadoopConf = ctx.hadoopConfiguration();
hadoopConf.set("fs.s3n.impl",
    "org.apache.hadoop.fs.s3native.NativeS3FileSystem");
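The same configuration from spark-shell, with the standard Hadoop s3n credential properties added; the fs.s3n.* key names are the usual Hadoop ones, and the bucket, path, and credential values are placeholders:

```scala
// spark-shell sketch (Spark 1.3.x) -- bucket name, path, and credentials
// are placeholders; fs.s3n.* are the standard Hadoop s3n property keys.
sc.hadoopConfiguration.set("fs.s3n.impl",
  "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "YOUR_ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "YOUR_SECRET_KEY")

val lines = sc.textFile("s3n://your-bucket/path/to/file")
println(lines.count())
```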
This thread from hadoop mailing list should give you some clue:
http://search-hadoop.com/m/LgpTk2df7822
On Wed, Apr 22, 2015 at 9:45 AM, Sujee Maniyam su...@sujee.net wrote:
Hi all
I am unable to access s3n:// URLs using sc.textFile(); I get a 'no
file system for scheme s3n' error.