Hi -
We have two namenodes set up at our company, say:
hdfs://A.mycompany.com
hdfs://B.mycompany.com
From the command line, I can do:
hadoop fs -ls hdfs://A.mycompany.com//some-dir
And
hadoop fs -ls hdfs://B.mycompany.com//some-other-dir
I'm now trying to do the same from a Java program.
I'm hoping there is a better answer, but I'm thinking you could load
another configuration file (with B.company in it) using Configuration,
grab a FileSystem obj with that and then go forward. Seems like some
unnecessary overhead though.
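For what it's worth, I don't think you need a second config file at all: FileSystem.get() has an overload that takes a URI, and the scheme/authority in that URI pick the namenode. A minimal sketch (host names are just the ones from this thread; I haven't run this against a live cluster):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TwoNameNodes {
    public static void main(String[] args) throws Exception {
        // One shared Configuration; the URI passed to get() overrides
        // the default filesystem (fs.default.name) per handle.
        Configuration conf = new Configuration();

        FileSystem fsA = FileSystem.get(URI.create("hdfs://A.mycompany.com"), conf);
        FileSystem fsB = FileSystem.get(URI.create("hdfs://B.mycompany.com"), conf);

        // Paths here are resolved relative to each namenode.
        for (FileStatus s : fsA.listStatus(new Path("/some-dir"))) {
            System.out.println(s.getPath());
        }
        for (FileStatus s : fsB.listStatus(new Path("/some-other-dir"))) {
            System.out.println(s.getPath());
        }
    }
}
```

That way the two FileSystem objects coexist without reloading anything.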
Thanks,
Tom
On Thu, Dec 8, 2011 at 2:42 PM, Frank Astier wrote:
Can you show your code here? What URL protocol are you using?
On Thu, Dec 8, 2011 at 5:47 PM, Tom Melendez t...@supertom.com wrote:
> I'm hoping there is a better answer, but I'm thinking you could load
> another configuration file (with B.company in it) using Configuration,
> grab a FileSystem

> Can you show your code here? What URL protocol are you using?
I guess I'm being very naïve (and relatively new to HDFS). I can't show too
much code, but basically, I'd like to do:
Path myPath = new Path("hdfs://A.mycompany.com//some-dir");
where Path is a Hadoop FS Path. I think I can
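A side note that may help: the namenode is picked out of the authority part of that URI string, and plain java.net.URI shows how Hadoop will split it up (this is just an illustration of the parsing, no Hadoop classes involved):

```java
import java.net.URI;

public class UriParts {
    public static void main(String[] args) {
        // The same string passed to new Path(...) above
        URI u = URI.create("hdfs://A.mycompany.com//some-dir");

        // Scheme selects the FileSystem implementation (hdfs)
        System.out.println(u.getScheme());    // hdfs
        // Authority selects which namenode to contact
        System.out.println(u.getAuthority()); // A.mycompany.com
        // The remainder is the path within that filesystem
        System.out.println(u.getPath());      // //some-dir
    }
}
```

So a fully qualified Path like the one above should already carry enough information to reach the right cluster.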
I was confused about this for a while too. I don't have all the details, but
I think my question on Stack Overflow might help you.
I was playing with different protocols, trying to find a way to
programmatically access all the data in HDFS.