A few months ago, we installed Cloudera Hadoop 3 on our local machine
and everything was fine. Recently we also installed Whirr to start
working with clusters. Although we ran into some problems, after a
while we could start up a cluster, log into its master node, and
commence work.
However, I found out recently that when I type:

hadoop dfs -ls

on our local machine, it now lists everything in my current local
directory rather than the contents of the DFS. This didn't happen
before, so we suspect something got misconfigured when we installed
Whirr.

What could have caused this, and more importantly, how can we get our
local hadoop dfs to point to the correct location?
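
In case it helps with diagnosis: my understanding is that the client
decides which filesystem "hadoop dfs" talks to via the fs.default.name
property in core-site.xml. If that property is missing or set to
file:///, the command lists the local filesystem, which matches what we
are seeing. Here is a sketch of what I believe the property should look
like; the hostname and port are just placeholders, not our actual
namenode address:

```xml
<!-- core-site.xml: tells the Hadoop client which filesystem to use.
     If fs.default.name is absent or set to file:///, "hadoop dfs -ls"
     lists the local filesystem instead of HDFS. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- placeholder namenode address; adjust for the real cluster -->
    <value>hdfs://localhost:8020</value>
  </property>
</configuration>
```

Is it possible the Whirr installation replaced or shadowed this file?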

I posted this question on a Cloudera support board and I got the
following response:

For your question, you should email the Whirr user list at
whirr-user@incubator.apache.org. I suspect you have some old jar files
on your classpath, but they'll know for sure.

I appreciate the help and time.

Thank you,
Amy Zhang
