+1 (non-binding)

Did the following on 7 RHEL 6.6 servers:
- Downloaded and built from source
- Downloaded and verified the checksum of the binary tar.gz file
- Set up a cluster with 1 NN and 6 DNs
- Tried regular HDFS commands
- Tried EC commands (listPolicies, getPolicy, setPolicy); they work fine
- Verified that with a 3-2 policy, 1.67x capacity is used

Below is the output after copying the binary tar.gz file into an EC folder. The file is 318 MB.
Configured Capacity: 3221225472 (3 GB)
Present Capacity: 3215348743 (2.99 GB)
DFS Remaining: 2655666176 (2.47 GB)
DFS Used: 559682567 (533.75 MB)

Thanks Allen for clarifying on the markdown files. I also verified the site
HTML files (content of the index.html, randomly selected some links).

On Tue, Aug 30, 2016 at 2:20 PM Eric Badger <ebad...@yahoo-inc.com.invalid> wrote:

> Well that's embarrassing. I had accidentally slightly renamed my
> log4j.properties file in my conf directory, so it was there, just not being
> read. Apologies for the unnecessary spam. With this and the public key from
> Andrew, I give my non-binding +1.
>
> Eric
>
> On Tuesday, August 30, 2016 4:11 PM, Allen Wittenauer <
> a...@effectivemachines.com> wrote:
>
> > On Aug 30, 2016, at 2:06 PM, Eric Badger <ebad...@yahoo-inc.com.INVALID>
> > wrote:
> >
> > > WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be
> > > incomplete.
> >     ^^^^^^^^^^^^^^^^^^^^^^^^^^
> >
> > > After running the above command, the RM UI showed a successful job, but
> > > as you can see, I did not have anything printed onto the command line.
> > > Hopefully this is just a misconfiguration on my part, but I figured that I
> > > would point it out just in case.
> >
> > It gave you a very important message in the output ...
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: common-dev-unsubscr...@hadoop.apache.org
> For additional commands, e-mail: common-dev-h...@hadoop.apache.org
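For readers following the capacity math at the top of the thread: with a Reed-Solomon 3-2 policy, every 3 data cells carry 2 parity cells, so raw usage should be about 5/3 ≈ 1.67x the logical file size. A minimal back-of-envelope sketch checking the figures quoted above (assuming the "MB"/"318MB" values are binary MiB, as HDFS reports them; the small excess over the ideal ratio would be cell padding and metadata):

```python
# Sanity-check the erasure-coding overhead from the `dfsadmin -report`
# output quoted in this thread.
file_size_mb = 318.0          # logical size of the copied tar.gz (MiB)
dfs_used_bytes = 559_682_567  # "DFS Used" from the report

dfs_used_mb = dfs_used_bytes / (1024 * 1024)
ratio = dfs_used_mb / file_size_mb

print(f"DFS Used: {dfs_used_mb:.2f} MiB")   # 533.75 MiB, matching the report
print(f"Raw/logical ratio: {ratio:.2f}x")   # ~1.68x vs. the ideal 5/3 = 1.67x
```

The reported 533.75 MiB is within about 1% of the ideal 318 × 5/3 = 530 MiB, consistent with the 1.67x figure in the vote.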