I hit this issue as well; it turned out to be a typo in the Hortonworks instructions.
<name>mapred.jobtracker.plugins</name>
<value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>

Should be:

<name>mapreduce.jobtracker.plugins</name>
<value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>

Mike

On Tue, Sep 10, 2013 at 2:34 PM, Edwards, Jonathan <[email protected]> wrote:
> Fellow Ambari Users,
>
> I've set up a Hortonworks cluster (HDP 1.3.2) using Ambari and have HDFS,
> MapReduce, Hive, and HBase all working nicely. I've gotten ambitious and
> manually installed Hue, and have been able to get it working, with the
> exception of JobTracker integration. The reason I'm emailing the Ambari
> user group is that, although the plugin is properly configured in
> mapred-site.xml via Ambari and the jar is present in /usr/lib/hadoop/lib
> on the JobTracker node managed by Ambari, the Thrift plugin isn't loading.
> I suspect that the jar isn't being added to the classpath by the startup
> commands issued from the Ambari server to the remote agents. Is there a
> way to view the configured classpath for the JobTracker? Is there a way
> to override the default behavior to include additional JobTracker plugin
> jars?
>
> Regards,
>
> Jonathan Edwards
> Technical Architect
> Senior Systems Engineering Consultant
> Staples, Inc.
> 1 Environmental Way | Broomfield, CO 80021
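For reference, here is how the corrected property would look as a full entry in mapred-site.xml. The surrounding <property> element and the optional <description> follow standard Hadoop config conventions; the name and value are exactly as given above:

```xml
<!-- mapred-site.xml: note the "mapreduce." prefix, not "mapred." -->
<property>
  <name>mapreduce.jobtracker.plugins</name>
  <value>org.apache.hadoop.thriftfs.ThriftJobTrackerPlugin</value>
  <!-- optional, for documentation only -->
  <description>Comma-separated list of JobTracker plugin classes to load at startup.</description>
</property>
```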
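On the classpath question quoted above: one way to see what a running JobTracker JVM was actually started with is to read its /proc/<pid>/cmdline on the JobTracker host. This is a sketch, not taken from the thread; the pgrep pattern and the paths in the demo string are assumptions, so adjust them to your install. Since the file is NUL-separated, convert NULs to newlines and print the argument following -classpath:

```shell
# On the real host you would first find the JobTracker PID, e.g.:
#   JT_PID=$(pgrep -f 'org.apache.hadoop.mapred.JobTracker')
# and then parse /proc/$JT_PID/cmdline. Here we demonstrate the parsing
# step on a fabricated cmdline file (paths are hypothetical):
printf 'java\0-classpath\0/usr/lib/hadoop/conf:/usr/lib/hadoop/lib/*\0org.apache.hadoop.mapred.JobTracker\0' > /tmp/jt_cmdline

# /proc cmdline entries are NUL-separated: convert to one arg per line,
# then show the -classpath flag and the argument that follows it.
tr '\0' '\n' < /tmp/jt_cmdline | grep -A1 '^-classpath$'
```

Note that `hadoop classpath` prints the classpath the hadoop launcher script would construct, which may differ from what Ambari's startup command actually passed to the JVM, so inspecting the live process is the more reliable check here.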
