Hi Todd,

Two problems:
- The patch in HADOOP-6152 does not apply cleanly.

- I tried an approach similar to the one described in the slides, but it did 
not work since Jetty cannot find the webapps directory.  See below:
2009-08-10 17:54:41,671 WARN org.mortbay.log: Web application not found file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs
2009-08-10 17:54:41,671 WARN org.mortbay.log: Failed startup of context org.mortbay.jetty.webapp.WebAppContext@1884a40{/,file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs}
java.io.FileNotFoundException: file:/D:/@sze/hadoop/common/c2/build/webapps/hdfs
    at org.mortbay.jetty.webapp.WebAppContext.resolveWebApp(WebAppContext.java:959)
    at org.mortbay.jetty.webapp.WebAppContext.getWebInf(WebAppContext.java:793)
    at org.mortbay.jetty.webapp.WebInfConfiguration.configureClassLoader(WebInfConfiguration.java:62)
    at org.mortbay.jetty.webapp.WebAppContext.doStart(WebAppContext.java:456)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.handler.HandlerCollection.doStart(HandlerCollection.java:152)
    at org.mortbay.jetty.handler.ContextHandlerCollection.doStart(ContextHandlerCollection.java:156)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.mortbay.jetty.handler.HandlerWrapper.doStart(HandlerWrapper.java:130)
    at org.mortbay.jetty.Server.doStart(Server.java:222)
    at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
    at org.apache.hadoop.http.HttpServer.start(HttpServer.java:464)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:362)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.activate(NameNode.java:309)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:300)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:405)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:399)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1165)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1174)
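
For what it's worth, my guess at the cause: HttpServer seems to resolve the 
webapps root from the classpath, so the classloader is finding 
common/c2/build/webapps but there is no hdfs subdirectory under it (the hdfs 
webapp gets built under hdfs/build).  A rough sketch of the lookup below -- 
this is my paraphrase of the 0.20-era HttpServer, not the exact source, and 
the class/variable names here are mine:

import java.io.IOException;
import java.net.URL;

// Sketch of how the webapps directory appears to be located (assumption:
// paraphrased from org.apache.hadoop.http.HttpServer, not the real code).
public class WebAppsProbe {
  public static void main(String[] args) throws IOException {
    // Ask the classloader for a "webapps" entry, as HttpServer does.
    URL root = WebAppsProbe.class.getClassLoader().getResource("webapps");
    if (root == null) {
      throw new IOException("webapps not found in CLASSPATH");
    }
    // Jetty is then handed <webapps root>/<daemon>, e.g. webapps/hdfs.
    // If that subdirectory does not exist, startup fails with the
    // FileNotFoundException shown above.
    System.out.println("webapps root: " + root);
    System.out.println("namenode expects: " + root + "/hdfs");
  }
}

If that reading is right, copying hdfs/build/webapps/hdfs into 
common/build/webapps (or putting hdfs/build on the classpath ahead of 
common/build) should make the lookup succeed.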

Thanks,
Nicholas

----- Original Message ----
> From: Todd Lipcon <t...@cloudera.com>
> To: common-dev@hadoop.apache.org
> Cc: hdfs-...@hadoop.apache.org; mapreduce-...@hadoop.apache.org
> Sent: Monday, August 10, 2009 5:30:52 PM
> Subject: Re: Question: how to run hadoop after the project split?
> 
> Hey Nicholas,
> 
> Aaron gave a presentation with his best guess at the HUG last month. His
> slides are here: http://www.cloudera.com/blog/2009/07/17/the-project-split/
> (starting at slide 16)
> (I'd let him reply himself, but he's out of the office this afternoon ;-) )
> 
> Hopefully we'll get towards something better soon :-/
> 
> -Todd
> 
> On Mon, Aug 10, 2009 at 5:25 PM, Tsz Wo (Nicholas), Sze <
> s29752-hadoop...@yahoo.com> wrote:
> 
> > I have to admit that I don't know the official answer.  The hack below
> > seems to work:
> > - compile all 3 sub-projects;
> > - copy everything in hdfs/build and mapreduce/build to common/build;
> > - then run hadoop by the scripts in common/bin as before.
> >
> > Any better idea?
> >
> > Nicholas Sze
> >
> >

