See inline comments.

On 5/31/11 5:37 AM, "DKN" <[email protected]> wrote:

> Hi Eric, Thanks for your response. I did get a chance to download the sources
> from the trunk and was also partially successful in setting up the chukwa
> stand-alone cluster.
> 
> So, what I am doing now is to compile and setup the 0.5.0 chukwa with
> instructions on this page : http://wiki.apache.org/hadoop/Chukwa_Quick_Start
> 
> What I could do was to add an adaptor (in the initial_adaptors) for
> monitoring a file, and I could see that the text I was appending was getting
> written into my hadoop file system. There were also multiple updates from
> other sources (systemmetrics, hadoop, hadoopmetrics, etc.).
> 
> Currently I see the following three issues to resolve before I can make more
> progress with this demo setup:
> 
> 1. I modified the hadoop_home/conf/log4j.properties as per the instructions
> on the QuickStart page. While starting hadoop I see the following messages
> (not sure if I have to worry about them now):
> 
> starting namenode, logging to /home/dev/hadoop-0.20.2//logs/hadoop-root-namenode-xtest12.out
> log4j:WARN No such property [datePattern] in org.apache.log4j.net.SocketAppender.
> log4j:WARN No such property [file] in org.apache.log4j.net.SocketAppender.
> 

SocketAppender doesn't support the File and DatePattern parameters.  In
log4j.properties, comment out:
 
log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
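
i.e. once DRFA has been switched over to the SocketAppender (which is what
your log4j:WARN lines indicate), that part of hadoop's conf/log4j.properties
would look roughly like this:

log4j.appender.DRFA=org.apache.log4j.net.SocketAppender
# log4j.appender.DRFA.File=${hadoop.log.dir}/${hadoop.log.file}
# log4j.appender.DRFA.DatePattern=.yyyy-MM-dd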
 
> 2. Apart from the chukwa client jar for hadoop, I have also copied json.jar
> and json*simple*.jar into the HADOOP_HOME/lib folder. I get the following in
> the namenode logs and other logs. What could be going on here?
> 
> Exception in thread "Timer thread for monitoring rpc" java.lang.NoClassDefFoundError: org/json/simple/JSONObject
>         at org.apache.hadoop.chukwa.inputtools.log4j.Log4JMetricsContext.emitRecord(Log4JMetricsContext.java:102)
>         at org.apache.hadoop.metrics.spi.AbstractMetricsContext.emitRecords(AbstractMetricsContext.java:306)
>         at org.apache.hadoop.metrics.spi.AbstractMetricsContext.timerEvent(AbstractMetricsContext.java:292)
>         at org.apache.hadoop.metrics.spi.AbstractMetricsContext.access$000(AbstractMetricsContext.java:52)
>         at org.apache.hadoop.metrics.spi.AbstractMetricsContext$1.run(AbstractMetricsContext.java:251)
>         at java.util.TimerThread.mainLoop(Timer.java:512)
>         at java.util.TimerThread.run(Timer.java:462)
> Caused by: java.lang.ClassNotFoundException: org.json.simple.JSONObject
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 7 more

It looks like the json-simple library is not readable in HADOOP_HOME/lib, or
hadoop needs a stop/start to pick it up.  Make sure json-simple-1.1.jar is in
HADOOP_HOME/lib and is on hadoop's classpath; json.jar should not be
necessary anymore.  Check the file permissions and restart Hadoop.  In
addition, which version of Hadoop are you using?
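
A quick sanity check would look something like this (the install path is
taken from your namenode log line, so adjust if it differs on your box):

# confirm the jar is present and readable by the user running hadoop
ls -l /home/dev/hadoop-0.20.2/lib/json-simple-1.1.jar
chmod a+r /home/dev/hadoop-0.20.2/lib/json-simple-1.1.jar

# restart the daemons so the jar lands on their classpath
cd /home/dev/hadoop-0.20.2
bin/stop-all.sh
bin/start-all.sh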

> 3. I have configured my hbase properly and was also able to execute the
> hbase.spec to list out hbase tables etc. With this, when I started ./chukwa
> hicc .. everything seemed to initialize properly for hicc with embedded
> jetty. I could load the front page for http://localhost:4080/hicc. But I
> don't get anything in column and row selection to plot a graph.
> 
> How do I debug this ?
> Do I need MySql still, with the hbase present ?

MySQL is not required.  There is a little shortcut in the code: HICC fetches
the first row of the selected time range to cache the column structure.  The
default time selection is set to 24 hours, so if the system has been running
for less than an hour, it attempts to scan hbase and use the first row from
24 hours ago as metadata, and comes up empty.  Hence, try a time range like
"Last 1 hour" to verify whether data is being written to hbase.  If there is
no data in hbase, paste the initialization part of collector.log in a
message, and we can debug further.
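
If you want to double check hbase directly, something along these lines from
the hbase shell should tell you whether rows are arriving (the table name
below is only an example; use one of the tables that hbase.spec created and
that "list" shows):

hbase shell
list                                  # tables created by hbase.spec
scan 'SystemMetrics', {LIMIT => 1}    # show one row, if any exist
count 'SystemMetrics'                 # rough row count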

Regards,
Eric
