Re: Can't construct instance of class org.apache.hadoop.conf.Configuration

2012-04-30 Thread Brock Noland
Hi, I would try this: export CLASSPATH=$(hadoop classpath) Brock On Mon, Apr 30, 2012 at 10:15 AM, Ryan Cole r...@rycole.com wrote: Hello, I'm trying to run an application, written in C++, that uses libhdfs. I have compiled the code and get an error when I attempt to run the application.
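For context, a sketch of how the suggestion is typically applied (the application name is hypothetical): libhdfs starts an embedded JVM, and that JVM resolves the HDFS classes such as org.apache.hadoop.conf.Configuration through CLASSPATH.

```shell
# `hadoop classpath` prints the jars and config directories of the local
# Hadoop install; exporting it lets the JVM embedded by libhdfs find the
# Hadoop classes at runtime.
export CLASSPATH=$(hadoop classpath)
./my_libhdfs_app   # hypothetical name of the libhdfs-linked binary
```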

[ANNOUNCE] Apache MRUnit 0.8.0-incubating released

2012-02-02 Thread Brock Noland
The Apache MRUnit team is pleased to announce the release of MRUnit 0.8.0-incubating from the Apache Incubator. This is the second release of Apache MRUnit, a Java library that helps developers unit test Apache Hadoop MapReduce jobs. The release is available here:

Re: race condition in hadoop 0.20.2 (cdh3u1)

2012-01-17 Thread Brock Noland
Hi, tl;dr DUMMY should not be static. On Tue, Jan 17, 2012 at 3:21 PM, Stan Rosenberg srosenb...@proclivitysystems.com wrote: class MyKey<T> implements WritableComparable<T> { private String ip; // first part of the key private final static Text DUMMY = new Text(); ... public void
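The hazard can be shown without any Hadoop dependency. This is a hypothetical plain-Java sketch (class and field names are illustrative) of why one static buffer shared by every key instance loses data as soon as a second key is filled in:

```java
// Contrast a static (shared) buffer with a per-instance buffer. In the
// original code DUMMY was a shared org.apache.hadoop.io.Text; here a
// StringBuilder stands in for it so the sketch runs without Hadoop.
class MyKey {
    private static final StringBuilder SHARED = new StringBuilder(); // broken: one buffer for all keys
    private final StringBuilder own = new StringBuilder();           // fixed: one buffer per key

    void readField(String value, boolean useShared) {
        StringBuilder b = useShared ? SHARED : own;
        b.setLength(0);      // mimic Text reuse: clear, then refill
        b.append(value);
    }

    String sharedValue() { return SHARED.toString(); }
    String ownValue()    { return own.toString(); }
}

public class StaticBufferDemo {
    public static void main(String[] args) {
        MyKey a = new MyKey();
        MyKey b = new MyKey();
        a.readField("1.2.3.4", true);
        b.readField("5.6.7.8", true);
        System.out.println(a.sharedValue()); // prints 5.6.7.8 -- a's data was clobbered
        a.readField("1.2.3.4", false);
        b.readField("5.6.7.8", false);
        System.out.println(a.ownValue());    // prints 1.2.3.4 -- each key keeps its own data
    }
}
```

During a sort the framework deserializes many keys; with a static buffer every instance ends up reading and comparing the same underlying object, which is the race this thread is about.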

Re: desperate question about NameNode startup sequence

2011-12-17 Thread Brock Noland
Hi, Since you're using CDH2, I am moving this to CDH-USER. You can subscribe here: http://groups.google.com/a/cloudera.org/group/cdh-user BCC'd common-user On Sat, Dec 17, 2011 at 2:01 AM, Meng Mao meng...@gmail.com wrote: Maybe this is a bad sign -- the edits.new was created before the master

Re: ArrayWritable usage

2011-12-13 Thread Brock Noland
Hi, ArrayWritable is a touch hard to use. Say you have an array of IntWritable[]. The get() method of ArrayWritable, after serialization/deserialization, does in fact return an array of type Writable. As such you cannot cast it directly to IntWritable[]. Individual elements are of type
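The cast failure is ordinary Java array behavior rather than anything Hadoop-specific. A small runnable sketch, with Integer standing in for IntWritable so it needs no Hadoop dependency:

```java
public class ArrayCastDemo {
    public static void main(String[] args) {
        // ArrayWritable.get() returns an array whose runtime type is
        // Writable[]; here Object[] plays that role and Integer plays
        // IntWritable. Every element is an Integer, but the array isn't.
        Object[] raw = new Object[] { 1, 2, 3 };
        try {
            Integer[] bad = (Integer[]) raw; // ClassCastException: Object[] is not Integer[]
            System.out.println(bad.length);
        } catch (ClassCastException e) {
            System.out.println("direct array cast fails");
        }
        // Casting each element individually works fine:
        Integer[] ints = new Integer[raw.length];
        for (int i = 0; i < raw.length; i++) {
            ints[i] = (Integer) raw[i];
        }
        System.out.println(ints[2]); // prints 3
    }
}
```

With a real ArrayWritable the same loop applies: iterate over the result of get() and cast each Writable element to IntWritable one at a time.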

Re: Question on Hadoop Streaming

2011-12-06 Thread Brock Noland
Does your job end with an error? I am guessing what you want is: -mapper bowtiestreaming.sh -file '/root/bowtiestreaming.sh' The first option says to use your script as the mapper and the second says to ship your script as part of the job. Brock On Tue, Dec 6, 2011 at 4:59 PM, Romeo Kienzler ro...@ormium.de
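A sketch of the full streaming command this implies; the streaming jar path and the input/output directories are assumptions, and only the -mapper/-file pair comes from the thread:

```shell
hadoop jar /usr/lib/hadoop/contrib/streaming/hadoop-streaming-*.jar \
    -input  /user/romeo/input \
    -output /user/romeo/output \
    -mapper bowtiestreaming.sh \
    -file   /root/bowtiestreaming.sh
# -mapper names the command each map task runs; -file ships the local
# script into every task's working directory so that name resolves there.
```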

Re: hadoop-fuse unable to find java

2011-11-29 Thread Brock Noland
Hi, This specific issue is probably more appropriate on the CDH-USER list. (BCC common-user) It looks like the JRE detection mechanism recently added to BIGTOP would have this same issue: https://issues.apache.org/jira/browse/BIGTOP-25 To resolve the immediate issue I would set an environment

Re: Hadoop Serialization: Avro

2011-11-26 Thread Brock Noland
Hi, Depending on the response you get here, you might also post the question separately on avro-user. On Sat, Nov 26, 2011 at 1:46 PM, Leonardo Urbina lurb...@mit.edu wrote: Hey everyone, First time posting to the list. I'm currently writing a hadoop job that will run daily and whose output

Re: HDFS DataNode daily log growing really high and fast

2011-10-31 Thread Brock Noland
Hi, On Mon, Oct 31, 2011 at 12:59 AM, Ronen Itkin ro...@taykey.com wrote: For instance, yesterday's daily log: /var/log/hadoop/hadoop-hadoop-datanode-ip-10-10-10-4.log on the problematic Node03 was 1.1 GB in size, while on the other nodes the same log was 87 MB. Again,

Re: Using KeyValueInputFormat as a Input format

2011-10-25 Thread Brock Noland
Hi, On Sun, Oct 23, 2011 at 10:40 AM, Varun Thacker varunthacker1...@gmail.com wrote: I am having trouble using KeyValueInputFormat as an input format. I used both hadoop 0.20.1 and 0.21.0 and get an error while using it. This seems to be because of this issue -

Re: implementing comparable

2011-10-16 Thread Brock Noland
Hi, Inline.. On Sun, Oct 16, 2011 at 9:40 PM, Keith Thompson kthom...@binghamton.edu wrote: Thanks. I went back and changed to WritableComparable instead of just Comparable. So, I added the readFields and write methods. I also took care of the typo in the constructor. :P Now I am
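A runnable sketch of the shape such a key class usually takes. To stay self-contained it implements only java.lang.Comparable plus the write/readFields pair from the Writable contract; in a real job the class would instead declare implements WritableComparable<CompositeKey> from org.apache.hadoop.io, and the field names here are hypothetical:

```java
import java.io.*;

public class CompositeKey implements Comparable<CompositeKey> {
    private String ip;
    private long timestamp;

    // Hadoop instantiates keys reflectively, so a no-arg constructor is required.
    public CompositeKey() { }

    public CompositeKey(String ip, long ts) {
        this.ip = ip;
        this.timestamp = ts;
    }

    public void write(DataOutput out) throws IOException {
        out.writeUTF(ip);
        out.writeLong(timestamp);
    }

    public void readFields(DataInput in) throws IOException {
        ip = in.readUTF();          // must read fields in the exact
        timestamp = in.readLong();  // order write() emitted them
    }

    @Override
    public int compareTo(CompositeKey o) {
        int c = ip.compareTo(o.ip);
        return c != 0 ? c : Long.compare(timestamp, o.timestamp);
    }

    public String getIp() { return ip; }

    public static void main(String[] args) throws IOException {
        // Round-trip through bytes, the way the shuffle does:
        CompositeKey k1 = new CompositeKey("10.0.0.1", 42L);
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        k1.write(new DataOutputStream(bytes));
        CompositeKey k2 = new CompositeKey();
        k2.readFields(new DataInputStream(new ByteArrayInputStream(bytes.toByteArray())));
        System.out.println(k2.getIp() + " " + (k1.compareTo(k2) == 0)); // 10.0.0.1 true
    }
}
```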

Re: implementing comparable

2011-10-15 Thread Brock Noland
Hi, Discussion below. On Sat, Oct 15, 2011 at 4:26 PM, Keith Thompson kthom...@binghamton.edu wrote: Hello, I am trying to write my very first MapReduce code. When I try to run the jar, I get this error: 11/10/15 17:17:30 INFO mapred.JobClient: Task Id :

Re: problem while running wordcount on lion x

2011-10-05 Thread Brock Noland
Hi, On Wed, Oct 5, 2011 at 7:13 PM, Jignesh Patel jign...@websoft.com wrote: I also found another problem: if I directly export from Eclipse as a jar file, then javac -jar or hadoop -jar doesn't recognize that jar. However, the same jar works well on Windows. Can you please share

Re: problem while running wordcount on lion x

2011-10-05 Thread Brock Noland
to get a listing of the jar, and: jar xf wordcountsmp/wordcount.jar to extract it. and got the error Unable to access jar file xf my jar file size is 5kb. I am feeling somehow the Eclipse export on macOS is not creating an appropriate jar. On Oct 5, 2011, at 8:16 PM, Brock Noland wrote: Hi
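For reference, the jar commands being discussed (the jar path is from the quoted message; the last line is a guess at what produced the error):

```shell
jar tf wordcountsmp/wordcount.jar   # list the contents of the jar
jar xf wordcountsmp/wordcount.jar   # extract it into the current directory
# "Unable to access jar file xf" is what `java -jar` prints when it is
# handed "xf" as the jar path, i.e. the command was likely typed as
# `java -jar xf wordcount.jar`; the correct form is:
java -jar wordcount.jar
```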

Re: Outputformat and RecordWriter in Hadoop Pipes

2011-09-20 Thread Brock Noland
Hi, On Tue, Sep 13, 2011 at 12:27 PM, Vivek K hadoop.v...@gmail.com wrote: Hi all, I am trying to build a Hadoop/MR application in C++ using hadoop-pipes. I have been able to successfully work with my own mappers and reducers, but now I need to generate output (from the reducer) in a format

Re: old problem: mapper output as sequence file

2011-09-19 Thread Brock Noland
Hi, On Mon, Sep 19, 2011 at 3:19 PM, Shi Yu sh...@uchicago.edu wrote: I am stuck again on a probably very simple problem. I couldn't generate the map output in sequence file format. I always get this error: java.io.IOException: wrong key class: org.apache.hadoop.io.Text is not class
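The usual cause of that IOException is a mismatch between the declared map output key class and what the mapper actually emits. A hedged sketch of the relevant job setup with the old mapred API of that era (the job class and the concrete key/value types are assumptions; this fragment needs hadoop-core on the classpath, so it is configuration rather than a standalone program):

```java
JobConf conf = new JobConf(MyJob.class);
conf.setOutputFormat(SequenceFileOutputFormat.class);
// When the map output types differ from the job's final output types,
// they must be declared explicitly; otherwise the framework assumes the
// final output classes and rejects the mapper's keys at write time.
conf.setMapOutputKeyClass(Text.class);
conf.setMapOutputValueClass(IntWritable.class);
conf.setOutputKeyClass(Text.class);
conf.setOutputValueClass(LongWritable.class);
```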

Re: Hadoop Streaming job Fails - Permission Denied error

2011-09-14 Thread Brock Noland
Hi, This probably belongs on mapreduce-user as opposed to common-user. I have BCC'ed the common-user group. Generally it's a best practice to ship the scripts with the job. Like so: hadoop jar /usr/lib/hadoop-0.20/contrib/streaming/hadoop-streaming-0.20.2-cdh3u0.jar -input

Re: Is it possible to access the HDFS via Java OUTSIDE the Cluster?

2011-09-05 Thread Brock Noland
Hi, On Tue, Sep 6, 2011 at 9:29 AM, Ralf Heyde ralf.he...@gmx.de wrote: Hello, I have found an HDFSClient which shows me how to access my HDFS from inside the cluster (i.e. running on a node). My idea is that different processes may write 64M chunks to HDFS from external
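A client outside the cluster only needs the Hadoop client jars and a reachable NameNode. A hedged sketch (hostname, port, path, and the buffer variable are assumptions; the property name matches the 0.20-era configuration):

```java
Configuration conf = new Configuration();
// Point the client at the remote NameNode instead of the local default:
conf.set("fs.default.name", "hdfs://namenode.example.com:8020");
FileSystem fs = FileSystem.get(conf);
// Stream an externally produced chunk into HDFS:
FSDataOutputStream out = fs.create(new Path("/user/ralf/chunk-0001"));
out.write(buffer);   // buffer: the 64M chunk assembled outside the cluster
out.close();
```

Note that the DataNodes must also be reachable from the client, since the block data is written to them directly rather than through the NameNode.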

Re: Creating a hive table for a custom log

2011-09-01 Thread Brock Noland
Hi, On Thu, Sep 1, 2011 at 9:08 AM, Raimon Bosch raimon.bo...@gmail.com wrote: Hi, I'm trying to create a table similar to apache_log, but I'm trying to avoid writing my own map-reduce task because I don't want to store my HDFS files twice. So if you're working with log lines like this:
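One way to get that without duplicating data is an external table, so Hive reads the log files where they already live. A hypothetical sketch using the contrib RegexSerDe (the columns, regex, and location are examples to adapt to the actual log format):

```sql
CREATE EXTERNAL TABLE custom_log (
  host STRING,
  request_time STRING,
  request STRING
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.contrib.serde2.RegexSerDe'
WITH SERDEPROPERTIES (
  "input.regex" = "(\\S+) \\[([^\\]]+)\\] \"([^\"]*)\""
)
LOCATION '/logs/custom/';
```

Because the table is EXTERNAL with an explicit LOCATION, nothing is copied into the Hive warehouse, and dropping the table leaves the underlying files intact.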