Unique ID of current Node

2009-12-07 Thread Peter Volk
Hi, I'm trying to create a unique identifier for the node I am sitting on, whether it is a VM or bare metal. I've been scanning the API but have not found a way to get what I was looking for. Anyone have an idea? Cheers and thanks, Peter
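One common approach (not Hadoop-specific) is to derive the identifier from attributes the JVM can read on any host, physical or virtual: for example the hostname plus the first available MAC address, hashed into a UUID. A minimal sketch, assuming `java.net.NetworkInterface` can enumerate interfaces on the node (on a VM the MAC is usually virtualized, but it is still stable and unique for that instance):

```java
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.util.Enumeration;
import java.util.UUID;

public class NodeId {
    /** Builds a stable identifier from the hostname plus the first available MAC address. */
    public static String nodeId() throws Exception {
        StringBuilder sb = new StringBuilder(InetAddress.getLocalHost().getHostName());
        Enumeration<NetworkInterface> ifaces = NetworkInterface.getNetworkInterfaces();
        while (ifaces.hasMoreElements()) {
            byte[] mac = ifaces.nextElement().getHardwareAddress();
            if (mac != null) {                 // loopback/virtual interfaces may return null
                for (byte b : mac) sb.append(String.format("%02x", b));
                break;
            }
        }
        // Hash into a name-based UUID so the ID has a fixed, opaque format
        return UUID.nameUUIDFromBytes(sb.toString().getBytes("UTF-8")).toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(nodeId());
    }
}
```

The result is deterministic for a given host, so repeated calls on the same node return the same ID.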

Reminder: Apache Hadoop Get Together Berlin - December 2009

2009-12-07 Thread Isabel Drost
As announced at ApacheCon US 09, the next Apache Hadoop Get Together Berlin is scheduled for next Wednesday. When: Wednesday, December 16, 2009 at 5:00pm. Where: newthinking store, Tucholskystr. 48, Berlin. Talks scheduled so far: Richard Hutton (nugg.ad): Moving from five days to one hour. -

Re: Start Hadoop env using JAVA or HADOOP APIs (InProcess)

2009-12-07 Thread Steve Loughran
samuellawrence wrote: Hi, I have to start the Hadoop environment using Java code (in-process). I would like to use the APIs to start it. Could anyone please give me a snippet or a link? Hi. 1. I've been starting/stopping Hadoop with SmartFrog, in-JVM. Email me directly and I will point you at

Re: Bypassing SSH

2009-12-07 Thread Mikhail Yakshin
Greets, Does anyone run Hadoop without SSH? Windows/Vista has a lot of problems with Cygwin and sshd. Unless the phase of the moon is just right and you have a magic rabbit's foot, it just doesn't work. I've spent much time trying to fix it just so I can do some Hadoop development. You don't

Hadoop DC Meetup Scheduled - Dec 15, 2009, 6:30 PM

2009-12-07 Thread Lalit Kapoor
Greetings, I would like to let everyone know that the next Hadoop DC User Group Meetup is scheduled for Tuesday, December 15th, 2009, from 6:30 to 8:30 PM on the UMD campus. Please take a look at the agenda below for details. I hope to see you there; please RSVP here:

Re: On setting up Hadoop slaves

2009-12-07 Thread Allen Wittenauer
You basically can't use the out-of-the-box start/stop scripts when you have multiple DN or TT processes per node. You'll need to hack them to support multiple confs. On 12/6/09 10:28 PM, Yuzhe Tang tristar...@gmail.com wrote: Hi, I am setting up hadoop clusters. How can I configure system
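As a sketch of what such a hack looks like in practice: `hadoop-daemon.sh` accepts a `--config` flag, so the usual workaround is one configuration directory per daemon instance. All paths and directory names below are hypothetical:

```shell
# Hypothetical layout: one conf directory per DataNode instance, each with its
# own dfs.data.dir, port numbers, and pid/log settings in hadoop-env.sh.
HADOOP_HOME=/opt/hadoop

# Start the first DataNode with its own configuration directory
$HADOOP_HOME/bin/hadoop-daemon.sh --config /etc/hadoop/conf.dn1 start datanode

# Start a second DataNode on the same host; conf.dn2 must use different
# dfs.data.dir, DataNode ports, and HADOOP_PID_DIR to avoid collisions
$HADOOP_HOME/bin/hadoop-daemon.sh --config /etc/hadoop/conf.dn2 start datanode
```

The same pattern applies to TaskTrackers; the key point is that every colliding resource (ports, data directories, pid files, logs) must differ between the two conf directories.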

Re: writing files to HDFS (from c++/pipes)

2009-12-07 Thread Prakhar Sharma
Hi Horson, Quite unfortunately, there is no documentation available for the Pipes API. It's not just that; the API itself is quite weak and unstable. Only a few of the examples given in the Pipes distro work, and it appears there are very few people who use Hadoop Map/Reduce through the Pipes API. I have myself been

RE: Re: return in map

2009-12-07 Thread Gang Luo
Thanks. It helps. -Gang - Original Message - From: Amogh Vasekar am...@yahoo-inc.com To: common-user@hadoop.apache.org common-user@hadoop.apache.org Sent: 2009/12/7 (Mon) 12:43:07 AM Subject: Re: Re: return in map Hi, If the file doesn't exist, Java will error out. For partial skips,

Re: problem on connecting two hadoop nodes

2009-12-07 Thread Mike Kendall
Try this: http://www.michael-noll.com/wiki/Running_Hadoop_On_Ubuntu_Linux_%28Multi-Node_Cluster%29 On Sat, Dec 5, 2009 at 7:33 PM, Yuzhe Tang tristar...@gmail.com wrote: Hi All, I am trying to set up a Hadoop cluster. I have started HDFS on one machine and MapReduce on the other machine.
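For reference, the split described (HDFS on one machine, MapReduce on the other) boils down to pointing every node's configuration at the right hosts. A minimal sketch for Hadoop 0.20, with hypothetical hostnames `namenode-host` and `jobtracker-host`:

```xml
<!-- core-site.xml on every node: point at the machine running HDFS -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://namenode-host:9000</value>
</property>

<!-- mapred-site.xml on every node: point at the machine running the JobTracker -->
<property>
  <name>mapred.job.tracker</name>
  <value>jobtracker-host:9001</value>
</property>
```

Both machines (and any slaves) need the same values, so the TaskTrackers on one host can reach the NameNode on the other.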

Hadoop / Data Migration Specialist Needed

2009-12-07 Thread alevin
Hello. My name is Alex Levin and I am the COO of Brilig (www.brilig.com), a startup in New York focused on the online advertising space. We are looking to hire a Hadoop / Data Migration Specialist to play a crucial role in converting new clients' data onto Brilig's service platforms. We are

Re: Why DrWho

2009-12-07 Thread hmarti2
Precisely. (Check for a covert 'TARDIS' for the acct.) HAL --Original Message-- From: Habermaas, William To: common-user@hadoop.apache.org ReplyTo: common-user@hadoop.apache.org Subject: Why DrWho Sent: Dec 7, 2009 3:30 PM I am running Hadoop-0.20.1 on a Solaris box with dfs.permissions

Re: Why DrWho

2009-12-07 Thread pavel kolodin
Two days ago I ran into the same problem. It happens because Java can't allocate the memory it needs. After that I played with the -Xmx options for these variables in hadoop-env.sh, and now they are: export HADOOP_HEAPSIZE=1000 export HADOOP_NAMENODE_OPTS=-Xmx612 -Dcom.sun.management.jmxremote

Re: hadoop idle time on terasort

2009-12-07 Thread Scott Carey
On 12/2/09 12:22 PM, Vasilis Liaskovitis vlias...@gmail.com wrote: Hi, I am using hadoop-0.20.1 to run terasort and randsort benchmarking tests on a small 8-node Linux cluster. Most runs show low (~50%) core utilization in the map and reduce phases, as well as heavy I/O

Hadoop / Data Integration Specialist Needed

2009-12-07 Thread Alex Levin
Hello. My name is Alex Levin and I am the COO of Brilig, a technology startup in New York focused on the online advertising industry. We are currently in need of a Hadoop developer for a key client services position in our fast-paced and exciting company. For more information, please see the full

Re: Why DrWho

2009-12-07 Thread Allen Wittenauer
On Solaris, you may also want to change: export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n` On 12/7/09 4:42 PM, pavel kolodin pavelkolodinhad...@gmail.com wrote: Two days ago I ran into the same problem. It happens because Java can't allocate the memory it needs. After that I played with
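Putting the two replies together, a corrected hadoop-env.sh fragment might look like the following. Note that a bare `-Xmx612` (no unit suffix) is parsed by the JVM as 612 bytes and will be rejected at startup, so the `m` suffix is required:

```shell
# hadoop-env.sh -- sketch of the settings discussed in this thread

export HADOOP_HEAPSIZE=1000          # default daemon heap size, in MB
export HADOOP_NAMENODE_OPTS="-Xmx612m -Dcom.sun.management.jmxremote"

# On Solaris, use the XPG4 id so the identity string resolves to the
# real user name instead of a bogus value:
export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n`
```

Quoting the multi-token `*_OPTS` values avoids word-splitting surprises when the scripts expand them.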

Re: writing files to HDFS (from c++/pipes)

2009-12-07 Thread Owen O'Malley
On Dec 7, 2009, at 10:44 AM, Prakhar Sharma wrote: Quite unfortunately, there is no documentation available for the Pipes API. It's not just that; the API itself is quite weak and unstable. *sigh* I agree that there should be more documentation. I'd love it if someone could write some up and

Re: writing files to HDFS (from c++/pipes)

2009-12-07 Thread Owen O'Malley
On Dec 7, 2009, at 10:05 AM, horson wrote: I want to write a file to HDFS using Hadoop Pipes. Can anyone tell me how to do that? You either use a Java OutputFormat, which is the easiest, or you use libhdfs to write to HDFS from C++. I looked at the Hadoop Pipes source and it looked

Out of Java heap space

2009-12-07 Thread Mark Kerzner
Hi, guys, first of all, I have added this section to hadoop-site.xml: <property><name>mapred.child.java.opts</name><value>-Xmx1024m</value></property> Secondly, I am running on the EC2 Hadoop clusters using the Apache distribution, and I have modified the hadoop-ec2-init-remote.sh in the

Re: Out of Java heap space

2009-12-07 Thread Mark Kerzner
Oops, 2048 instead of 1024 did it. Even though on my machine 1024 was enough - but that's not such a big puzzle. On Mon, Dec 7, 2009 at 6:26 PM, Mark Kerzner markkerz...@gmail.com wrote: Hi, guys, first of all, I have added this section to hadoop-site.xml property

hadoop-0.20 in Eclipse

2009-12-07 Thread Zhengguo 'Mike' SUN
Hi, I used to be able to create a Java project using the build.xml file that comes with the Hadoop distribution. It seems that the 0.20 version of Hadoop uses Ivy for dependency management, and now when I try to create a project using the Ant build file, Eclipse gives me an error about a problem setting the

RE: Out of Java heap space

2009-12-07 Thread Rekha Joshi
If it is Hadoop 0.20, the files to modify are core-site.xml, hdfs-site.xml and mapred-site.xml, while the default configs are in core-default.xml, hdfs-default.xml and mapred-default.xml. Also, are you saying that providing -D works with the same memory setting, but not via config? If not, for
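For the -D route mentioned above: if the job's driver goes through ToolRunner/GenericOptionsParser, a per-job override can be passed on the command line without touching any config file. The jar, class, and path names below are placeholders:

```shell
# Per-job memory override via GenericOptionsParser; works only if the
# driver class uses ToolRunner (i.e. implements Tool), since that is
# what parses the generic -D options.
hadoop jar myjob.jar MyJobClass \
    -Dmapred.child.java.opts=-Xmx2048m \
    input/ output/
```

A -D value set this way takes precedence over the site config for that one job, which makes it handy for testing heap sizes before committing them to mapred-site.xml.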

some current features in hadoop

2009-12-07 Thread Krishna Kumar
Dear All, Can anybody please tell me which features of Hadoop are currently under development or planned for the future, such as: 1. Record append 2. Snapshot 3. Erasure coding Etc. Thanks and Best Regards, Krishna Kumar

Re: some current features in hadoop

2009-12-07 Thread Todd Lipcon
On Mon, Dec 7, 2009 at 10:58 PM, Krishna Kumar krishna.ku...@nechclst.in wrote: Dear All, Can anybody please tell me which features of Hadoop are currently under development or planned for the future, such as: 1. Record append Not implemented and

Hadoop Pipes with distributed cache using dlopen

2009-12-07 Thread Upendra Dadi
Hi, I am facing some problems with using a distributed cache archive with a Pipes job. In my configuration file I have the following two properties: <property><name>mapred.create.symlink</name><value>yes</value></property> <property><name>mapred.cache.archives</name>