RE: Estimating the time of my hadoop jobs

2013-12-18 Thread Kandoi, Nikhil
Thank you everyone for your solutions; I think I see where I was making my mistake: not only was I setting up and tearing down the JVM for a single Hadoop job, I was also creating numerous Hadoop jobs to process different files that could be handled in one single job. Will try the so…
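
A minimal sketch of that consolidation using the old mapred API (the paths, job name, and mapper/reducer wiring are hypothetical, not from the thread): rather than submitting one job per file, register every file, or a glob, as input to a single job.

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class SingleJobForManyFiles {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(SingleJobForManyFiles.class);
        conf.setJobName("process-all-files");
        // One job, many inputs: each call adds another file or directory.
        FileInputFormat.addInputPath(conf, new Path("/data/file1.txt"));
        FileInputFormat.addInputPath(conf, new Path("/data/file2.txt"));
        // A glob avoids per-file calls entirely.
        FileInputFormat.addInputPath(conf, new Path("/data/batch-*"));
        FileOutputFormat.setOutputPath(conf, new Path("/output/combined"));
        // set mapper, reducer, and key/value classes as usual ...
        JobClient.runJob(conf); // a single job submission instead of one per file
      }
    }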

Re: pipes on hadoop 2.2.0 crashes

2013-12-18 Thread Silvina Caíno Lores
I tested that example as well and I'm getting the same exception, with this in stderr: Hadoop Pipes Exception: failed to open hdfs://compute-0-7-2:54310/in/file at /home/scaino/hadoop-2.2.0-maven/hadoop-tools/hadoop-pipes/src/main/native/examples/impl/wordcount-nopipe.cc:82 in WordCountReader. I fo…

Authentication issue on Hadoop 2.2.0

2013-12-18 Thread Silvina Caíno Lores
Hi everyone, I'm working with a single-node cluster and a Hadoop Pipes job that throws the following exception on execution: 13/12/18 10:44:55 INFO mapreduce.Job: Running job: job_1387359324416_0002 13/12/18 10:45:03 INFO mapreduce.Job: Job job_1387359324416_0002 running in uber mode : false 13/1…

Re: Authentication issue on Hadoop 2.2.0

2013-12-18 Thread Silvina Caíno Lores
I forgot to mention that the wordcount pipes example runs successfully. On 18 December 2013 10:50, Silvina Caíno Lores wrote: > Hi everyone, > > I'm working with a single-node cluster and a Hadoop Pipes job that throws > the following exception on execution: > > 13/12/18 10:44:55 INFO mapreduce.…

Get dynamic values in a user defined class from reducer.

2013-12-18 Thread unmesha sreeveni
Can anyone please suggest a good way? My scenario: I have 1. a Driver class 2. a Mapper class 3. a Reducer class 4. a Mid class. After the mapper completes, control goes to the reducer; from the reducer it goes back to the Driver, and from the Driver to the Mid class. So I need to get data from the Reducer class into the Mid class. For that I dec…

Re: Yarn -- one of the daemons getting killed

2013-12-18 Thread Krishna Kishore Bonagiri
Hi Vinod, Thanks for the link. I went through it, and it looks like the OOM killer picks the process with the highest oom_score. I have tried to capture the oom_score of all the YARN daemon processes after each run of my application. The first time I captured these details, I saw that the name n…
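
For anyone wanting to reproduce this measurement, Linux exposes the score at /proc/<pid>/oom_score; a small plain-Java sketch that reads it (the pid is passed as an argument):

    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class OomScore {
      public static void main(String[] args) throws Exception {
        // args[0] is the pid of a YARN daemon (ResourceManager, NodeManager, ...).
        // /proc/<pid>/oom_score is the value the OOM killer compares when
        // choosing a victim; the highest-scoring process is killed first.
        String pid = args[0];
        String score = new String(
            Files.readAllBytes(Paths.get("/proc/" + pid + "/oom_score"))).trim();
        System.out.println("oom_score for pid " + pid + ": " + score);
      }
    }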

Re: Permission problem in Junit test - Hadoop 2.2.0

2013-12-18 Thread Karim Awara
Note: I built Hadoop not in my home directory but on a different volume. -- Best Regards, Karim Ahmed Awara On Wed, Dec 18, 2013 at 10:41 AM, Karim Awara wrote: > > Still the same problem. If you notice, the unit test actually created the > directories up to $HOME_HDFS/build/test/da…

Re: Get dynamic values in a user defined class from reducer.

2013-12-18 Thread Robert Dyer
Generally speaking, static fields are not useful in Hadoop. The issue you are seeing is that the reducer is running in a separate JVM (possibly on a different node!), and thus the static value you are reading inside Mid is actually a separate instantiation of that class and field. If you have an…
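
One standard way to get a small value out of a reducer without static fields is a counter, which the framework aggregates across tasks and hands back to the driver. A sketch in the old mapred API (class, counter, and job wiring are hypothetical, matching the style of the code quoted later in the thread):

    import java.io.IOException;
    import java.util.Iterator;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;
    import org.apache.hadoop.mapred.RunningJob;

    public class CounterPassing {
      public enum MyCounters { RECORDS }   // custom counter group

      public static class Reduce extends MapReduceBase
          implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text key, Iterator<IntWritable> values,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
          int sum = 0;
          while (values.hasNext()) {
            sum += values.next().get();
          }
          // Increment a counter instead of mutating a static field.
          reporter.incrCounter(MyCounters.RECORDS, 1);
          output.collect(key, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(CounterPassing.class);
        // ... input/output paths, mapper, reducer configured here ...
        RunningJob job = JobClient.runJob(conf);
        // The driver reads the aggregated value back after the job finishes.
        long records = job.getCounters().getCounter(MyCounters.RECORDS);
        System.out.println("records reduced: " + records);
      }
    }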

Security integrity with QJM

2013-12-18 Thread Juan Carlos
I'm trying to configure an HDFS cluster with HA, Kerberos, and encryption. For HA I have used QJM with automatic failover. So far I have HA and Kerberos running properly, but I'm having problems when I try to add encryption. Specifically, when I set the property hadoop.rpc.protection in core-site.xml to somet…
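
For reference, hadoop.rpc.protection accepts three values: authentication (the default), integrity (adds checksums), and privacy (adds wire encryption), and in this era of Hadoop the client and server settings must match. A trivial sketch of the programmatic equivalent of the core-site.xml entry (class name hypothetical):

    import org.apache.hadoop.conf.Configuration;

    public class RpcProtection {
      public static void main(String[] args) {
        Configuration conf = new Configuration(); // picks up core-site.xml from the classpath
        // "privacy" enables RPC encryption; daemons and clients must agree on the value.
        conf.set("hadoop.rpc.protection", "privacy");
        System.out.println(conf.get("hadoop.rpc.protection"));
      }
    }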

compatibility between new client and old server

2013-12-18 Thread Ken Been
I am trying to make a 2.2.0 Java client work with a 1.1.2 server. The error I am currently getting is below. I'd like to know if my problem is because I have configured something wrong or because the versions are simply not compatible for what I want to do. Thanks in advance for any help. Ke…

Re: compatibility between new client and old server

2013-12-18 Thread Suresh Srinivas
2.x is a new major release, and 1.x and 2.x are not compatible: in 1.x the RPC wire protocol used Java serialization, while in 2.x it uses protobuf. A client must be compiled against 2.x and use the appropriate 2.x jars to work with a 2.x cluster. On Wed, Dec 18, 2013 at 10:45 AM, Ken B…
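
A quick sanity check on the client side is to print which Hadoop version the classpath actually resolves to (this only confirms which jars are in use; it does not bridge the protocol gap):

    import org.apache.hadoop.util.VersionInfo;

    public class ClientVersion {
      public static void main(String[] args) {
        // For a 2.2.0 cluster this should print 2.2.0, not 1.1.2.
        System.out.println("Hadoop version on classpath: " + VersionInfo.getVersion());
      }
    }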

Debug a hdfs -put command

2013-12-18 Thread Karim Awara
Hi, does anyone know how to trace the calls made by the datanode/namenode when executing a simple -put shell command? -- Best Regards, Karim Ahmed Awara
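
One approach is to raise the client-side log level, e.g. export HADOOP_ROOT_LOGGER=DEBUG,console before running the shell command, or to drive the same code path from Java with DEBUG logging enabled, as in this sketch (the local and remote paths are hypothetical):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.log4j.Level;
    import org.apache.log4j.Logger;

    public class TracePut {
      public static void main(String[] args) throws Exception {
        // DEBUG logging on the HDFS client and RPC layers prints the
        // namenode/datanode calls made while the file is written.
        Logger.getLogger("org.apache.hadoop.hdfs").setLevel(Level.DEBUG);
        Logger.getLogger("org.apache.hadoop.ipc").setLevel(Level.DEBUG);

        FileSystem fs = FileSystem.get(new Configuration());
        // Same code path as `hadoop fs -put local.txt /tmp/remote.txt`.
        fs.copyFromLocalFile(new Path("local.txt"), new Path("/tmp/remote.txt"));
        fs.close();
      }
    }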

multinode hadoop cluster on vmware

2013-12-18 Thread navaz
Hi, I want to set up a multinode Hadoop cluster on VMware. Can anybody suggest some good material for doing so? I used these instructions https://www.dropbox.com/s/05aurcp42asuktp/Chiu%20Hadoop%20Pig%20Install%20Instructions.docx to set up a single-node Hadoop cluster on VMware. Now can anybody help m…

Hadoop Pi Example in Yarn

2013-12-18 Thread -
How does the Pi example determine the number of mappers? I thought the only way to determine the number of mappers was via the number of file splits in the input. So, for instance, if the input size is 100 MB and the file split size is 20 MB, then I would expect 100/20 = 5 map tasks.

Re: Hadoop Pi Example in Yarn

2013-12-18 Thread Adam Kawa
A map task is created for each input split in your dataset. By default, an input split corresponds to a block in HDFS, i.e. if a file consists of 1 HDFS block, then 1 map task will be started; if a file consists of N blocks, then N map tasks will be started for that file (obviously, assuming a default…
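
The original question's arithmetic holds for file-based inputs; a worked version (sizes assumed, not from the thread) is below. The pi example behaves differently: its first command-line argument is the number of maps, and it writes one small input file per map, so `hadoop jar hadoop-mapreduce-examples-2.2.0.jar pi 10 1000` starts 10 map tasks regardless of block size.

    public class SplitMath {
      public static void main(String[] args) {
        long fileSize = 100L * 1024 * 1024;  // 100 MB input file
        long splitSize = 20L * 1024 * 1024;  // 20 MB split (often the HDFS block size)
        // Rounded-up division: a final partial block still gets its own split.
        long numSplits = (fileSize + splitSize - 1) / splitSize;
        System.out.println("map tasks: " + numSplits); // prints 5
      }
    }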

Re: multinode hadoop cluster on vmware

2013-12-18 Thread Adam Kawa
Maybe you can try Serengeti http://www.projectserengeti.org/ or Vagrant ( http://java.dzone.com/articles/setting-hadoop-virtual-cluster, http://blog.cloudera.com/blog/2013/04/how-to-use-vagrant-to-set-up-a-virtual-hadoop-cluster/ )? 2013/12/18 navaz > Hi > > I want to set up a multinode Hadoop…

setup() and cleanup() methods for mapred api.

2013-12-18 Thread unmesha sreeveni
Are there setup() and cleanup() methods for the mapred API? Is there sample code for reference? -- *Thanks & Regards* Unmesha Sreeveni U.B *Junior Developer*

Re: setup() and cleanup() methods for mapred api.

2013-12-18 Thread 梁李印
With the old API (mapred.*), there are no setup() and cleanup() methods, but you can use configure() and close(). Here is sample code: public static class MapClass extends MapReduceBase implements Mapper { private final static IntWritable one = new IntWritable(1); private Text wor…
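
A self-contained version of the same old-API pattern, with configure() and close() in the positions setup() and cleanup() would occupy in the new API, might look like this (a reconstruction of the classic WordCount-style mapper, not Liyin's exact code):

    import java.io.IOException;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class OldApiWordCount {
      public static class MapClass extends MapReduceBase
          implements Mapper<LongWritable, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        @Override
        public void configure(JobConf job) {
          // Old-API analogue of setup(): runs once per task, before any
          // map() calls, with access to the job configuration.
        }

        public void map(LongWritable key, Text value,
            OutputCollector<Text, IntWritable> output, Reporter reporter)
            throws IOException {
          for (String token : value.toString().split("\\s+")) {
            word.set(token);
            output.collect(word, one);
          }
        }

        @Override
        public void close() throws IOException {
          // Old-API analogue of cleanup(): runs once after the last map() call.
        }
      }
    }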

Re: setup() and cleanup() methods for mapred api.

2013-12-18 Thread unmesha sreeveni
Thanks for your reply, Liyin. close() gets executed after each reducer, right? Can we capture a dynamic value in close()? On Thu, Dec 19, 2013 at 9:20 AM, 梁李印 wrote: > With the old API (mapred.*), there are no setup() and cleanup(). But you can > use configure() and close(). Here is a sample c…
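
Yes: close() runs once per task, after the last reduce() call. To capture a value there, one option is to persist it somewhere the driver can read it afterwards, such as a per-task side file in HDFS; a sketch with hypothetical paths follows (for small numeric values, the counter approach from the other thread is simpler):

    import java.io.IOException;
    import java.util.Iterator;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;

    public class SideFileReduce extends MapReduceBase
        implements Reducer<Text, IntWritable, Text, IntWritable> {
      private JobConf conf;
      private int count = 0; // instance field, not static: lives only in this task

      @Override
      public void configure(JobConf job) {
        this.conf = job; // keep the conf so close() can reach HDFS
      }

      public void reduce(Text key, Iterator<IntWritable> values,
          OutputCollector<Text, IntWritable> output, Reporter reporter)
          throws IOException {
        while (values.hasNext()) {
          count += values.next().get();
        }
        output.collect(key, new IntWritable(count));
      }

      @Override
      public void close() throws IOException {
        // Runs once after the last reduce() call in this task.
        FileSystem fs = FileSystem.get(conf);
        Path side = new Path("/tmp/reduce-side/" + conf.get("mapred.task.id"));
        FSDataOutputStream out = fs.create(side);
        out.writeBytes("count=" + count + "\n");
        out.close();
      }
    }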

Re: Get dynamic values in a user defined class from reducer.

2013-12-18 Thread unmesha sreeveni
Thanks for your reply, Robert Dyer, and for spending your valuable time clearing my doubts. I need to pass some other values too. Reducer: public class Reduce extends MapReduceBase implements Reducer { static int cnt = 0; ArrayList ar = new ArrayList(); String data = null; public v…

File not getting created in hdfs

2013-12-18 Thread unmesha sreeveni
I have a Hadoop implementation of an algorithm. I am running it in Eclipse. My algorithm runs different MR jobs depending on the dataset (for a 413-byte file, 7 MR jobs are executed): Algorithm | |___ MR1 --> creates an intermediate0.txt file. | |___ MR2 --> creates an intermediate1…

Re: File not getting created in hdfs

2013-12-18 Thread unmesha sreeveni
An edit (continued): it completes the job, but apart from intermediate0.txt, none of the other files are getting created in HDFS by the reducer. On Thu, Dec 19, 2013 at 10:23 AM, unmesha sreeveni wrote: > > I have a Hadoop implementation of an algorithm. > > I am running it in Eclipse. >…
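
For a pipeline like this, the usual driver shape is to run each job to completion before submitting the next, so every intermediate file is fully written to HDFS before the following job reads it. A sketch in the old mapred API (paths and class name hypothetical; mapper/reducer setup omitted):

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;

    public class ChainDriver {
      public static void main(String[] args) throws Exception {
        // MR1: input -> intermediate0
        JobConf mr1 = new JobConf(ChainDriver.class);
        FileInputFormat.addInputPath(mr1, new Path("/data/input"));
        FileOutputFormat.setOutputPath(mr1, new Path("/data/intermediate0"));
        JobClient.runJob(mr1); // blocks until MR1 finishes, throws on failure

        // MR2: intermediate0 -> intermediate1; starts only once MR1's output exists
        JobConf mr2 = new JobConf(ChainDriver.class);
        FileInputFormat.addInputPath(mr2, new Path("/data/intermediate0"));
        FileOutputFormat.setOutputPath(mr2, new Path("/data/intermediate1"));
        JobClient.runJob(mr2);
        // ... and so on for the remaining jobs in the chain.
      }
    }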

count the number of rows in a table

2013-12-18 Thread Ranjini Rathinam
Hi, I need to write a MapReduce program to count the number of rows in a table. Please suggest an example. Thanks in advance. Ranjini R

Re: count the number of rows in a table

2013-12-18 Thread Ted Yu
In 0.94, see src/main/java/org/apache/hadoop/hbase/mapreduce/RowCounter.java Cheers On Wed, Dec 18, 2013 at 9:34 PM, Ranjini Rathinam wrote: > Hi, > > I need to write a MapReduce program to count the number of rows in a table. > > Please suggest an example. > > Thanks in advance. > > Ran…
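
For reference, a trimmed-down version of that pattern with the HBase MapReduce API might look like the sketch below (the table name is hypothetical, and this is modeled on the RowCounter approach, not its exact code): a map-only job that increments a counter per row and writes no output.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Scan;
    import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
    import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
    import org.apache.hadoop.hbase.mapreduce.TableMapper;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.output.NullOutputFormat;

    public class SimpleRowCounter {
      enum Counters { ROWS }

      static class RowCounterMapper extends TableMapper<ImmutableBytesWritable, Result> {
        @Override
        protected void map(ImmutableBytesWritable row, Result value, Context context) {
          context.getCounter(Counters.ROWS).increment(1); // one increment per row
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = new Job(conf, "row-count");
        job.setJarByClass(SimpleRowCounter.class);
        Scan scan = new Scan();
        scan.setCaching(500);       // fetch rows in batches to cut RPC round-trips
        scan.setCacheBlocks(false); // don't pollute the block cache during a full scan
        TableMapReduceUtil.initTableMapperJob("mytable", scan,
            RowCounterMapper.class, null, null, job);
        job.setOutputFormatClass(NullOutputFormat.class); // counter only, no output files
        job.setNumReduceTasks(0);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }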

Re: count the number of rows in a table

2013-12-18 Thread unmesha sreeveni
Which database are you using? Please look at this: https://www.inkling.com/read/hadoop-definitive-guide-tom-white-3rd/chapter-13/a-mapreduce-application-to On Thu, Dec 19, 2013 at 11:04 AM, Ranjini Rathinam wrote: > Hi, > > I need to write a MapReduce program to count the number of rows in a table.…