Re: Java Heap space Exception

2013-11-15 Thread unmesha sreeveni
Thx Rob Blah

On Fri, Nov 15, 2013 at 6:54 PM, Rob Blah wrote:
> MapReduce
> http://hortonworks.com/blog/how-to-plan-and-configure-yarn-in-hdp-2-0/
> "We’ll thus assign 4 GB for Map task Containers, and 8 GB for Reduce tasks Containers."
>
> You can try editing configuration file: mapreduce.

Re: New data on unfinalized hdfs upgrade

2013-11-15 Thread Harsh J
If you "rollback", you lose all new data.

On Sat, Nov 16, 2013 at 12:25 AM, krispyjala wrote:
> What happens if you upgrade 1.0.4 to 2.2, run some stuff that puts in new data while the upgrade is not finalized, but then revert back to 1.0.4? Will the new data also be reverted back to 1.0.4 safely?

Re: Hadoop jobtracker OOME fix applied and no OOME but JT hanged

2013-11-15 Thread Ted Yu
From the command line, can you run 'jmap -heap'?
http://download.oracle.com/javase/1.5.0/docs/tooldocs/share/jmap.html

On Fri, Nov 15, 2013 at 10:50 AM, Viswanathan J wrote:
> Hi guys,
>
> I had JT OOME in hadoop version 1.2.1 and applied the patch based on the fix given by Apache contribut

Re: Writing an application based on YARN 2.2

2013-11-15 Thread krispyjala
I also found this post to be of great help: http://tzulitai.wordpress.com/2013/08/30/yarn-applications-code-level-breakdown-client I think he has another post that covers ApplicationMaster code breakdown. -- Kris. -- View this message in context: http://hadoop-common.472056.n3.nabble.com/Wri

New data on unfinalized hdfs upgrade

2013-11-15 Thread krispyjala
What happens if you upgrade 1.0.4 to 2.2, run some stuff that puts in new data while the upgrade is not finalized, but then revert back to 1.0.4? Will the new data also be reverted back to 1.0.4 safely?

Hadoop jobtracker OOME fix applied and no OOME but JT hanged

2013-11-15 Thread Viswanathan J
Hi guys, I had a JT OOME in hadoop version 1.2.1 and applied the patch based on the fix given by Apache contributors for jira issue MAPREDUCE-5508. After applying that fix the heap size still gradually increases, and after one week the jobtracker completely hangs and slows down, but there is no JT OOME. No error

C++ code with pipes and CLapack Libraries

2013-11-15 Thread Salman Toor
Hi, I am trying to run my C++ code using hadoop-1.2.1, via pipes. The code needs to access CLAPACK libraries. My test setup is based on a single node, and CLAPACK is installed on that node. Everything compiles without any problem, but when I run a job the mapper hangs first and then failed

Hadoop HA Namenode remote access

2013-11-15 Thread Bruno Andrade
I'm configuring the Hadoop 2.2.0 stable release with an HA namenode but I don't know how to configure remote access to the cluster. I have the HA namenode configured with manual failover and I defined dfs.nameservices, and I can access hdfs with the nameservice from all the nodes included in the cluster, but no

Re: Java Heap space Exception

2013-11-15 Thread Rob Blah
MapReduce
http://hortonworks.com/blog/how-to-plan-and-configure-yarn-in-hdp-2-0/
"We’ll thus assign 4 GB for Map task Containers, and 8 GB for Reduce tasks Containers."

You can try editing the configuration file:
mapreduce.map.memory.mb
mapreduce.reduce.memory.mb
Or set these in a driver. If y
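For the first option, a minimal sketch of the corresponding mapred-site.xml entries; the property names are from this thread, and the 4 GB / 8 GB values just mirror the Hortonworks example quoted above — tune them to your cluster:

```xml
<!-- mapred-site.xml: container sizes per task type (illustrative values) -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value>
</property>
<property>
  <name>mapreduce.reduce.memory.mb</name>
  <value>8192</value>
</property>
```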

Re: Java Heap space Exception

2013-11-15 Thread Rob Blah
Question: have you tried giving more memory to the containers?

2013/11/15 unmesha sreeveni
> yes, found
> tried -D mapred.child.java.opts=-Xmx4096M on the command line:
>
> On Fri, Nov 15, 2013 at 3:01 PM, unmesha sreeveni wrote:
>> hadoop jar /home/my/hadoop2.jar /user/unmesha/inputdata /u

Re: DefaultResourceCalculator ClassNotFoundException

2013-11-15 Thread Rob Blah
"Can you check the config entry for yarn.scheduler.capacity.resource-calculator? It should point to org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator"

Answer provided by Ted Yu in thread "DefaultResourceCalculator class not found, ResourceManager fails to start."

regards
2013/1
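A sketch of what that config entry looks like in capacity-scheduler.xml, assuming the property and class names quoted from Ted Yu's answer:

```xml
<!-- capacity-scheduler.xml: resource calculator used by the CapacityScheduler -->
<property>
  <name>yarn.scheduler.capacity.resource-calculator</name>
  <value>org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator</value>
</property>
```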

Re: Folder not created using Hadoop Mapreduce code

2013-11-15 Thread unmesha sreeveni
Was due to permission issues.
http://stackoverflow.com/questions/15941108/hdfs-access-from-remote-host-through-java-api-user-authentication

On Fri, Nov 15, 2013 at 8:34 AM, unmesha sreeveni wrote:
> yes . I closed :(
>
> On Thu, Nov 14, 2013 at 8:51 PM, java8964 java8964 wrote:
>> Maybe j

Re: Java Heap space Exception

2013-11-15 Thread unmesha sreeveni
yes, found
tried -D mapred.child.java.opts=-Xmx4096M on the command line:

On Fri, Nov 15, 2013 at 3:01 PM, unmesha sreeveni wrote:
> hadoop jar /home/my/hadoop2.jar /user/unmesha/inputdata /user/unmesha/out
>
> On Fri, Nov 15, 2013 at 2:53 PM, unmesha sreeveni wrote:
>> When i tried to ex

DefaultResourceCalculator ClassNotFoundException

2013-11-15 Thread YouPeng Yang
Hi all
It's weird that my yarn resourcemanager fails to start with an exception[1]. I also did some googling; someone else encountered this problem with no solved answer. I checked the src; there is actually no DefaultResourceCalculator in package org.apache.hadoop.yarn.server.resourcema

Re: How to write the contents from mapper into file

2013-11-15 Thread unmesha sreeveni
[solved] It was due to a permission issue.

On Wed, Nov 13, 2013 at 11:27 PM, Rahul Bhattacharjee <rahul.rec@gmail.com> wrote:
> If you have a map only job, then the output of the mappers would be written by hadoop itself.
>
> thanks,
> Rahul
>
> On Wed, Nov 13, 2013 at 9:50 AM, Sahil Agar

Re: Java Heap space Exception

2013-11-15 Thread unmesha sreeveni
hadoop jar /home/my/hadoop2.jar /user/unmesha/inputdata /user/unmesha/out

On Fri, Nov 15, 2013 at 2:53 PM, unmesha sreeveni wrote:
> When i tried to execute my program with 100 MB file i am getting JavaHeapSpace Exception
>
> hadoop jar /home/my/hadoop2.jar /user/unmesha/inputdata /user/

Java Heap space Exception

2013-11-15 Thread unmesha sreeveni
When I tried to execute my program with a 100 MB file I got a Java heap space exception:

>hadoop jar /home/my/hadoop2.jar /user/unmesha/inputdata /user/inputdata/out

How do I increase the heap size through the command line?

--
*Thanks & Regards*
Unmesha Sreeveni U.B
*Junior Developer*
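One way to pass a larger heap on the command line, using the -D option and the 4096M value tried elsewhere in this thread; this is a sketch that assumes the job's driver goes through ToolRunner/GenericOptionsParser so that -D options are actually picked up, and the jar and paths are the ones from the original command:

```
hadoop jar /home/my/hadoop2.jar \
  -D mapred.child.java.opts=-Xmx4096M \
  /user/unmesha/inputdata /user/inputdata/out
```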

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-15 Thread unmesha sreeveni
u r most welcome :)

On Fri, Nov 15, 2013 at 12:46 PM, chandu banavaram <chandu.banava...@gmail.com> wrote:
> thanks
>
> On Thu, Nov 14, 2013 at 10:18 PM, unmesha sreeveni wrote:
>> @chandu banavaram:
>> This exception usually happens if hdfs is trying to write into a file which is no

Dealing with stragglers in hadoop

2013-11-15 Thread jamal sasha
Hi,
I have a very simple use case... Basically I have an edge list and I am trying to convert it into an adjacency list. Basically:

src target
a b
a c
b d
b e

and so on. What I am trying to build is:

a [b,c]
b [d,e]

and so on. But every now and then I hit a super node, which h
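The core edge-list-to-adjacency-list transformation can be sketched in plain Java collections; this mirrors what a reducer would see after shuffling on the source vertex, with the MapReduce wrapper omitted, and the class and method names here are illustrative, not from the post:

```java
import java.util.*;

public class AdjacencyList {
    // Group each edge (src, target) under its source vertex, preserving
    // insertion order so the output matches the input edge order.
    public static Map<String, List<String>> build(List<String[]> edges) {
        Map<String, List<String>> adj = new LinkedHashMap<>();
        for (String[] e : edges) {
            adj.computeIfAbsent(e[0], k -> new ArrayList<>()).add(e[1]);
        }
        return adj;
    }

    public static void main(String[] args) {
        List<String[]> edges = List.of(
                new String[]{"a", "b"}, new String[]{"a", "c"},
                new String[]{"b", "d"}, new String[]{"b", "e"});
        System.out.println(build(edges)); // {a=[b, c], b=[d, e]}
    }
}
```

In the MapReduce version the super-node problem shows up here as one reduce group receiving a huge value list; common mitigations are speculative execution for the straggling task or splitting the hot key across several reducers.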