about SimpleDateFormat in code

2008-12-28 Thread Leon
Hi all, I executed the following script over the Hadoop source code: find . | grep -v svn | grep java$ | xargs grep SimpleDateFormat | grep static. I got many results. The SimpleDateFormat class is not thread-safe; if an instance is declared static and shared between threads, it will cause thread-safety problems. Leon Liu
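A common fix (a minimal sketch, not from Leon's mail; the pattern string is illustrative) is to keep a single static field but give each thread its own instance via ThreadLocal:

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DateFormats {
      // SimpleDateFormat keeps mutable parse/format state, so instances
      // must not be shared across threads; ThreadLocal hands each thread
      // its own copy behind one static field.
      private static final ThreadLocal<SimpleDateFormat> FORMAT =
          new ThreadLocal<SimpleDateFormat>() {
            @Override
            protected SimpleDateFormat initialValue() {
              return new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
            }
          };

      public static String format(Date d) {
        return FORMAT.get().format(d);
      }
    }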

Re: Does anyone have a working example for using MapFiles on the DistributedCache?

2008-12-28 Thread Amareshwari Sriramadasu
Sean Shanny wrote: To all, Version: hadoop-0.17.2.1-core.jar. I have created a MapFile. What I don't seem to be able to do is correctly place the MapFile in the DistributedCache and then make use of it in a map method. I need the following info please: 1. How and where to place the MapFi
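For context, a minimal sketch of one way to wire this up against the 0.17-era API (LookupMapper, lookup.zip, and the lookup.map directory name are hypothetical). Because a MapFile is a directory (a data file plus an index file), it is commonly zipped and shipped with DistributedCache.addCacheArchive(new URI("/user/me/lookup.zip"), conf) at job setup, then opened from the unpacked local copy in the mapper:

    import java.io.IOException;

    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.MapFile;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    public class LookupMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      private MapFile.Reader reader;

      @Override
      public void configure(JobConf job) {
        try {
          // Archives registered at job setup are unpacked onto each
          // tasktracker's local disk.
          Path[] archives = DistributedCache.getLocalCacheArchives(job);
          FileSystem localFs = FileSystem.getLocal(job);
          reader = new MapFile.Reader(localFs,
              new Path(archives[0], "lookup.map").toString(), job);
        } catch (IOException e) {
          throw new RuntimeException("could not open cached MapFile", e);
        }
      }

      public void map(LongWritable key, Text value,
          OutputCollector<Text, Text> out, Reporter reporter)
          throws IOException {
        Text hit = new Text();
        // Probe the MapFile for the current record's value.
        if (reader.get(new Text(value.toString()), hit) != null) {
          out.collect(value, hit);
        }
      }

      @Override
      public void close() throws IOException {
        reader.close();
      }
    }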

Re: OutOfMemory Error, in spite of large amounts provided

2008-12-28 Thread Amareshwari Sriramadasu
Saptarshi Guha wrote: Caught it in action. Running ps -e -o 'vsz pid ruser args' | sort -nr | head -5 on a machine where the map task was running: 04812 16962 sguha /home/godhuli/custom/jdk1.6.0_11/jre/bin/java -Djava.library.path=/home/godhuli/custom/hadoop/bin/../lib/native/Linux-amd64-64:/home

Re: OutOfMemory Error, in spite of large amounts provided

2008-12-28 Thread Saptarshi Guha
Caught it in action. Running ps -e -o 'vsz pid ruser args' | sort -nr | head -5 on a machine where the map task was running: 04812 16962 sguha /home/godhuli/custom/jdk1.6.0_11/jre/bin/java -Djava.library.path=/home/godhuli/custom/hadoop/bin/../lib/native/Linux-amd64-64:/home/godhuli/custom/hdfs/map

Re: OutOfMemory Error, in spite of large amounts provided

2008-12-28 Thread Saptarshi Guha
On Sun, Dec 28, 2008 at 4:33 PM, Brian Bockelman wrote: > Hey Saptarshi, > > Watch the running child process while using "ps", "top", or Ganglia > monitoring. Does the map task actually use 16GB of memory, or is the memory > not getting set properly? > > Brian I haven't figured out how to run ga

run two different jobs simultaneously

2008-12-28 Thread 叶忠强
Hello, I have two different data sources, but they have to be processed simultaneously. How should I design this system using Hadoop? Thanks very much. Jerry
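One common pattern (a minimal sketch, not from the thread; the paths, SourceAMapper, SourceBMapper, and JoinReducer are hypothetical, and MultipleInputs requires a recent Hadoop release) is to bind each source to its own mapper that normalizes records to a shared key, so a single reducer sees both sources together:

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.TextInputFormat;
    import org.apache.hadoop.mapred.lib.MultipleInputs;

    public class TwoSourceJob {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(TwoSourceJob.class);
        conf.setJobName("two-source-join");
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(Text.class);

        // Each input directory gets its own mapper class; both mappers
        // must emit the same key/value types.
        MultipleInputs.addInputPath(conf, new Path("/data/sourceA"),
            TextInputFormat.class, SourceAMapper.class);
        MultipleInputs.addInputPath(conf, new Path("/data/sourceB"),
            TextInputFormat.class, SourceBMapper.class);

        conf.setReducerClass(JoinReducer.class);
        FileOutputFormat.setOutputPath(conf, new Path("/data/joined"));
        JobClient.runJob(conf);
      }
    }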

Re: OutOfMemory Error, in spite of large amounts provided

2008-12-28 Thread Brian Bockelman
Hey Saptarshi, Watch the running child process while using "ps", "top", or Ganglia monitoring. Does the map task actually use 16GB of memory, or is the memory not getting set properly? Brian On Dec 28, 2008, at 3:00 PM, Saptarshi Guha wrote: Hello, I have work machines with 32GB and all

OutOfMemory Error, in spite of large amounts provided

2008-12-28 Thread Saptarshi Guha
Hello, I have work machines with 32GB of RAM and allocated 16GB to the heap size:
==hadoop-env.sh==
export HADOOP_HEAPSIZE=16384
==hadoop-site.xml==
mapred.child.java.opts -Xmx16384m
The same code runs when not run through Hadoop, but it fails inside a map task. Are there other places where I
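For reference, the hadoop-site.xml line above is the usual property element flattened onto one line; spelled out (with the poster's own value, not a recommendation), it would look like:

    <property>
      <name>mapred.child.java.opts</name>
      <value>-Xmx16384m</value>
    </property>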