Invalid argument for option USER_DATA_FILE

2009-08-20 Thread Harshit Kumar
Hi. When I try to execute *hadoop-ec2 launch-cluster test-cluster 2*, it runs but keeps waiting at "Waiting for instance to start". Find below the exact output as it appears on my screen: $ bin/hadoop-ec2 launch-cluster test-cluster 2 Testing for existing master in group: test-cluster Creating g…

Re: Invalid argument for option USER_DATA_FILE

2009-08-20 Thread Harshit Kumar
…best sent to common-user. > > On Thu, Aug 20, 2009 at 2:54 PM, Harshit Kumar > wrote: > > Hi > > When I try to execute *hadoop-ec2 launch-cluster test-cluster 2*, it > > runs but keeps waiting at "Waiting for instance to start", find below > > the…

Hadoop executing a custom WRITABLE type

2009-09-02 Thread Harshit Kumar
Hi. I am using a custom Writable type called Duo, a class that implements the Writable interface. The input is a text file in which each record consists of 3 words, for example: hello come here / Where are you / How are you. In the Driver class, the Mapper makes the first word the key and creates an obj…
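
For context, a minimal Duo along these lines might look like the sketch below. The two String fields and their accessors are assumptions, since the original message is truncated; the essential parts are the no-argument constructor and matching write/readFields order.

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.Writable;

// Hypothetical reconstruction of the Duo type: a Writable holding two strings.
public class Duo implements Writable {
    private String first;
    private String second;

    public Duo() {}  // Hadoop needs a no-argument constructor for deserialization

    public Duo(String first, String second) {
        this.first = first;
        this.second = second;
    }

    public String getFirst() { return first; }
    public String getSecond() { return second; }

    // Serialize the fields in a fixed order...
    public void write(DataOutput out) throws IOException {
        out.writeUTF(first);
        out.writeUTF(second);
    }

    // ...and deserialize them in exactly the same order.
    public void readFields(DataInput in) throws IOException {
        first = in.readUTF();
        second = in.readUTF();
    }

    public String toString() {
        return first + "," + second;  // what TextOutputFormat will print
    }
}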

Type mismatch in value from map: expected org.apache.hadoop.io.Text, recieved org.apache.hadoop.io.IntWritable

2009-09-07 Thread Harshit Kumar
I get this error with the WordCount program, originally copied from the examples folder in Hadoop-0.19.2. I don't understand why I get this error; everything seems set correctly. Please see the code and give feedback. import java.io.IOException; import java.util.*; import org.apache.hadoop.fs.Path; import org.…
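
Since the full code is cut off, here is a hedged guess at the usual cause of this exact message: the map output value class defaults to the job's output value class, so a job configured for Text output while the mapper emits IntWritable trips the type check. A sketch of the explicit declaration with the 0.19 JobConf API:

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;

// Sketch only: when mapper and reducer output types differ, the intermediate
// (map-side) classes must be declared; otherwise they default to the final ones.
public class WordCountConf {
    public static void main(String[] args) {
        JobConf conf = new JobConf(WordCountConf.class);
        conf.setOutputKeyClass(Text.class);          // final reducer output
        conf.setOutputValueClass(Text.class);
        conf.setMapOutputKeyClass(Text.class);       // intermediate map output
        conf.setMapOutputValueClass(IntWritable.class);
    }
}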

Please interpret the output -please help - going mad

2009-09-07 Thread Harshit Kumar
Hi. I am going mad, pulling my hair out for the past 2 days, trying to figure out the output of this map-reduce program. Please help, or you can find me in an asylum. In the map function below, output.collect(Text, Duo) - Duo is the custom Writable type from my earlier mail - the map function emits "hello, duo("hello,hello")" key-value…
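
The message is truncated, so this is only a guess at the pitfall: Hadoop reuses the Writable objects it passes to reduce(), so holding references across iterations makes every stored Duo look like the last one. A hedged sketch of the defensive copy with the 0.19 API (Duo as sketched above):

import java.io.IOException;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

// Sketch only: copy each value out of the iterator before keeping it,
// because the framework recycles the same Duo instance on every next().
public class DuoReducer extends MapReduceBase
        implements Reducer<Text, Duo, Text, Duo> {
    public void reduce(Text key, Iterator<Duo> values,
                       OutputCollector<Text, Duo> output, Reporter reporter)
            throws IOException {
        List<Duo> copies = new ArrayList<Duo>();
        while (values.hasNext()) {
            Duo d = values.next();
            copies.add(new Duo(d.getFirst(), d.getSecond())); // defensive copy
        }
        for (Duo d : copies) {
            output.collect(key, d); // now each stored value is distinct
        }
    }
}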

org.apache.hadoop.util.DiskChecker$DiskErrorException

2009-10-19 Thread Harshit Kumar
Hi. I get the following error when executing a map-reduce application on EC2. Can anyone please suggest some pointers on what is going wrong? I tried searching the mailing list, but couldn't find any help regarding this type of error. *[r...@ip-10-243-47-69 hadoop-0.19.0]# bin/hadoop jar lubm.jar…
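
The trace is cut off, but DiskChecker$DiskErrorException usually means that none of the directories in mapred.local.dir on a node is writable or has free space. A hedged sketch for inspecting the setting (the override path is hypothetical, and on a real cluster this property lives in the node-side hadoop-site.xml, not in job code):

import org.apache.hadoop.mapred.JobConf;

// Sketch only: print the local dirs Hadoop will try; DiskErrorException is
// thrown when the disk check fails for all of them (missing, read-only, full).
public class LocalDirProbe {
    public static void main(String[] args) {
        JobConf conf = new JobConf();
        System.out.println("mapred.local.dir = " + conf.get("mapred.local.dir"));
        conf.set("mapred.local.dir", "/mnt/hadoop/mapred/local"); // hypothetical path
    }
}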

Re: Eclipse plugin for Hadoop

2009-10-21 Thread Harshit Kumar
I don't know why you need to download the plug-in from some site. The plug-in is included in the downloaded Hadoop tar file; you can find the plugin jar inside /your-directory-structure/hadoop-version/contrib/eclipse-plugin/. - H. Kumar 2009/10/22 Vandana Ayyalasomayajula > Hi All, > > I was co…

S3 Exception

2009-10-25 Thread Harshit Kumar
Hi. There is 1 GB of rdf/owl files that I am processing on EC2. Execution throws the following exception: --- 08/11/19 16:08:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. org.apache.hadoop.fs.s3.S3Except…
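
Incidentally, the WARN line is unrelated to the S3 failure; it only means the driver class does not implement Tool. A hedged sketch of the recommended pattern (the class name is hypothetical):

import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Sketch only: going through ToolRunner lets GenericOptionsParser handle
// -D, -fs, -jt and friends, which silences the JobClient warning.
public class MyJob extends Configured implements Tool {
    public int run(String[] args) throws Exception {
        // ... build the JobConf and submit the job here ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new MyJob(), args));
    }
}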

S3 Exception for a Map Reduce job on EC2

2009-10-28 Thread Harshit Kumar
Hi. There is 1 GB of rdf/owl files that I am processing on EC2. Execution throws the following exception: --- 08/11/19 16:08:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. org.apache.hadoop.fs.s3.S3Except…

Re: Http 404 Error while viewing tasktracker in Browser

2009-11-02 Thread Harshit Kumar
Can you please share the URL that you typed? Also, the port may be blocked by a firewall; check the open ports on your system (the TaskTracker web UI normally listens on port 50060, the JobTracker on 50030). Cheers H. Kumar skype: harshit900 Blog: http://harshitkumar.wordpress.com Website: http://kumarharmuscat.tripod.com 2009/11/2 Abhilaash > > Hi all, > > I have…

Re: Http 404 Error while viewing tasktracker in Browser

2009-11-03 Thread Harshit Kumar
Hi. Another possibility is that the Hadoop JobTracker failed to start; check whether the NameNode, SecondaryNameNode, TaskTracker, and JobTracker are all up (for example with the jps command). Cheers H. Kumar Phone(Mobile): +82-10-2892-9663 Phone(Office): +82-31- skype: harshit900 Blog: http://harshitkumar.wordpress.com Website: http://kumarharmus…

Re: Hadoop User Group (Bay Area) - next Wednesday (Nov 18th) at Yahoo!

2009-11-10 Thread Harshit Kumar
Hi. Is there any consideration of making this event available to global members of the Hadoop community, for instance by streaming it live, or at least recording the event and uploading it to YouTube? Regards H. Kumar Phone(Mobile): +82-10-2892-9663 Phone(Office): +82-31- skype: harshit900 Blog: http://harshitkumar…

Re: Values returned by Map to Reducer

2009-11-16 Thread Harshit Kumar
Yes. H. Kumar Phone(Mobile): +82-10-2892-9663 Phone(Office): +82-31- skype: harshit900 Blog: http://harshitkumar.wordpress.com Website: http://kumarharmuscat.tripod.com 2009/11/17 Something Something > Does Hadoop Mapreduce guarantee that the *values* returned by Mapper to the > Reducer are sor…

Re: Values returned by Map to Reducer

2009-11-16 Thread Harshit Kumar
Oh yes, you are right. I replied to this mail saying yes, but that yes was about the keys, which are sorted; the values are not. Sorry for the confusion. H. Kumar Phone(Mobile): +82-10-2892-9663 Phone(Office): +82-31- skype: harshit900 Blog: http://harshitkumar.wordpress.com Website: http://kumarharmuscat.tripod.c…
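
For anyone who does need sorted values, the standard technique is a secondary sort: fold the value into a composite key, sort on both parts, but group and partition on the natural key only. A hedged sketch with the 0.19-era API (all class and field names here are hypothetical):

import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.io.WritableComparator;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.Partitioner;

public class SecondarySortSketch {

    // Composite key: the natural key (word) plus the value to sort by (num).
    public static class CompositeKey implements WritableComparable<CompositeKey> {
        String word = "";
        int num;

        public void write(DataOutput out) throws IOException {
            out.writeUTF(word);
            out.writeInt(num);
        }

        public void readFields(DataInput in) throws IOException {
            word = in.readUTF();
            num = in.readInt();
        }

        // Full ordering: by word, then by num, so values reach reduce() sorted.
        public int compareTo(CompositeKey o) {
            int c = word.compareTo(o.word);
            if (c != 0) return c;
            return num < o.num ? -1 : (num == o.num ? 0 : 1);
        }
    }

    // Grouping comparator: records with the same word share one reduce() call.
    public static class GroupComparator extends WritableComparator {
        public GroupComparator() {
            super(CompositeKey.class, true);
        }

        public int compare(WritableComparable a, WritableComparable b) {
            return ((CompositeKey) a).word.compareTo(((CompositeKey) b).word);
        }
    }

    // Partitioner: partition on the word alone so one reducer sees all of it.
    public static class WordPartitioner
            implements Partitioner<CompositeKey, IntWritable> {
        public void configure(JobConf job) {}

        public int getPartition(CompositeKey key, IntWritable value, int parts) {
            return (key.word.hashCode() & Integer.MAX_VALUE) % parts;
        }
    }

    public static void wire(JobConf conf) {
        conf.setMapOutputKeyClass(CompositeKey.class);
        conf.setMapOutputValueClass(IntWritable.class);
        conf.setOutputValueGroupingComparator(GroupComparator.class);
        conf.setPartitionerClass(WordPartitioner.class);
    }
}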

attempt_201001221636_0001_m_000003_0, Status : FAILED java.io.FileNotFoundException:

2010-01-21 Thread Harshit Kumar
Hi. I would appreciate it if someone could help resolve the following error; I am going nuts over this. $ bin/hadoop jar sparqlcloud.jar org.bike.MainClass 10/01/22 16:38:27 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 10/01/…

Re: attempt_201001221636_0001_m_000003_0, Status : FAILED java.io.FileNotFoundException:

2010-01-22 Thread Harshit Kumar
…Windows namespace, so Hadoop cannot find the /work/temp folder? I would like to hear some feedback from the community. Thanks H. Kumar 2010/1/22 Harshit Kumar > Hi > > I would appreciate it if someone could help resolve the following error; I am > going nuts over this. > > $ bin/hadoop jar sp…
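
If the cluster is indeed running under Cygwin on Windows, as the truncated question suggests, FileNotFoundException on task-attempt files often comes from temp paths that resolve differently in the Windows and Cygwin views. A hedged sketch of the properties to check (the path is hypothetical, and on a real cluster these settings belong in the node-side hadoop-site.xml rather than job code):

import org.apache.hadoop.mapred.JobConf;

// Sketch only: print where Hadoop will put temporary and local task files;
// on Windows/Cygwin these should point at directories valid in both views.
public class TmpDirProbe {
    public static void main(String[] args) {
        JobConf conf = new JobConf();
        System.out.println("hadoop.tmp.dir   = " + conf.get("hadoop.tmp.dir"));
        System.out.println("mapred.local.dir = " + conf.get("mapred.local.dir"));
        conf.set("hadoop.tmp.dir", "/cygdrive/c/hadoop/tmp"); // hypothetical path
    }
}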

java.io.IOException: Cannot open filename /user/root/�s�t�e�p�1�/�p�a�r�t�-�0�0�0�0�0

2010-02-04 Thread Harshit Kumar
Hi. I don't understand the reason for this error. java.io.IOException: Cannot open filename /user/root/�s�t�e�p�1�/�p�a�r�t�-�0�0�0�0�0 at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.openInfo(DFSClient.java:1394) at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.<init>(DFSClient.java:1385) at org.ap…
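
One plausible reading of the � between every character: the path "step1/part-00000" was at some point turned into UTF-16 bytes and read back as single-byte text, putting a NUL before each letter, so HDFS is asked for a filename that does not exist. A hedged sketch that reproduces the symptom:

import java.io.UnsupportedEncodingException;

// Sketch only: re-encoding an ASCII path through UTF-16BE inserts a 0x00
// byte before every character, which renders as the �X�Y pattern in the error.
public class EncodingDemo {
    public static void main(String[] args) throws UnsupportedEncodingException {
        String clean = "step1/part-00000";
        byte[] utf16 = clean.getBytes("UTF-16BE");        // 0x00 before each char
        String mangled = new String(utf16, "ISO-8859-1"); // wrong charset back
        System.out.println(mangled); // NUL-interleaved version of the clean path
    }
}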