Re: Running hadoop mapreduce examples with input format specifier

2022-05-02 Thread Ayush Saxena
jar hadoop-3.3.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.1.jar join -inFormat org.apache.hadoop.mapreduce.lib.input.TextInputFormat /examples-input/ /examples-output/, I get the error - java.lang.Exception: java.io.IOException: wrong key class: org.apache.hadoop.io.

Re: Running hadoop mapreduce examples with input format specifier

2022-05-01 Thread Pratyush Das
With the invocation - hadoop jar hadoop-3.3.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.1.jar join -inFormat org.apache.hadoop.mapreduce.lib.input.TextInputFormat /examples-input/ /examples-output/, I get the error - java.lang.Exception: java.io.IOException: wrong key class

Re: Running hadoop mapreduce examples with input format specifier

2022-05-01 Thread Ayush Saxena
ecuting the Join.java example in the Hadoop Mapreduce Examples jar using the following invocation - hadoop jar hadoop-3.3.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.1.jar join -inFormat TextInputFormat /examples-input/ /examples-output/ I keep getting an erro

Running hadoop mapreduce examples with input format specifier

2022-05-01 Thread Pratyush Das
Hi, I tried executing the Join.java example in the Hadoop Mapreduce Examples jar using the following invocation - hadoop jar hadoop-3.3.1/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.1.jar join -inFormat TextInputFormat /examples-input/ /examples-output/ I keep getting an error

CVE-2017-15713: Apache Hadoop MapReduce job history server vulnerability

2018-01-19 Thread Jason Lowe
CVE-2017-15713: Apache Hadoop MapReduce job history server vulnerability Severity: Severe Vendor: The Apache Software Foundation Versions Affected: Hadoop 0.23.0 to 0.23.11 Hadoop 2.0.0-alpha to 2.8.2 Hadoop 3.0.0-alpha to 3.0.0-beta1 Users affected: Users running the MapReduce job

Re: hadoop mapreduce job rest api

2015-12-25 Thread Drake민영근
Maybe this?: http://hadoop.apache.org/docs/r2.7.1/hadoop-yarn/hadoop-yarn-site/ResourceManagerRest.html#Cluster_Applications_APISubmit_Application Drake 민영근 Ph.D kt NexR On Thu, Dec 24, 2015 at 3:04 PM, Artem Ervits wrote: > Take a look at webhcat api > On Dec 24, 2015 12:50 AM, "ram kumar" wr

Re: hadoop mapreduce job rest api

2015-12-23 Thread Artem Ervits
Take a look at webhcat api On Dec 24, 2015 12:50 AM, "ram kumar" wrote: > Hi, > > I want to submit a mapreduce job using rest api, > and get the status of the job every n interval. > Is there a way to do it? > > Thanks >

hadoop mapreduce job rest api

2015-12-23 Thread ram kumar
Hi, I want to submit a mapreduce job using rest api, and get the status of the job every n interval. Is there a way to do it? Thanks
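The ResourceManager REST API linked in the replies above supports exactly this flow: request a new application ID, submit the application, then poll its state every n seconds. A minimal sketch of the URL layout in Python — the host and port are illustrative assumptions, and the actual POST bodies are described in the ResourceManagerRest documentation:

```python
# Sketch of the YARN ResourceManager REST endpoints used to submit a job
# and poll its status. Paths follow the Cluster Applications API docs;
# the host/port below are illustrative assumptions, not a real cluster.

RM = "http://resourcemanager:8088"

def new_application_url(rm=RM):
    # POST here first to obtain a fresh application ID.
    return rm + "/ws/v1/cluster/apps/new-application"

def submit_url(rm=RM):
    # POST the application-submission-context JSON here.
    return rm + "/ws/v1/cluster/apps"

def state_url(app_id, rm=RM):
    # GET this periodically to track the job's state
    # (NEW, ACCEPTED, RUNNING, FINISHED, FAILED, KILLED).
    return rm + "/ws/v1/cluster/apps/%s/state" % app_id

print(state_url("application_1450000000000_0001"))
```

A polling client would simply GET `state_url(...)` in a loop with a sleep between requests until the returned state is terminal.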

Hadoop Mapreduce Client Core Customization

2015-10-07 Thread Muhammad Afzal
Hi everyone, I am customizing the Hadoop MapReduce Client Core module (Hadoop 2.6.0) and adding some custom features to it. As part of the customization I have to include a third-party library in the core module. The problem is that when I add a third-party dependency in the pom.xml file, I start getting compile

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread unmesha sreeveni
Drake 민영근 Ph.D On Thu, Jan 15, 2015 at 6:05 PM, unmesha sreeveni <unmeshab...@gmail.com> wrote: Is there any way.. Waiting for a reply. I have p

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread Drake민영근
unmeshab...@gmail.com> wrote: Is there any way.. Waiting for a reply. I have posted the question every where..but none is responding back. I feel like this is the right place to ask dou

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread unmesha sreeveni
ding back. I feel like this is the right place to ask doubts. As some of u may came across the same issue and get stuck. On Thu, Jan 15, 2015 at 12:34 PM, unmesha sreeveni <unmeshab...@gmail.com> wrote:

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread Drake민영근
me of u may came across the same issue and get stuck. On Thu, Jan 15, 2015 at 12:34 PM, unmesha sreeveni <unmeshab...@gmail.com> wrote: Yes, One of my friend is implemeting the same. I know global sharing of

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread unmesha sreeveni
eeveni wrote: Yes, One of my friend is implemeting the same. I know global sharing of Data is not possible across Hadoop MapReduce. But I need to check if that can be done somehow in hadoop Mapreduce also. Because I found some papers

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-20 Thread Drake민영근
he same. I know global sharing of Data is not possible across Hadoop MapReduce. But I need to check if that can be done somehow in hadoop Mapreduce also. Because I found some papers in KNN hadoop also. And I trying to compare the performance too. Hope

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-15 Thread unmesha sreeveni
my friend is implemeting the same. I know global sharing of Data is not possible across Hadoop MapReduce. But I need to check if that can be done somehow in hadoop Mapreduce also. Because I found some papers in KNN hadoop also. And I trying to compare the performance too.

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-14 Thread unmesha sreeveni
Yes, one of my friends is implementing the same. I know global sharing of data is not possible across Hadoop MapReduce. But I need to check if that can be done somehow in Hadoop MapReduce also, because I found some papers on KNN in Hadoop too. And I am trying to compare the performance as well. Hope some

Re: How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-14 Thread Ted Dunning
Have you considered implementing this using something like Spark? That could be much easier than raw map-reduce. On Wed, Jan 14, 2015 at 10:06 PM, unmesha sreeveni wrote: In KNN like algorithm we need to load model Data into cache for predicting the records. Here is the example for KNN.

How to partition a file to smaller size for performing KNN in hadoop mapreduce

2015-01-14 Thread unmesha sreeveni
In a KNN-like algorithm we need to load the model data into a cache for predicting the records. Here is the example for KNN. [image: Inline image 1] So if the model is a large file, say 1 or 2 GB, will we be able to load it into the distributed cache? The one way is to split/partition the model Result
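The split/partition idea in the question can be sketched outside Hadoop: chop the model into chunks small enough to fit in memory, let each pass/task score records against one chunk, then merge the per-chunk nearest-neighbour candidates. A rough Python sketch (chunk size and the (distance, label) shape are illustrative assumptions):

```python
import heapq

def partition(records, chunk_size):
    # Split the model records into memory-sized chunks; each chunk would
    # be shipped (e.g. via the distributed cache) to a separate pass/task.
    return [records[i:i + chunk_size] for i in range(0, len(records), chunk_size)]

def merge_candidates(per_chunk_neighbors, k):
    # Each chunk contributes its k best (distance, label) pairs; the
    # global k nearest are the k smallest distances across all chunks.
    return heapq.nsmallest(k, (c for chunk in per_chunk_neighbors for c in chunk))

print(partition(list(range(10)), 4))  # 4 + 4 + 2 records per chunk
```

The per-chunk candidate lists are tiny (k entries each), so the final merge step is cheap even when the full model is not.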

Re: PCA in HAdoop MapReduce

2014-07-30 Thread Adaryl "Bob" Wakefield, MBA
field, MBA Principal Mass Street Analytics 913.938.6685 www.linkedin.com/in/bobwakefieldmba From: unmesha sreeveni Sent: Wednesday, July 30, 2014 1:27 AM To: User Hadoop Subject: PCA in HAdoop MapReduce ​I am little bit confused in doing PCA I am trying it myown and I refered

PCA in HAdoop MapReduce

2014-07-29 Thread unmesha sreeveni
I am a little bit confused about doing PCA. I am trying it on my own and I referred to http://nyx-www.informatik.uni-bremen.de/664/1/smith_tr_02.pdf 1. MR job to calculate the mean 2. MR job to subtract the mean from the input data (Data Adjust in the .pdf) 3. MR job to find the covariance and calculate the eigenvectors and ei
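The three MR jobs listed compute exactly this arithmetic; here it is on a toy 2-D dataset in plain Python (no Hadoop — just a sketch of the math each job would distribute):

```python
# Toy PCA pipeline mirroring the three MR jobs: mean, mean-subtraction,
# covariance. The dataset below is an arbitrary illustrative example.
data = [[2.5, 2.4], [0.5, 0.7], [2.2, 2.9]]
n = len(data)

# Job 1: per-column mean.
mean = [sum(row[j] for row in data) / n for j in range(2)]

# Job 2: subtract the mean from every record ("data adjust").
adjusted = [[row[j] - mean[j] for j in range(2)] for row in data]

# Job 3: 2x2 sample covariance matrix; its eigenvectors are the
# principal components.
cov = [[sum(a[i] * a[j] for a in adjusted) / (n - 1) for j in range(2)]
       for i in range(2)]
print(mean, cov)
```

In the real jobs the sums are accumulated in mappers/combiners and the divisions happen in the reducer, but the quantities are the same.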

Re: Huge text file for Hadoop Mapreduce

2014-07-09 Thread Stanley Shi
wrote: http://www.cs.cmu.edu/~./enron/ Not sure the uncompressed size but pretty sure it’s over a Gig. B. From: navaz Sent: Monday, July 07, 2014 6:22 PM To: user@hadoop.apache.org Subject:

Re: Huge text file for Hadoop Mapreduce

2014-07-07 Thread Du Lam
To: user@hadoop.apache.org Subject: Huge text file for Hadoop Mapreduce Hi I am running basic word count Mapreduce code. I have download a file Gettysburg.txt which is of 1486bytes. I have 3 datanodes and replication factor is set to 3. The da

Re: Huge text file for Hadoop Mapreduce

2014-07-07 Thread Adaryl "Bob" Wakefield, MBA
http://www.cs.cmu.edu/~./enron/ Not sure the uncompressed size but pretty sure it’s over a Gig. B. From: navaz Sent: Monday, July 07, 2014 6:22 PM To: user@hadoop.apache.org Subject: Huge text file for Hadoop Mapreduce Hi I am running basic word count Mapreduce code. I have download a

Huge text file for Hadoop Mapreduce

2014-07-07 Thread navaz
Hi, I am running the basic word count MapReduce code. I have downloaded a file, Gettysburg.txt, which is 1486 bytes. I have 3 datanodes and the replication factor is set to 3. The data is copied onto all 3 datanodes, but only one map task is running. All other nodes are idle. I think this is b
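The behaviour described here is expected: the number of map tasks is driven by input splits, not by the replication factor, and a 1486-byte file is far below one HDFS block (64 MB by default in that era), so it yields a single split and hence a single mapper. The arithmetic, as a simplified sketch (the real InputFormat also honours min/max split-size settings):

```python
import math

def num_splits(file_size_bytes, block_size_bytes=64 * 1024 * 1024):
    # Roughly one input split per block's worth of data; any remainder
    # still needs its own split.
    return math.ceil(file_size_bytes / block_size_bytes)

print(num_splits(1486))               # the Gettysburg.txt case: 1 split -> 1 mapper
print(num_splits(200 * 1024 * 1024))  # a 200 MB file -> 4 splits -> 4 mappers
```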

Re: hadoop-mapreduce-examples-2.2.0 : Exception from container-launch

2014-06-30 Thread Li Lu
regards, Li Lu On Jun 29, 2014, at 7:24 PM, EdwardKing wrote: I run hadoop-mapreduce-examples-2.2.0.jar,it can get correct result,but it raise error "exitCode: 1 due to: Exception from container-launch". Why? How to solve it? Thanks. [yarn@localhost sbin]

hadoop-mapreduce-examples-2.2.0 : Exception from container-launch

2014-06-29 Thread EdwardKing
I run hadoop-mapreduce-examples-2.2.0.jar; it gets the correct result, but it raises the error "exitCode: 1 due to: Exception from container-launch". Why? How to solve it? Thanks. [yarn@localhost sbin]$ hadoop jar /opt/yarn/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2

Failed to run hadoop-mapreduce-examples-2.3.0.jar

2014-06-18 Thread Tianyin Xu
last step, I follow the Wiki to run the wordcount (I didn't use ant because the hadoop-mapreduce-examples-2.3.0.jar is already built) #hadoop jar hadoop-mapreduce-project/build/hadoop-mapreduce-examples-2.3.0.jar wordcount -Dmapreduce.job.user.name=$USER in out I get the following e

[Blog] changing default key value output In Hadoop MapReduce Jobs for beginners.

2014-05-03 Thread unmesha sreeveni
Hi http://www.unmeshasreeveni.blogspot.in/2014/04/can-we-change-default-key-value-output.html This is the sample code for changing default key value output In Hadoop MapReduce Jobs for beginners. Please post your comments in blog page. Let me know your thoughts. -- *Thanks & Reg

Re: [Blog] Code For Deleting Output Folder If Exist In Hadoop MapReduce Jobs

2014-05-01 Thread unmesha sreeveni
Please see this link: http://www.unmeshasreeveni.blogspot.in/2014/04/code-for-deleting-existing-output.html On Fri, May 2, 2014 at 8:52 AM, unmesha sreeveni wrote: > Hi > >This is the sample code for Deleting Output Folder(If Exist) In Hadoop > MapReduce Jobs for beginners

[Blog] Code For Deleting Output Folder If Exist In Hadoop MapReduce Jobs

2014-05-01 Thread unmesha sreeveni
Hi This is the sample code for Deleting Output Folder(If Exist) In Hadoop MapReduce Jobs for beginners that can be included along with our MapReduce Code to work on with same output folder for debugging. Please post your comments in blog page. Let me know your thoughts Thanks unmesha

Hadoop MapReduce Streaming - how to change the final output file name with the desired name rather than in partition like: part-0000*

2014-03-19 Thread Phan, Truong Q
Hi, Could you please provide me an alternative link that explains how to change the final output file name to a desired name rather than a partition name like part-*? Can I have a sample Python code to run MapReduce Streaming with custom output file names? One helper from the A
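No sample survives in the archived thread, so here is a hedged sketch of the usual pattern: Hadoop Streaming itself always writes part-* files, and the common workaround is to rename the outputs afterwards with `hadoop fs -mv` (or use MultipleOutputs on the Java side). A minimal streaming word-count mapper in Python, which reads stdin and emits tab-separated key/value pairs (file names in the comments are illustrative):

```python
import sys

def map_line(line):
    # Emit (word, 1) pairs in streaming's expected "key\tvalue" format.
    return ["%s\t1" % word for word in line.split()]

def run(stdin, stdout):
    # The streaming framework pipes input splits to this script's stdin
    # and collects whatever it writes to stdout.
    for line in stdin:
        for pair in map_line(line):
            stdout.write(pair + "\n")

# Invoked by the framework, e.g.:
#   hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py \
#       -input /in -output /out
# then rename the partition files after the job, e.g.:
#   hadoop fs -mv /out/part-00000 /out/wordcount.txt
```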

Re: Rewriting Ab-Initio scripts using Hadoop MapReduce

2013-12-27 Thread Prashant Kommireddi
What specific info are you looking for? On Monday, December 23, 2013, Manoj Babu wrote: > Hi All, > > Can anybody share their experience on Rewriting Ab-Initio scripts using > Hadoop MapReduce? > > > Cheers! > Manoj. >

Rewriting Ab-Initio scripts using Hadoop MapReduce

2013-12-23 Thread Manoj Babu
Hi All, Can anybody share their experience on Rewriting Ab-Initio scripts using Hadoop MapReduce? Cheers! Manoj.

Re: hadoop -Mapreduce

2013-12-21 Thread Chris Mawata
umar B wrote: Hi Ranjini, Please check this mapper class whether this suites your needs. org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper Thanks and Regards,

Re: XmlInputFormat Hadoop -Mapreduce

2013-12-17 Thread Shekhar Sharma
Hello Ranjini, PFA the source code for the XML input format. Also find the output and the input which I have used. ATTACHED FILES DESCRIPTION: (1) emp.xml ---> input data for testing (2) emp_op.tar.zg --> output: results of the map-only job (I have set the number of reducers = 0) (3) src.tar --> the source f

Re: XmlInputFormat Hadoop -Mapreduce

2013-12-17 Thread Shekhar Sharma
Hi Ranjini, I have modified the code and it is working perfectly fine for me... Please mail me at shekhar2...@gmail.com and I will send you the zipped code... As for the code which you have written, I really don't understand why you are emitting the key as NullWritable from the mapper class, which doesn't make sens

XmlInputFormat Hadoop -Mapreduce

2013-12-17 Thread Ranjini Rathinam
Hi, I have attached the code. Please verify. Please suggest . I am using hadoop 0.20 version. import java.io.IOException; import java.util.logging.Level; import java.util.logging.Logger; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.FileSystem; import org.apache.hado

XmlInputFormat Hadoop -Mapreduce

2013-12-17 Thread Ranjini Rathinam
Hi, I am trying to process XML via MapReduce, and the output should be in text format. I am using hadoop 0.20. The following error has occurred; the link provided: https://github.com/studhadoop/xmlparsing-hadoop/blob/master/XmlParser11.java I have used the package org.apache.hadoop.mapreduce.lib. on

Re: Hadoop-MapReduce

2013-12-17 Thread Ranjini Rathinam
rser but this parser uses File as Object but hdfs uses FileSystem. Eg, File fXmlFile = new File("D:/elango/test.xml"); System.out.println(g); Docum

Re: Hadoop-MapReduce

2013-12-17 Thread unmesha sreeveni
DocumentBuilderFactory.newInstance(); DocumentBuilder dBuilder = dbFactory.newDocumentBuilder(); Document doc = dBuilder.parse(fXmlFile); This cant be used as hdfs, because hdfs path is accessed t

Re: Hadoop-MapReduce

2013-12-17 Thread Ranjini Rathinam
e hdfs path is accessed through FileSystem. I kindly request u to, please suggest me to fix the above issue. Thanks in advance Ranjini R

Re: Hadoop-MapReduce

2013-12-17 Thread Shekhar Sharma
I kindly request u to, please suggest me to fix the above issue. Thanks in advance Ranjini R On Tue, Dec 10, 2013 at 11:07 AM, Ranjini Rathinam wrote:

Re: Hadoop-MapReduce

2013-12-16 Thread Ranjini Rathinam
i R On Tue, Dec 10, 2013 at 11:07 AM, Ranjini Rathinam wrote: -- Forwarded message -- From: Shekhar Sharma Date: Mon, Dec 9, 2013 at 10:23 PM Subject: Re: Hadoop-MapReduce To: user@h

Re: java.io.FileNotFoundException: File does not exist: Error while running Decision Tree Hadoop MapReduce

2013-12-12 Thread unmesha sreeveni
When I export my program as a jar file and run it in the cluster, the above error is happening. But 2 files are created: C45/intermediate0.txt and C45/rule.txt. But this results in a wrong output. On Thu, Dec 12, 2013 at 3:37 PM, John Hancock wrote: Is the file actual

Re: java.io.FileNotFoundException: File does not exist: Error while running Decision Tree Hadoop MapReduce

2013-12-12 Thread unmesha sreeveni
te: Is the file actually in hdfs? Did you run "hadoop dfs -copyFromLocal "? On Wed, Dec 11, 2013 at 5:53 AM, unmesha sreeveni wrote: I am trying to run Decision Tree in Hadoop MapReduce. But it is showing java.io.FileNotFound

Re: java.io.FileNotFoundException: File does not exist: Error while running Decision Tree Hadoop MapReduce

2013-12-12 Thread John Hancock
Is the file actually in hdfs? Did you run "hadoop dfs -copyFromLocal " On Wed, Dec 11, 2013 at 5:53 AM, unmesha sreeveni wrote: > I am trying to run Decision Tree in Hadoop MapReduce. > > But it is showing java.io.FileNotFoundException: File does not exist: in > cluste

Re: hadoop -Mapreduce

2013-12-11 Thread Ranjini Rathinam
Please check this mapper class whether this suites your needs. org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper Thanks and Regards, Vinayakumar B From: Ranjini Rathinam [mailto:ranjinibe...@gmail.com] Sent: 09 D

java.io.FileNotFoundException: File does not exist: Error while running Decision Tree Hadoop MapReduce

2013-12-11 Thread unmesha sreeveni
I am trying to run Decision Tree in Hadoop MapReduce. But it is showing java.io.FileNotFoundException: File does not exist: in the cluster. But when I tried it from Eclipse it showed the correct result with the below settings: Configuration conf = new Configuration(); conf.set("fs.defa

Re: Hadoop-MapReduce

2013-12-10 Thread Ranjini Rathinam
013 at 10:23 PM Subject: Re: Hadoop-MapReduce To: user@hadoop.apache.org Cc: ssan...@datameer.com It does work i have used it long back.. BTW if it is not working, write the custom input format and implement your record reader. That would be far

Re: Hadoop-MapReduce

2013-12-09 Thread Shekhar Sharma
i am using hadoop 0.20 version and java 1.6. Please suggest. Thanks in advance. Regrads, Ranjini. R On Mon, Dec 9,

Re: Hadoop-MapReduce

2013-12-09 Thread Shekhar Sharma
ava uses unchecked or unsafe operations. Note: Recompile with -Xlint:unchecked for details. 2 errors i am using hadoop 0.20 version and java 1.6. Please suggest.

Re: Hadoop-MapReduce

2013-12-09 Thread Ranjini Rathinam
Thanks in advance. Regrads, Ranjini. R On Mon, Dec 9, 2013 at 11:08 AM, Ranjini Rathinam <ranjinibe...@gmail.com> wrote: -- Forwarded message --

Re: hadoop -Mapreduce

2013-12-08 Thread Ranjini Rathinam
ites your needs. org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper Thanks and Regards, Vinayakumar B From: Ranjini Rathinam [mailto:ranjinibe...@gmail.com] Sent: 09 December 2013 11:13 To: user@hadoop.apache.org Subject: Re: hadoop -Mapreduce

Re: Hadoop-MapReduce

2013-12-08 Thread Ranjini Rathinam
ks in advance. Regrads, Ranjini. R On Mon, Dec 9, 2013 at 11:08 AM, Ranjini Rathinam <ranjinibe...@gmail.com> wrote: -- Forwarded message -- From: Subroto Date:

Re: Hadoop-MapReduce

2013-12-08 Thread Ranjini Rathinam
e: -- Forwarded message -- From: Subroto Date: Fri, Dec 6, 2013 at 4:42 PM Subject: Re: Hadoop-MapReduce To: user@hadoop.apache.org Hi Ranjini, A good example to look into :

RE: hadoop -Mapreduce

2013-12-08 Thread Vinayakumar B
Hi Ranjini, Please check this mapper class whether this suites your needs. org.apache.hadoop.mapreduce.lib.map.MultithreadedMapper Thanks and Regards, Vinayakumar B From: Ranjini Rathinam [mailto:ranjinibe...@gmail.com] Sent: 09 December 2013 11:13 To: user@hadoop.apache.org Subject: Re: hadoop

Re: hadoop -Mapreduce

2013-12-08 Thread Ranjini Rathinam
Hi, I need to use thread functionality to run more than one mapper class in parallel. Please suggest with some sample code. Thanks in advance. On Fri, Dec 6, 2013 at 4:37 PM, Subroto wrote: Hi Ranjini, The number of mappers depend on InputSplits and which intern depends on size of inpu

Re: Hadoop-MapReduce

2013-12-06 Thread Subroto
Hi Ranjini, A good example to look into : http://www.undercloud.org/?p=408 Cheers, Subroto Sanyal On Dec 6, 2013, at 12:02 PM, Ranjini Rathinam wrote: > Hi, > > How to read xml file via mapreduce and load them in hbase and hive using java. > > Please provide sample code. > > I am using had

Re: hadoop -Mapreduce

2013-12-06 Thread Subroto
Hi Ranjini, The number of mappers depends on the InputSplits, which in turn depends on the size of the input data. The number of reducers can be configured by "mapred.reduce.tasks". Further, you can get more information on the numbers of maps and reduces for a job from: http://wiki.apache.org/hadoop/HowManyMapsA

Hadoop-MapReduce

2013-12-06 Thread Ranjini Rathinam
Hi, How to read an XML file via MapReduce and load it into HBase and Hive using Java? Please provide sample code. I am using hadoop 0.20 version and java 1.6. Which parser version should be used? Thanks in advance. Ranjini

hadoop -Mapreduce

2013-12-06 Thread Ranjini Rathinam
Hi, How to run more than one mapper and reducer in parallel? Please suggest. Thanks in advance. Ranjini.

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-02 Thread unmesha sreeveni
Thank you Yexi...Thanks for spending your valuable time. On Mon, Dec 2, 2013 at 8:22 PM, Yexi Jiang wrote: > Yes, the user is responsible for using the correct model for a given piece > of testing (or unlabeled) data. > > > 2013/12/2 unmesha sreeveni > >> To make it more general, it's better t

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-02 Thread Yexi Jiang
Yes, the user is responsible for using the correct model for a given piece of testing (or unlabeled) data. 2013/12/2 unmesha sreeveni > To make it more general, it's better to separate them. Since there might > be multiple batches of training (or to-be-label), and you only need to > train the m

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread unmesha sreeveni
To make it more general, it's better to separate them. Since there might be multiple batches of training (or to-be-label), and you only need to train the model once (if your data is stable). Ok, I will go for the second one. So if we are going for separate ones, they will not have any connection with b

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread Yexi Jiang
Actually the training and testing (or prediction) are not necessary to be done in one shot. If you need to do them consecutively in your particular scenario, you can do it as what you said. To make it more general, it's better to separate them. Since there might be multiple batches of training (or

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread unmesha sreeveni
1. I just thought of building a model using a project named, say, DT, and when a huge input comes, do another MR job test.java within DT. If not chaining jobs, we need to create separate projects, right - DT_build and DT_test projects? No need for a separate project file? 2. M1_train - dataset for training. M1

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread Yexi Jiang
What is your motivation of using chaining jobs? 2013/12/1 unmesha sreeveni > Thanks Yexi...A very nice explanation...Thanks a lot.. > Explained in a very simple way which is really understandable for > beginners..Thanks a lot. > I can go for chaining jobs right? > > > > > > On Sun, Dec 1, 2013

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread unmesha sreeveni
Thanks Yexi...A very nice explanation...Thanks a lot.. Explained in a very simple way which is really understandable for beginners..Thanks a lot. I can go for chaining jobs right? On Sun, Dec 1, 2013 at 8:55 PM, Yexi Jiang wrote: > In my opinion. > > 1. Build the decision tree model with the

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread Yexi Jiang
In my opinion. 1. Build the decision tree model with the training data. 2. Store it somewhere. 3. When the unlabeled data is available: 3.1 if the unlabeled data is huge, write another mrjob to process them, load the model at the setup stage, use the model to label the data one by one in map st

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-12-01 Thread unmesha sreeveni
Thanks Yexi, but how can it be accomplished? The input to the Decision Tree MR will be a set of data, but while predicting, a record will be a one-line data point without a class label, right? So what changes will be there in the MR job? Should we design it like this: 1. When a set of data is coming, draw the decision tree

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-30 Thread Yexi Jiang
I watched the video in it but I cannot access its source code due to permission issue. In my opinion, once the decision tree model is built, the model is small enough to be loaded into memory and can be used directly without another mrjob for prediction. The prediction can be conducted in a streami

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-30 Thread unmesha sreeveni
I have gone through a MapReduce implementation of C4.5 in http://btechfreakz.blogspot.in/2013/04/implementation-of-c45-algorithm-using.html Here a decision tree is built. So my doubt is: can we also include the prediction along with that? On Tue, Nov 26, 2013 at 8:52 AM, Yexi Jiang wrote: Y

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-25 Thread Yexi Jiang
You are welcome :) 2013/11/25 unmesha sreeveni > ok . Thx Yexi > > > On Tue, Nov 26, 2013 at 1:41 AM, Yexi Jiang wrote: > >> As far as I know, there is no ID3 implementation in mahout currently, but >> you can use the decision forest instead. >> https://cwiki.apache.org/confluence/display/MAHO

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-25 Thread unmesha sreeveni
ok . Thx Yexi On Tue, Nov 26, 2013 at 1:41 AM, Yexi Jiang wrote: > As far as I know, there is no ID3 implementation in mahout currently, but > you can use the decision forest instead. > https://cwiki.apache.org/confluence/display/MAHOUT/Breiman+Example. > > > 2013/11/25 unmesha sreeveni > >> I

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-25 Thread Yexi Jiang
As far as I know, there is no ID3 implementation in mahout currently, but you can use the decision forest instead. https://cwiki.apache.org/confluence/display/MAHOUT/Breiman+Example. 2013/11/25 unmesha sreeveni > Is that ID3 classification? > It includes prediction also? > > > On Sat, Nov 23, 2

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-24 Thread unmesha sreeveni
Is that ID3 classification? It includes prediction also? On Sat, Nov 23, 2013 at 9:01 PM, Yexi Jiang wrote: > You can directly find it at https://github.com/apache/mahout, or you can > check out from svn by following > https://cwiki.apache.org/confluence/display/MAHOUT/Version+Control. > > > 20

Re: Desicion Tree Implementation in Hadoop MapReduce

2013-11-23 Thread Yexi Jiang
You can directly find it at https://github.com/apache/mahout, or you can check out from svn by following https://cwiki.apache.org/confluence/display/MAHOUT/Version+Control. 2013/11/23 unmesha sreeveni > I want to go through Decision tree implementation in mahout. Refereed Apache > Mahout

Desicion Tree Implementation in Hadoop MapReduce

2013-11-23 Thread unmesha sreeveni
I want to go through the Decision tree implementation in Mahout. Referred: Apache Mahout 6 Feb 2012 - Apache Mahout 0.6 released. Apache Mahout has reached version 0.6. All developers are encouraged to begin using version 0.6. Highlights include: Improved Decision Tree perfor

Re: Folder not created using Hadoop Mapreduce code

2013-11-15 Thread unmesha sreeveni
ote: Maybe just a silly guess, did you close your Writer? Yong -- Date: Thu, 14 Nov 2013 12:47:13 +0530 Subject: Re: Folder not created using Hadoop Mapreduce code From: unmeshab...@gmail.com

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-15 Thread unmesha sreeveni
u r most welcome :) On Fri, Nov 15, 2013 at 12:46 PM, chandu banavaram <chandu.banava...@gmail.com> wrote: thanks On Thu, Nov 14, 2013 at 10:18 PM, unmesha sreeveni wrote: @chandu banavaram: This exception usually happens if hdfs is trying to write into a file which is no

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-14 Thread chandu banavaram
thanks On Thu, Nov 14, 2013 at 10:18 PM, unmesha sreeveni wrote: > @chandu banavaram: > This exception usually happens if hdfs is trying to write into a file > which is no more in hdfs.. > > I think in my case certain files are not created in my hdfs.it failed to > create due to some permissions

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-14 Thread unmesha sreeveni
@chandu banavaram: This exception usually happens if hdfs is trying to write into a file which is no more in hdfs. I think in my case certain files are not created in my hdfs; it failed to create due to some permissions. I am trying it out. On Wed, Nov 13, 2013 at 9:25 AM, unmesha sreeveni wrote:

Re: Folder not created using Hadoop Mapreduce code

2013-11-14 Thread unmesha sreeveni
yes . I closed :( On Thu, Nov 14, 2013 at 8:51 PM, java8964 java8964 wrote: > Maybe just a silly guess, did you close your Writer? > > Yong > > -- > Date: Thu, 14 Nov 2013 12:47:13 +0530 > Subject: Re: Folder not created using Hadoop Mapreduce

RE: Folder not created using Hadoop Mapreduce code

2013-11-14 Thread java8964 java8964
Maybe just a silly guess, did you close your Writer? Yong Date: Thu, 14 Nov 2013 12:47:13 +0530 Subject: Re: Folder not created using Hadoop Mapreduce code From: unmeshab...@gmail.com To: user@hadoop.apache.org @rab ra: ys using filesystem s mkdir() we can create folders and we can also create

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread unmesha sreeveni
@rab ra: yes, using FileSystem's mkdir() we can create folders, and we can also create one using Path in = new Path("foldername"); On Thu, Nov 14, 2013 at 12:45 PM, unmesha sreeveni wrote: i used to create folders previously like this and it created also. I dnt know why it is not happening now .

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread unmesha sreeveni
I used to create folders like this previously, and it created them. I don't know why it is not happening now. And it is working with Eclipse, i.e. I am getting the "inputfile" file within the "in" folder. On Thu, Nov 14, 2013 at 12:42 PM, unmesha sreeveni wrote: I tried hadoop fs -lsr / I did nt find th

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread unmesha sreeveni
I tried hadoop fs -lsr / and did not find it there either. On Thu, Nov 14, 2013 at 12:28 PM, Rahul Bhattacharjee <rahul.rec@gmail.com> wrote: it might be creating within the user directory of the user in hdfs. trying creating something starting with a forward slash. Thanks, Rahul O

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread Rahul Bhattacharjee
It might be getting created within the user's home directory in hdfs. Try creating something starting with a forward slash. Thanks, Rahul On Wed, Nov 13, 2013 at 10:40 PM, Amr Shahin wrote: Do you get an exception or it just fails silently? On Thu, Nov 14, 2013 at 10:27 AM, unmesha
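Rahul's point is the standard HDFS path rule: a relative Path like new Path("in") resolves against the user's HDFS home directory (/user/<username>), while a path starting with a forward slash is absolute. A sketch of that resolution in pure Python, mimicking the rule rather than calling Hadoop (the username is an illustrative assumption):

```python
def resolve_hdfs_path(path, username):
    # Absolute paths are used as-is; relative ones land under the user's
    # home directory, which is why "in" shows up at /user/<name>/in
    # rather than at the filesystem root.
    if path.startswith("/"):
        return path
    return "/user/%s/%s" % (username, path)

print(resolve_hdfs_path("in", "unmesha"))        # /user/unmesha/in
print(resolve_hdfs_path("/data/in", "unmesha"))  # /data/in
```

This is also why `hadoop fs -lsr /` from a different user account can appear to miss the folder: it was created under another user's home directory.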

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread rab ra
Unless you use FileSystem's mkdir() method, I'm not sure you can create a folder in hdfs. On 14 Nov 2013 11:58, "unmesha sreeveni" wrote: I am trying to create a file within the "in" folder, but when I tried to run this in the cluster I noticed that this "in" folder is not within hdfs. why is it so?

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread Amr Shahin
run hadoop fs -lsr / Could've been created in a location different from where you expected On Thu, Nov 14, 2013 at 10:51 AM, unmesha sreeveni wrote: > no exception Amr. just fail to create . > > > On Thu, Nov 14, 2013 at 12:10 PM, Amr Shahin wrote: > >> Do you get an exception or it just fails

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread unmesha sreeveni
no exception Amr. just fail to create . On Thu, Nov 14, 2013 at 12:10 PM, Amr Shahin wrote: > Do you get an exception or it just fails silently ? > > > On Thu, Nov 14, 2013 at 10:27 AM, unmesha sreeveni > wrote: > >> I am trying to create a file with in "in" folder. but when i tried to run >>

Re: Folder not created using Hadoop Mapreduce code

2013-11-13 Thread Amr Shahin
Do you get an exception or it just fails silently ? On Thu, Nov 14, 2013 at 10:27 AM, unmesha sreeveni wrote: > I am trying to create a file with in "in" folder. but when i tried to run > this in cluster i noticed that this "in" folder is not within hdfs. > > why is it so? > > Any thing wrong? >

Folder not created using Hadoop Mapreduce code

2013-11-13 Thread unmesha sreeveni
I am trying to create a file within the "in" folder, but when I tried to run this in the cluster I noticed that this "in" folder is not within hdfs. Why is it so? Anything wrong? My Driver code is: Path in = new Path("in"); Path input = new Path("in/inputfile"); BufferedWriter createinput = n

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-12 Thread unmesha sreeveni
:) Ok Why u also experienced the same? On Tue, Nov 12, 2013 at 5:14 PM, chandu banavaram < chandu.banava...@gmail.com> wrote: > plz send the answer to me for this query > > > On Tue, Nov 12, 2013 at 2:52 AM, unmesha sreeveni > wrote: > >> While running job with 90 Mb file i am getting LeaseExp

Re: LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-12 Thread chandu banavaram
plz send the answer to me for this query On Tue, Nov 12, 2013 at 2:52 AM, unmesha sreeveni wrote: > While running job with 90 Mb file i am getting LeaseExpiredException > > 13/11/12 15:46:41 WARN mapred.JobClient: Use GenericOptionsParser for > parsing the arguments. Applications should impleme

LeaseExpiredException : Lease mismatch in Hadoop mapReduce| How to solve?

2013-11-12 Thread unmesha sreeveni
While running job with 90 Mb file i am getting LeaseExpiredException 13/11/12 15:46:41 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same. 13/11/12 15:46:42 INFO input.FileInputFormat: Total input paths to process : 1 13/11/12

Re: Empty value in write() method : Custom Datatype in Hadoop MapReduce

2013-10-30 Thread unmesha sreeveni
Empty value in write() method : Custom Datatype in Hadoop MapReduce[Solved] On Wed, Oct 30, 2013 at 11:23 AM, unmesha sreeveni wrote: > I am emiting two 2D double array as key and value.I am in construction of > my WritableComparable class. > > > *public class MF implements Wri

Empty value in write() method : Custom Datatype in Hadoop MapReduce

2013-10-29 Thread unmesha sreeveni
I am emitting two 2D double arrays as key and value. I am in the middle of constructing my WritableComparable class. public class MF implements WritableComparable { /** * @param args */ private double[][] value; public MF() { // TODO Auto-generated constructor stub } public MF(double[][] value) { //
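The usual fix for a custom Writable wrapping a double[][] is to serialize the dimensions first, then every element, and read them back in exactly the same order — a mismatch between write() and readFields() produces empty or garbage values. The pattern, sketched in Python with struct (Hadoop's DataOutput/DataInput do the equivalent in Java; this is an illustration of the ordering, not the actual Writable API):

```python
import struct

def write_matrix(matrix):
    # Mirror of Writable.write(DataOutput): row count, column count,
    # then each value, all big-endian like Java's DataOutput.
    rows, cols = len(matrix), len(matrix[0])
    out = struct.pack(">ii", rows, cols)
    for row in matrix:
        out += struct.pack(">%dd" % cols, *row)
    return out

def read_matrix(buf):
    # Mirror of Writable.readFields(DataInput): same order, same types.
    rows, cols = struct.unpack_from(">ii", buf, 0)
    offset = 8  # past the two 4-byte ints
    matrix = []
    for _ in range(rows):
        matrix.append(list(struct.unpack_from(">%dd" % cols, buf, offset)))
        offset += 8 * cols
    return matrix

m = [[1.5, 2.5], [3.0, 4.0]]
assert read_matrix(write_matrix(m)) == m  # round-trips losslessly
```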

Re: How to make a Servlet execute a Hadoop MapReduce job and get results back

2013-07-01 Thread Dhaval Shah
You can submit a MapReduce job from Tomcat itself in blocking mode using the Java API, and read directly from HDFS using the Java API as well. No need for exec. Sent from Yahoo! Mail on Android
