How to share variables between jobs

2012-02-08 Thread Joan
her job, this variable doesn't exist. So, can you help me: how can I set a variable in one job and reuse it in another job? Thanks, Joan
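A common pattern for this (sketched below against the 0.21-era `org.apache.hadoop.mapreduce` API; the job names and the property key `my.shared.value` are made up for illustration) is to run job1 to completion in the driver, then set the value on job2's Configuration before submitting it. Each job snapshots its Configuration at submission, so a value set on job1's conf is never visible to job2 automatically:

```java
// Driver sketch: pass a value from the driver into job2's mappers.
Configuration conf1 = new Configuration();
Job job1 = new Job(conf1, "job1");
// ... configure job1's input/output, mapper, reducer ...
job1.waitForCompletion(true);          // block until job1 finishes

Configuration conf2 = new Configuration();
conf2.set("my.shared.value", "42");    // must be set BEFORE job2 is submitted
Job job2 = new Job(conf2, "job2");
// ... configure job2 ...
// In job2's Mapper: context.getConfiguration().get("my.shared.value")
job2.waitForCompletion(true);
```

If the value is computed by job1 itself (e.g. from a counter or an output file), the driver reads it between the two `waitForCompletion` calls and only then sets it on `conf2`.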

Re: Chain multiple jobs

2011-02-11 Thread Joan
-> output (file); job2: input (job1's output) --> output (file). How can I tell ControlledJob or JobControl not to submit job2 while job1 has not finished? Thanks, Joan 2011/2/10 Joan > Hi, > > I've two jobs and I'm trying to control them by Co

Chain multiple jobs

2011-02-10 Thread Joan
get the exception: Exception in thread "main" java.io.FileNotFoundException: File "output from job1" does not exist. Because when "cjob2 = new ControlledJob(job2, dependingJobs);" is instantiated, the input of job2 doesn't exist yet. Can someone help me? Thanks, Joan
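One way to express this dependency (a sketch against `org.apache.hadoop.mapreduce.lib.jobcontrol`; job wiring and names are placeholders, and job1/job2 are assumed to be already-configured `Job` objects) is to let JobControl hold job2 back until job1 succeeds — JobControl submits a ControlledJob only once all of its depending jobs have completed, so job2's input need not exist when the objects are constructed:

```java
// Wire job2 to run only after job1 succeeds.
ControlledJob cjob1 = new ControlledJob(job1, null);
List<ControlledJob> deps = new ArrayList<ControlledJob>();
deps.add(cjob1);
ControlledJob cjob2 = new ControlledJob(job2, deps);

JobControl jc = new JobControl("job1-then-job2");
jc.addJob(cjob1);
jc.addJob(cjob2);

Thread runner = new Thread(jc);   // JobControl implements Runnable
runner.start();
while (!jc.allFinished()) {
    Thread.sleep(500);            // poll until both jobs are done
}
jc.stop();
```

A FileNotFoundException at construction time usually means something else checked the input path eagerly (e.g. computing splits in the driver); constructing the ControlledJob itself should not require job2's input to exist.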

how to increase number of maps

2011-02-04 Thread Joan
don't understand. I force the number of maps with: job.getConfiguration().set("mapreduce.job.maps","4"); job.getConfiguration().set("mapreduce.map.tasks","4"); But this configuration has no effect. Thanks, Joan
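For file-based inputs, `mapreduce.job.maps` (like the older `mapred.map.tasks`) is only a hint: the actual number of map tasks equals the number of input splits that the InputFormat produces. A hedged sketch of the usual workarounds (property names as in 0.21-era Hadoop; the sizes are illustrative):

```java
// Shrink the maximum split size so FileInputFormat produces more splits,
// and therefore more map tasks (e.g. ~4 maps per 64 MB block at 16 MB splits).
job.getConfiguration().setLong(
        "mapreduce.input.fileinputformat.split.maxsize",
        16L * 1024 * 1024);

// Alternatively, NLineInputFormat creates one split per N input lines:
// job.setInputFormatClass(NLineInputFormat.class);
// job.getConfiguration().setInt(
//         "mapreduce.input.lineinputformat.linespermap", 10000);
```

Setting the map count directly only works for InputFormats that honor it (such as the DB input formats), not for FileInputFormat.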

How to get file configuration from mapper

2011-01-28 Thread Joan
ocal/taskTracker/user/jobcache/job_201101260830_/attempt_201101260830__m_00_0/work/./conf/myconfig.xml * So this produces an error because it tries to read myconfig.xml but doesn't find it. *Initial cause was ./conf/myconfig.xml (No such file or directory)* Can someone help me? Thanks, Joan
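The error suggests the job expects a relative `./conf/myconfig.xml` that is never shipped to the task's working directory. One common fix is the distributed cache, sketched here for the 0.21-era API (the HDFS path `/user/joan/conf/myconfig.xml` is a hypothetical placeholder):

```java
// Driver: ship the file to every task and symlink it into the working dir.
DistributedCache.addCacheFile(
        new URI("/user/joan/conf/myconfig.xml#myconfig.xml"),
        job.getConfiguration());
DistributedCache.createSymlink(job.getConfiguration());

// In Mapper.setup(), the fragment name is now a plain local file:
// File conf = new File("myconfig.xml");
```

The `#myconfig.xml` fragment controls the symlink name, so mapper code can open it with a fixed relative path regardless of where the TaskTracker localizes the cache.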

Re: How to set object in Configuration

2011-01-26 Thread Joan
Hi Li, Yes, I agree, and my object implements Serializable, but how do I indicate that in Hadoop's configuration? Joan 2011/1/27 li ping > yes, I agree. The configuration parameter should be a serialize-able value. > because the parameter will be transfer to other node to run the job. If

How to set object in Configuration

2011-01-26 Thread Joan
ng,Object) Does someone know how to set a custom object into a Hadoop configuration? Thanks, Joan
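Configuration only stores strings, so a Serializable object has to be encoded into one. A minimal, self-contained sketch (plain JDK; the class and method names here are mine, not part of Hadoop) serializes the object and Base64-encodes the bytes. The resulting string can be stored with `conf.set(key, encoded)` in the driver and decoded back in the mapper via `conf.get(key)`:

```java
import java.io.*;
import java.util.Base64;

public class ConfSerDemo {

    // Serialize any Serializable object into a conf.set()-friendly string.
    public static String toBase64(Serializable obj) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
        }
        return Base64.getEncoder().encodeToString(bos.toByteArray());
    }

    // Decode the string back into the original object.
    public static Object fromBase64(String encoded)
            throws IOException, ClassNotFoundException {
        byte[] bytes = Base64.getDecoder().decode(encoded);
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bytes))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Driver side: conf.set("my.key", toBase64(myObject));
        String encoded = toBase64("hello");
        // Mapper side: Object o = fromBase64(conf.get("my.key"));
        System.out.println(fromBase64(encoded));
    }
}
```

This keeps the object opaque to Hadoop; for large or frequently-read objects, the distributed cache is usually a better fit than the Configuration.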

Re: How to pass object between jobs

2011-01-25 Thread Joan
Text object, from the first job, to MyObject in the second job, and I don't want to do this; I want to work with MyObject directly. Thanks, Joan 2011/1/25 Harsh J > Could you describe your need with an example? > > If you want the output of a Job (Job1), say being a single file with a >

How to pass object between jobs

2011-01-25 Thread Joan
Hi, I would like to pass an object from job1 to job2. Can someone help me, please? Thanks, Joan
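The usual way to hand a typed value (rather than parsed Text) from one job to the next is to make the type a Writable and connect the jobs through a SequenceFile, which preserves the key/value classes in binary form. A sketch (the path `/tmp/handoff` and the class `MyObject` are placeholders):

```java
// job1: emit MyObject values in binary form.
job1.setOutputFormatClass(SequenceFileOutputFormat.class);
job1.setOutputKeyClass(Text.class);
job1.setOutputValueClass(MyObject.class);   // MyObject implements Writable
SequenceFileOutputFormat.setOutputPath(job1, new Path("/tmp/handoff"));

// job2: read them back as MyObject — no Text parsing needed.
job2.setInputFormatClass(SequenceFileInputFormat.class);
SequenceFileInputFormat.addInputPath(job2, new Path("/tmp/handoff"));
```

job2's mapper then receives `MyObject` values directly, as long as `MyObject` has a working `write`/`readFields` pair and a no-argument constructor.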

Re: How to reduce number of splits in DataDrivenDBInputFormat?

2011-01-20 Thread Joan
ize object) from the previous Reducer. How can I do this? Thanks Sonal, Joan 2011/1/20 Sonal Goyal > Which hadoop version are you on? > > You can alternatively try using hiho from > https://github.com/sonalgoyal/hiho to get your data from the db. Please > write to me directly i

Re: How to reduce number of splits in DataDrivenDBInputFormat?

2011-01-20 Thread Joan
> > Thanks and Regards, > Sonal > <https://github.com/sonalgoyal/hiho>Connect Hadoop with databases, > Salesforce, FTP servers and others <https://github.com/sonalgoyal/hiho> > Nube Technologies <http://www.nubetech.co> > > <http://in.linkedin.com/in/sonalgoya

Re: How to reduce number of splits in DataDrivenDBInputFormat?

2011-01-19 Thread Joan
Hi Sonal, I put both configurations: job.getConfiguration().set("mapreduce.job.maps","4"); job.getConfiguration().set("mapreduce.map.tasks","4"); But neither configuration has any effect. I also tried to set "mapred.map.task" bu

How to reduce number of splits in DataDrivenDBInputFormat?

2011-01-19 Thread Joan
splits, and I don't know how to reduce these splits or how to tell DataDrivenDBInputFormat to split by my date column (which corresponds to splitBy). The main goal is to improve performance, so I want my maps to be faster. Can someone help me? Thanks, Joan
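Unlike file inputs, DB inputs do not derive splits from blocks: DataDrivenDBInputFormat partitions the range of the splitBy column into chunks, and the chunk count is taken from the configured number of map tasks, which the DB formats do honor as a target. A sketch (table, column, and record-class names are hypothetical):

```java
// Partition the query on the date column; one split per range chunk.
DataDrivenDBInputFormat.setInput(job,
        MyRecord.class,                  // implements DBWritable
        "my_table",                      // table name
        null,                            // optional WHERE conditions
        "event_date",                    // splitBy column
        "id", "event_date", "payload");  // fields to select

// Fewer desired splits -> request fewer map tasks.
job.getConfiguration().setInt("mapreduce.job.maps", 4);
```

Choosing a splitBy column with an even value distribution (dates usually qualify) keeps the per-map workload balanced, which is what actually makes the maps faster.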

Re: how to write custom object using M/R

2011-01-19 Thread Joan
ject implements Writable, but it still doesn't work, and I also set job.setOutputFormatClass(SequenceFileOutputFormat.class) and SequenceFileOutputFormat.setOutputPath(conf, outputDir). However, I'm not using "setOutputCompression". Joan 2011/1/18 David Rosenstrauch > I a

Re: how to write custom object using M/R

2011-01-19 Thread Joan
essionType.BLOCK); > > DR > > > > On 01/14/2011 01:27 PM, MONTMORY Alain wrote: > >> Hi, >> >> I think you have to put : >> job.setOutputFormatClass(SequenceFileOutputFormat.class); >> to make it works.. >> hopes this help &

Re: how to write custom object using M/R

2011-01-19 Thread Joan
ng(in); } @Override public void write(DataOutput out) throws IOException { out.writeInt(id); Text.writeString(out, str); } }* But I don't understand why the object is not serialized. Thanks, Joan 2011/1/17 Lance Norskog > Does you custom object have Writable implemented?

Re: how to write custom object using M/R

2011-01-17 Thread Joan
Hi Alain, I put it, but it didn't work. Joan 2011/1/14 MONTMORY Alain > Hi, > > > > I think you have to put : > > job.setOutputFormatClass(SequenceFileOutputFormat.*class*); > > to make it works.. > > hopes this help > > > > Alain

how to write custom object using M/R

2011-01-14 Thread Joan
't know how to save my object in the first M/R or how to use it in the second M/R. Thanks, Joan
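A custom type becomes storable in a SequenceFile once it has a symmetric `write`/`readFields` pair and a no-argument constructor. The self-contained sketch below (my own illustrative class, not code from the thread) uses only `java.io` — `writeUTF` stands in for Hadoop's `Text.writeString` — so the round trip can be checked without a cluster; in a real job the class would additionally declare `implements Writable`:

```java
import java.io.*;

public class MyObject {
    private int id;
    private String str;

    public MyObject() { }   // Writable types need a no-arg constructor
    public MyObject(int id, String str) { this.id = id; this.str = str; }

    // Mirror of Writable.write(DataOutput): field order must match readFields.
    public void write(DataOutput out) throws IOException {
        out.writeInt(id);
        out.writeUTF(str);  // Hadoop code would use Text.writeString(out, str)
    }

    // Mirror of Writable.readFields(DataInput): read in the same order.
    public void readFields(DataInput in) throws IOException {
        id = in.readInt();
        str = in.readUTF();
    }

    public int getId() { return id; }
    public String getStr() { return str; }

    // Round-trip helper to verify serialization symmetry off-cluster.
    public static MyObject roundTrip(MyObject src) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        src.write(new DataOutputStream(bos));
        MyObject dst = new MyObject();
        dst.readFields(new DataInputStream(
                new ByteArrayInputStream(bos.toByteArray())));
        return dst;
    }

    public static void main(String[] args) throws Exception {
        MyObject copy = roundTrip(new MyObject(7, "abc"));
        System.out.println(copy.getId() + " " + copy.getStr());
    }
}
```

If a round trip like this succeeds locally but the job still fails, the problem is usually in the job wiring (output key/value classes or the OutputFormat) rather than in the Writable itself.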

How to compile Hadoop 0.21.0?

2011-01-13 Thread Joan
doop-trunk\build.xml BUILD FAILED C:\workspace\hadoop-trunk\build.xml:67: Execute failed: java.io.IOException: Cannot run program "tr": CreateProcess error=2, Thanks Joan

Solr + Hadoop

2011-01-13 Thread Joan
org.apache.hadoop.mapred.Task.initialize(Task.java:487) at org.apache.hadoop.mapred.MapTask.run(MapTask.java:311) at org.apache.hadoop.mapred.Child$4.run(Child.java:217) at java.security.AccessController.doPrivileged(Native Method)* Please, is someone using this patch with Hadoop version 0.21.0? Can someone help me? Thanks, Joan

Re: How to split DBInputFormat?

2011-01-04 Thread Joan
Thanks, I've incremented the number of map tasks and the number of reduce tasks. Although it works, I think it's not a real solution, so I will try both proposals. Joan 2011/1/4 Hari Sreekumar > Arvind, > > Where can I find DataDrivenInputFormat? Is it available in v0.20.2 and is >

How to split DBInputFormat?

2011-01-03 Thread Joan
ds, and I would like to use DBInputSplit, but I don't know how to use it or how many splits I need. Thanks, Joan

Creating Solr index from map/reduce

2010-12-29 Thread Joan
Hi, I'm trying to generate a Solr index from Hadoop (map/reduce), so I'm using the patch SOLR-301; however, I can't get it to work. When I try to run CSVIndexer with some arguments: -solr I'm running CSVIndexer: /bin/hadoop jar my.jar CSVIndexer -solr /