thanks
2012/6/28 Harsh J
> The job starts up with a static list of paths and splits derived from
> it during submission time. Input files added to the directory after
> the job has been submitted are not considered.
>
> On Wed, Jun 27, 2012 at 11:15 PM, Félix López wrote:
> > I have a task with a folder as input: FileInputFormat.setInputPaths(job, new Path("/folder"));
The job starts up with a static list of paths and splits derived from
it during submission time. Input files added to the directory after
the job has been submitted are not considered.
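A minimal sketch to illustrate (the class name, job name and path are
illustrative, and the mapper/reducer/output settings are omitted):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class StaticInputDemo {
  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "static-input-demo");
    job.setJarByClass(StaticInputDemo.class);
    FileInputFormat.setInputPaths(job, new Path("/folder"));
    job.submit(); // input splits for /folder are computed here, exactly once
    // Files written into /folder after this point are not seen by the job.
    job.waitForCompletion(true);
  }
}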
On Wed, Jun 27, 2012 at 11:15 PM, Félix López wrote:
> I have a task with a folder as input: FileInputFormat.setInputPaths(job, new Path("/folder"));
Yes, you can. Here is a code snippet from our project:

final JobConf conf = new JobConf( c_JOB_SETTING.getCustomizeConf() );
conf.setJar( c_JOB_SETTING.getJarName() );
conf.setJobName( "MortgageValTest input=" + c_JOB_SETTING.getInputPath()
    + " output=" + c_JOB_SETTING.getOutputPath() );
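The important part is that the client-side configuration points at the
remote cluster. A sketch with the old (1.x) API follows; the host names,
ports and jar path are placeholders, and the usual mapper/reducer/input/output
settings are omitted:

import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

final JobConf conf = new JobConf();
// Point the client at the remote cluster (placeholder addresses).
conf.set("fs.default.name", "hdfs://namenode-host:9000");
conf.set("mapred.job.tracker", "jobtracker-host:9001");
// Ship the job jar so the cluster nodes can load your classes.
conf.setJar("/local/path/to/job.jar");
JobClient.runJob(conf); // blocks until the job finishes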
You can use Oozie for that.
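For example (a sketch using the Oozie Java client; the server URL and
application path are placeholders, and the workflow app must already be
deployed on HDFS):

import java.util.Properties;
import org.apache.oozie.client.OozieClient;

OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");
Properties props = oozie.createConfiguration();
props.setProperty(OozieClient.APP_PATH, "hdfs://namenode-host:9000/user/felix/my-wf");
String jobId = oozie.run(props); // submits and starts the workflow remotely
System.out.println("Submitted Oozie job " + jobId);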
Thanks,
Mayank
On Wed, Jun 27, 2012 at 9:54 AM, Félix López wrote:
> How can I send jobs remotely? I have a cluster running and I would like to
> execute a mapreduce task from another machine (outside the cluster),
> without having to run this every time: bin/hadoop jar hadoop.jar main input output
I have a task with a folder as input: FileInputFormat.setInputPaths(job,
new Path("/folder"));
What happens if I write new files into the folder while the task is running?
Does the task receive the new files or not?
Thanks
How can I send jobs remotely? I have a cluster running and I would like to
execute a mapreduce task from another machine (outside the cluster),
without having to run this every time: bin/hadoop jar hadoop.jar main input output
I don't know if this is possible in Hadoop, or whether I have to program
something myself.
Sherif,
For 2.x, set the properties "mapreduce.map.log.level" and
"mapreduce.reduce.log.level" via your Job's configuration (valid values
are TRACE/DEBUG/INFO/WARN/ERROR/FATAL) to switch the child JVMs' task
logging levels.
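For example (a minimal sketch; the job name is illustrative):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = new Configuration();
conf.set("mapreduce.map.log.level", "DEBUG");    // map task child JVMs
conf.set("mapreduce.reduce.log.level", "DEBUG"); // reduce task child JVMs
Job job = Job.getInstance(conf, "debug-logging-demo");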
On Wed, Jun 27, 2012 at 5:53 PM, Sherif Akoush wrote:
> Hi,
>
> I have been trying to set up Hadoop logging at the task level but
> with no success so far.
Hi,
I have been trying to set up Hadoop logging at the task level but
with no success so far. I have modified log4j.properties and set many
parameters to DEBUG level
(log4j.logger.org.apache.hadoop.mapred.Task=DEBUG
log4j.logger.org.apache.hadoop.mapred.MapTask=DEBUG
log4j.logger.org.apache.hado