Joey Echeverria,
Yes, that works.
I thought job.addCacheFile(new URI(args[0])); could run on hadoop-0.20.2,
since hadoop-0.20.2 already supports the context-object (new) API.
Thanks!
--
Regards!
Jun Tan
At 2011-09-24 11:54:42,"Joey Echeverria" wrote:
>I think the API call you're looking for is
>DistributedCache.addCacheFile(URI, Configuration) [1]
I think the API call you're looking for is
DistributedCache.addCacheFile(URI, Configuration) [1]
-Joey
[1]
http://hadoop.apache.org/common/docs/r0.20.2/api/org/apache/hadoop/filecache/DistributedCache.html#addCacheFile(java.net.URI,%20org.apache.hadoop.conf.Configuration)
2011/9/23 谭军 :
> Hi Joey
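For reference, a minimal driver-side sketch of the call Joey points at, as it would look on 0.20.2. The input path from args[0] and the job name "job1" mirror Jun's snippet further down; everything else is illustrative:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.mapreduce.Job;

public class CacheDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = new Job(conf, "job1");
    // On 0.20.2 the cache file is registered on the job's Configuration;
    // Job.addCacheFile(URI) only appears in later releases.
    DistributedCache.addCacheFile(new URI(args[0]), job.getConfiguration());
    // ... set mapper/reducer/input/output as usual, then:
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}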
Hi Joey Echeverria,
My hadoop version is 0.20.2
--
Regards!
Jun Tan
At 2011-09-24 11:36:08,"Joey Echeverria" wrote:
>Which version of Hadoop are you using?
>
>2011/9/23 谭军 :
>> Harsh,
>> It is java.net.URI that is imported.
>>
>> --
>>
>> Regards!
>>
>> Jun Tan
>>
>> At 2011-09-24 00:52:14,"Harsh J" wrote:
Which version of Hadoop are you using?
2011/9/23 谭军 :
> Harsh,
> It is java.net.URI that is imported.
>
> --
>
> Regards!
>
> Jun Tan
>
> At 2011-09-24 00:52:14,"Harsh J" wrote:
>>Jun,
>>
>>Common cause is that your URI class is not the right import.
>>
>>It must be java.net.URI and not any other class. Fix this and your
>>problem would go away.
Hi Swathi.V.,
Thank you very much.
It's very kind of you to do that.
I think the code you gave is implemented with the old APIs.
I got that working several days ago; what I can't do yet is the same thing with the new APIs.
I have just started MapReduce programming and am running into some problems with my code.
When you get time we can talk online.
Thanks!
Harsh,
It is java.net.URI that is imported.
--
Regards!
Jun Tan
At 2011-09-24 00:52:14,"Harsh J" wrote:
>Jun,
>
>Common cause is that your URI class is not the right import.
>
>It must be java.net.URI and not any other class. Fix this and your
>problem would go away.
>
>2011/9/23 谭军 :
>>
As part of my Java mapper I have a command that executes some code on the
local node and copies a local output file to the Hadoop fs.
Unfortunately I'm getting the following output:
"Error occurred during initialization of VM"
"Could not reserve enough space for object heap"
I've tried adjusting
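The message is cut off above, but one hedged guess at the usual remedy (assuming the node simply runs out of free memory for the process the mapper launches): lower the heap given to each forked task JVM so the external program has room. The property name is the 0.20-era one; the -Xmx value is purely illustrative:

// In the job driver, before submitting the job:
Configuration conf = new Configuration();
// A smaller per-task heap leaves physical memory free on the node
// for the external process exec'd from the mapper.
conf.set("mapred.child.java.opts", "-Xmx256m");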
Hi Jun Tan,
1. Distributed Cache usage in the new API:
// Setting up the cache for the application
1. Copy the requisite files to the FileSystem:
$ bin/hadoop fs -copyFromLocal lookup.dat /myapp/lookup.dat
$ bin/hadoop fs -copyFromLocal map.zip /myapp/map.zip
$ bin/hadoop fs -co
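That excerpt is also cut off. For the read side, a minimal old-API mapper sketch, assuming a cached lookup.dat registered as in the steps above (file name and key/value types are illustrative):

import java.io.IOException;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class LookupMapper extends MapReduceBase
    implements Mapper<LongWritable, Text, Text, Text> {

  private Path[] cacheFiles;

  @Override
  public void configure(JobConf conf) {
    try {
      // Local (already-downloaded) paths of the files that were
      // registered with DistributedCache.addCacheFile().
      cacheFiles = DistributedCache.getLocalCacheFiles(conf);
    } catch (IOException e) {
      throw new RuntimeException("Could not read distributed cache", e);
    }
  }

  @Override
  public void map(LongWritable key, Text value,
      OutputCollector<Text, Text> output, Reporter reporter)
      throws IOException {
    // ... open cacheFiles[0] (e.g. lookup.dat) and use it per record ...
  }
}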
Jun,
Common cause is that your URI class is not the right import.
It must be java.net.URI and not any other class. Fix this and your
problem would go away.
2011/9/23 谭军 :
> Hi,
> I encountered an error that I cannot understand.
>
> Configuration conf = new Configuration();
> Job job = new Job(conf, "job1");
Great! Thanks Raj and Mathias.
Just a clarification query on top of my question:
I want to log some information about my processing/data into my log files.
I'm planning to log it with LOG.debug(); if I do so in my mapper or reducer, it'd
be available under the HADOOP_HOME/logs/history dir, right?
Sec
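(The message is cut short above.) For what it's worth, a minimal sketch of the LOG.debug() pattern in a new-API mapper, using commons-logging as Hadoop itself does (class name illustrative). One caveat: per-task output like this typically lands in each TaskTracker's userlogs directory per task attempt, not in one central file, and DEBUG lines only appear if the task's log4j level allows them:

import java.io.IOException;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class LoggingMapper extends Mapper<LongWritable, Text, Text, Text> {

  private static final Log LOG = LogFactory.getLog(LoggingMapper.class);

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    // Emitted into this task attempt's log, not the JobTracker's.
    LOG.debug("processing record at byte offset " + key.get());
    // ... actual map work ...
  }
}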
Hi,
I encountered an error that I cannot understand.
Configuration conf = new Configuration();
Job job = new Job(conf, "job1");
job.addCacheFile(new URI(args[0]));
Why did it report "The method addCacheFile(URI) is undefined for the type Job"?
Thanks!
--
Regards!
Jun Tan
Harsh,
I am exploring the different features of Hadoop and have set up the
standalone mode in Eclipse and would like to know what features it does/doesn't
support. For some reason, running Hadoop in distributed mode from Eclipse
gives an exception.
Thanks,
Praveen
On Fri, Sep 23, 2011 at 1:10 PM
Hi Swathi.V.,
I think my code below would work:
Configuration conf1 = new Configuration();
Job job1 = new Job(conf1, "Retrieval1");
job1.setJarByClass(Retrieval.class);
job1.addCacheFile(new URI(args[0])); // problem here
conf1.set("keyNodeFile", ar
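On 0.20.2 the problem line would presumably become the Configuration-based call Joey suggests above, shown here against Jun's variable names (a sketch, not a tested fix):

// Replaces job1.addCacheFile(new URI(args[0])):
DistributedCache.addCacheFile(new URI(args[0]), job1.getConfiguration());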
Hi Jun Tan,
Yes, I use the 0.21.0 version, so I have used those. Well, the Hadoop Definitive
Guide has job-dependency examples for 0.20.x.
Thank You,
2011/9/23 谭军
> Swathi.V.,
> ControlledJob cannot be resolved in my Eclipse.
> My hadoop version is 0.20.2
> ControlledJob can only be resolved in hadoop 0.21.0 (+)?
Swathi.V.,
ControlledJob cannot be resolved in my Eclipse.
My hadoop version is 0.20.2
ControlledJob can only be resolved in hadoop 0.21.0 (+)?
Or do I need certain plugins?
Thanks
--
Regards!
Jun Tan
At 2011-09-22 00:56:54,"Swathi V" wrote:
Hi,
This code might help you
//JobDependancie
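Swathi's snippet is cut off above; a hedged reconstruction of the 0.21.0-style job-dependency pattern it presumably showed (group and variable names illustrative; job1 and job2 are ordinary, fully configured Job instances):

import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob;
import org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl;

public class JobChain {
  public static void runChain(Job job1, Job job2) throws Exception {
    ControlledJob cjob1 = new ControlledJob(job1.getConfiguration());
    cjob1.setJob(job1);
    ControlledJob cjob2 = new ControlledJob(job2.getConfiguration());
    cjob2.setJob(job2);
    cjob2.addDependingJob(cjob1); // job2 starts only after job1 succeeds

    JobControl control = new JobControl("JobDependency");
    control.addJob(cjob1);
    control.addJob(cjob2);

    // JobControl is a Runnable; drive it from its own thread and poll.
    Thread runner = new Thread(control);
    runner.start();
    while (!control.allFinished()) {
      Thread.sleep(1000);
    }
    control.stop();
  }
}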
Hi All
I do have a query here on maintaining Hadoop map-reduce logs.
By default the logs appear on the respective TaskTracker nodes, which you can
easily drill down to from the JobTracker web UI at times of any failure (which
I was following till now). Now I need to get into the next level
Hello Praveen,
Is your question from a test-case perspective?
Because otherwise, is it not clear what you gain in 'Distributed' vs. 'Standalone'?
On Fri, Sep 23, 2011 at 12:15 PM, Praveen Sripati
wrote:
> Hi,
>
> What are the features available in the Fully-Distributed Mode and the
> Pseudo-Distributed Mode?