It should be like this (note that the Job(Configuration, String) constructor is
also deprecated in 2.x, so use Job.getInstance, and add the cache file to the
job's configuration after the Job is created):

 Configuration conf = new Configuration();
 Job job = Job.getInstance(conf, "test");
 job.setJarByClass(Test.class);

 DistributedCache.addCacheFile(new Path("your hdfs path").toUri(),
     job.getConfiguration());


but the best examples are the test cases:
http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/src/test/java/org/apache/hadoop/mapreduce/filecache/TestClientDistributedCacheManager.java?view=markup
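For the 2.x API that Ted mentions, a minimal driver sketch would look like the
following. This assumes a class named Test and an HDFS URI of your own; both
are placeholders, and the code needs the Hadoop client jars on the classpath:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class Test {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "test");
    job.setJarByClass(Test.class);

    // Non-deprecated replacement for DistributedCache.addCacheFile(uri, conf).
    // The URI below is a placeholder; an optional "#name" fragment sets the
    // symlink name the file gets in each task's working directory.
    job.addCacheFile(new URI("hdfs://namenode:8020/cache/your-file.txt#your-file"));
  }
}
```

Tasks can then open the cached file through the symlink name in their working
directory, so the mapper/reducer code does not need the HDFS path at all.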

On Wed, Jul 10, 2013 at 6:07 AM, Ted Yu <yuzhih...@gmail.com> wrote:

> You should use Job#addCacheFile()
>
>
> Cheers
>
>
> On Tue, Jul 9, 2013 at 3:02 PM, Botelho, Andrew <andrew.bote...@emc.com> wrote:
>
>> Hi,
>>
>> I was wondering if I can still use the DistributedCache class in the
>> latest release of Hadoop (Version 2.0.5).
>>
>> In my driver class, I use this code to try and add a file to the
>> distributed cache:
>>
>> import java.net.URI;
>> import org.apache.hadoop.conf.Configuration;
>> import org.apache.hadoop.filecache.DistributedCache;
>> import org.apache.hadoop.fs.*;
>> import org.apache.hadoop.io.*;
>> import org.apache.hadoop.mapreduce.*;
>> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
>> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>>
>> Configuration conf = new Configuration();
>> DistributedCache.addCacheFile(new URI("file path in HDFS"), conf);
>> Job job = Job.getInstance();
>> …
>>
>> However, I keep getting warnings that the method addCacheFile() is
>> deprecated.
>>
>> Is there a more current way to add files to the distributed cache?
>>
>> Thanks in advance,
>>
>> Andrew
>>
>
>