I think they should be hadoop-common-0.21.0.jar (this has FileSystem
in it) and hadoop-hdfs-0.21.0.jar (the actual HDFS implementation classes).

(Both ought to be available in the root of the distribution; scratch
my earlier hint about looking within the common/ and hdfs/ directories,
as those contain the sources and binaries.)
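
In case it helps, here is a minimal sketch of the kind of client code
and compile step involved (the namenode host/port, class name, and jar
paths below are placeholders; adjust them for your setup):

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class ListHdfsRoot {
        public static void main(String[] args) throws Exception {
            // Point at the remote HDFS; "namenode:9000" is a placeholder.
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(
                URI.create("hdfs://namenode:9000/"), conf);

            // List the contents of the HDFS root directory.
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
            fs.close();
        }
    }

Compile with both jars on the classpath (use ';' instead of ':' on
Windows), for example:

    javac -classpath hadoop-common-0.21.0.jar:hadoop-hdfs-0.21.0.jar ListHdfsRoot.java

At runtime you will likely also need the dependency jars shipped with
the distribution (e.g. under its lib/ directories) on the classpath.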

On Sat, May 7, 2011 at 10:57 PM, Bo Li <libo.1024....@gmail.com> wrote:
> Hi Harsh,
>
> Thank you very much for your reply. I am using 0.21.x. I have already
> included those two jars, but the problem is still there. Could you specify
> the names of the two jar files so that I can confirm I was not using the
> wrong ones?
>
> Best Regards,
> Bo
>
> On 5/6/2011 4:39 PM, Harsh J wrote:
>>
>> Hello Bo,
>>
>> What version of Hadoop are you using?
>>
>> In case of 0.20.x, use the single hadoop-*-core.jar available under
>> the root of the distribution's directory.
>> In case of 0.21.x, you will need to use two separate jars (from
>> common/ and hdfs/ folders of the distribution).
>>
>> On Sat, May 7, 2011 at 1:13 AM, Bo Li<libo.1024....@gmail.com>  wrote:
>>>
>>> Hi All,
>>>
>>> I need some help with my project. I plan to develop a Java program to
>>> manipulate a remote HDFS. The code does not compile; it throws errors
>>> like "cannot find symbol FileSystem". I know that may be because I have
>>> not added the required .jar file(s) to the classpath, but I do not know
>>> which jar file(s) I should include or where to find them. Could anyone
>>> help? Thank you very much!
>>>
>>> Best Regards,
>>> Bo
>>>
>>
>>
>



-- 
Harsh J
