And the conf dir (/Users/bryanduxbury/hadoop-0.16.3/conf), I hope it is similar to the one you are using for your hadoop installation.

I'm not sure I understand this. It isn't similar, it's the same as my hadoop installation. I'm only operating on localhost at the moment. I'm just trying to get a LocalFileSystem up and running so I can run some tests.
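
Concretely, what I'm trying to do is roughly this minimal sketch (not my exact test code, just the shape of the call that fails):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class LocalFsSmokeTest {
        public static void main(String[] args) throws Exception {
            // Configuration picks up hadoop-default.xml / hadoop-site.xml from the classpath.
            Configuration conf = new Configuration();
            // This is the call that blows up with "No FileSystem for scheme: file".
            FileSystem fs = FileSystem.get(URI.create("file:///"), conf);
            System.out.println("Got a " + fs.getClass().getName());
            System.out.println("/tmp exists? " + fs.exists(new Path("/tmp")));
        }
    }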

On May 14, 2008, at 8:24 PM, lohit wrote:

You could do this.
Open up the hadoop script (it's a shell script). The last line is the one that executes the corresponding hadoop class; instead of exec, make it echo and see everything that is present in your classpath. Make sure your generated classpath matches it. And the conf dir (/Users/bryanduxbury/hadoop-0.16.3/conf), I hope it is similar to the one you are using for your hadoop installation.
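
If you would rather not edit the script, another rough check (a plain Java sketch, nothing Hadoop-specific) is to have your app print the classpath it actually received and compare the two by eye:

    // Sketch: dump the classpath the JVM actually received, one entry per
    // line, so it can be compared against what the hadoop script builds.
    public class PrintClasspath {
        public static void main(String[] args) {
            String cp = System.getProperty("java.class.path");
            for (String entry : cp.split(java.io.File.pathSeparator)) {
                System.out.println(entry);
            }
        }
    }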

Thanks,
lohit

----- Original Message ----
From: Bryan Duxbury <[EMAIL PROTECTED]>
To: core-user@hadoop.apache.org
Sent: Wednesday, May 14, 2008 7:30:26 PM
Subject: Re: Trouble hooking up my app to HDFS

Nobody has any ideas about this?

-Bryan

On May 13, 2008, at 11:27 AM, Bryan Duxbury wrote:

I'm trying to create a java application that writes to HDFS. I have
it set up such that hadoop-0.16.3 is on my machine, and the env
variables HADOOP_HOME and HADOOP_CONF_DIR point to the correct
respective directories. My app lives elsewhere, but generates its
classpath by looking in those environment variables. Here's what my
generated classpath looks like (shown one entry per line; the actual classpath joins them with ':'):

/Users/bryanduxbury/hadoop-0.16.3/conf
/Users/bryanduxbury/hadoop-0.16.3/hadoop-0.16.3-core.jar
/Users/bryanduxbury/hadoop-0.16.3/hadoop-0.16.3-test.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/commons-cli-2.0-SNAPSHOT.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/commons-codec-1.3.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/commons-httpclient-3.0.1.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/commons-logging-1.0.4.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/commons-logging-api-1.0.4.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jets3t-0.5.0.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jetty-5.1.4.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jetty-ext/commons-el.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jetty-ext/jasper-compiler.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jetty-ext/jasper-runtime.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/jetty-ext/jsp-api.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/junit-3.8.1.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/kfs-0.1.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/log4j-1.2.13.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/servlet-api.jar
/Users/bryanduxbury/hadoop-0.16.3/lib/xmlenc-0.52.jar
/Users/bryanduxbury/projects/hdfs_collector/lib/jtestr-0.2.jar
/Users/bryanduxbury/projects/hdfs_collector/lib/jvyaml.jar
/Users/bryanduxbury/projects/hdfs_collector/lib/libthrift.jar
/Users/bryanduxbury/projects/hdfs_collector/build/hdfs_collector.jar

The problem I have is that when I go to get a FileSystem object for
my file:/// files (for testing locally), I'm getting errors like this:

   [jtestr] java.io.IOException: No FileSystem for scheme: file
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:1179:in `createFileSystem'
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:55:in `access$300'
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:1193:in `get'
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:150:in `get'
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:124:in `getNamed'
   [jtestr]       org/apache/hadoop/fs/FileSystem.java:96:in `get'
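
From the trace, it looks like createFileSystem fails because the Configuration has no class registered for the file scheme. A sketch like the following (property names assumed from the hadoop-default.xml of this era, so treat them as assumptions) would show whether the default config is being picked up at all; if it prints null, the conf resources are not on the effective classpath:

    import org.apache.hadoop.conf.Configuration;

    public class CheckDefaults {
        public static void main(String[] args) {
            Configuration conf = new Configuration();
            // null here would mean hadoop-default.xml was never loaded,
            // which matches "No FileSystem for scheme: file".
            System.out.println("fs.file.impl = " + conf.get("fs.file.impl"));
            System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        }
    }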

I saw an archived message that suggested this was a classpath problem,
but I didn't see a resolution. Does anyone have any ideas why this
might be occurring?

Thanks,

Bryan
