Hi all,
I looked at start-dfs.sh, and it seems that it only supports three start
modes: regular, upgrade, and rollback. So why is importCheckpoint not
supported here? What is the consideration behind this?
--
Best Regards
Jeff Zhang
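For reference, start-dfs.sh only forwards -upgrade and -rollback, but the
namenode itself does accept -importCheckpoint and can be started by hand. A
minimal sketch, assuming the checkpoint image is already in fs.checkpoint.dir:

    # start the namenode directly so the extra option reaches it
    bin/hadoop-daemon.sh start namenode -importCheckpoint
    # the namenode loads the image from fs.checkpoint.dir and saves it
    # into dfs.name.dir before coming up (it refuses to run if
    # dfs.name.dir already holds an image)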
Hi,
I also have the source from hadoop-common. However, when I do ant clean jar
from the hdfs folder, ivy seems to try to download the hadoop-core jar file
from the repository. Maybe the newer version of hadoop-core is not compatible
with mine?
How do I force the compile to use my own jar file, t
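One approach from the split-project builds of that era (a sketch; the exact
targets are assumptions to verify against your branch's build docs) is to
publish your own common/core jar locally and tell ivy to resolve from there:

    # in the hadoop-common checkout: build and publish the jar into the
    # local maven repository that the "internal" resolver reads from
    cd hadoop-common
    ant mvn-install

    # in the hdfs checkout: resolve hadoop-core from that local
    # repository instead of downloading it from the public one
    cd ../hadoop-hdfs
    ant -Dresolvers=internal clean jar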
Thanks Todd, I forgot to change dfs.http.address
On Fri, May 21, 2010 at 12:58 PM, Todd Lipcon wrote:
> On Thu, May 20, 2010 at 9:50 PM, Jeff Zhang wrote:
>>
>> Hi Todd,
>>
>> I tried the command, but no process is using port 50070
>>
>
> Double check fs.default.name and dfs.http.address are
On Thu, May 20, 2010 at 9:50 PM, Jeff Zhang wrote:
> Hi Todd,
>
> I tried the command, but no process is using port 50070
>
>
Double check that fs.default.name and dfs.http.address are both pointing to a
domain name which resolves on that machine to a local IP address. It seems to
think you're trying
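Concretely, the two settings on the new machine might look like this (the
hostname newnn.example.com and the ports are hypothetical; the name must
resolve to a local IP on the new namenode):

    <!-- core-site.xml -->
    <property>
      <name>fs.default.name</name>
      <value>hdfs://newnn.example.com:9000</value>
    </property>

    <!-- hdfs-site.xml -->
    <property>
      <name>dfs.http.address</name>
      <value>newnn.example.com:50070</value>
    </property>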
Hi Todd,
I tried the command, but no process is using port 50070
On Fri, May 21, 2010 at 12:39 PM, Todd Lipcon wrote:
> The new machine apparently has something listening on one of the NN ports.
> try sudo fuser -n tcp 50070
> (will tell you which pid is listening on that port)
> -Todd
>
> O
The new machine apparently has something listening on one of the NN ports.
try sudo fuser -n tcp 50070
(will tell you which pid is listening on that port)
-Todd
On Thu, May 20, 2010 at 9:35 PM, Jeff Zhang wrote:
> Hi all,
>
> I'd like to recover the name node on another new machine. I copied the
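If fuser is not installed on the machine, the same check can be done with
netstat or lsof (both commands assume a Linux box):

    # show the pid/name of whatever is listening on the namenode web port
    sudo netstat -tlnp | grep 50070
    # or, equivalently
    sudo lsof -i TCP:50070 -sTCP:LISTEN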
Hi all,
I'd like to recover the name node on another new machine. I copied the
metadata from the old name node to the new name node, and then modified
the configuration (including fs.default.name). Then I stopped the old
dfs cluster and restarted the new dfs cluster on the new machine, and then
I get the
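A sketch of that recovery procedure as shell steps (the paths and hostname
are hypothetical; the directory copied must be your dfs.name.dir):

    # on the old machine: stop the cluster, then copy the namenode
    # metadata (the contents of dfs.name.dir) to the new machine
    bin/stop-dfs.sh
    scp -r /data/dfs/name/ newnn.example.com:/data/dfs/name/

    # on the new machine: point fs.default.name (and dfs.http.address)
    # at the new host, push the updated conf to all datanodes, then
    # start the cluster from there
    bin/start-dfs.sh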
Hi Harold,
The error message "cannot find symbol" hints at missing libraries. It looks
like the build is trying to access Kerberos classes that it is unable to
find. You can check whether all the required Kerberos libraries are
available.
Regards,
Sagar
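A quick way to run that check is to search the build's jars for the symbol
javac complains about; a sketch (the class name is a hypothetical placeholder
for whatever symbol is reported missing):

    # look through every jar on the lib path for the missing class
    for j in lib/*.jar; do
      if unzip -l "$j" | grep -q 'KerberosName.class'; then
        echo "found in $j"
      fi
    done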
Hi All,
For some reason, my hdfs source code can't compile anymore. ~1-2 weeks ago it
was compiling fine, but now it's not. I haven't made any changes to my code
since I last compiled.
When I run ant clean jar, I get the following errors:
compile-hdfs-classes:
[javac] Compiling 198 source fi
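When a build that used to work starts failing with no local changes, a stale
ivy cache is a common culprit. A hedged first step (the path is ivy's default
user cache and may differ in your setup):

    # force ivy to re-resolve the hadoop artifacts instead of reusing
    # whatever it cached during the last successful build
    rm -rf ~/.ivy2/cache/org.apache.hadoop
    ant clean jar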
On Thu, May 20, 2010 at 09:29, Erik Forsberg wrote:
> Assuming we're building a web interface that needs to read some files
> from HDFS, and we don't want to use the Java API, would the Thrift
> gateway be the best option? Or is fuse-dfs better?
I wrote namenode and datanode plugins to expose HDF
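For the fuse-dfs half of the comparison, mounting looks roughly like this
(hostname, port, and mount point are hypothetical; see the contrib/fuse-dfs
README for the build steps):

    # mount HDFS as a local filesystem via the contrib wrapper script
    ./fuse_dfs_wrapper.sh dfs://namenode.example.com:9000 /mnt/hdfs
    # afterwards a web app can read files with ordinary file I/O
    ls /mnt/hdfs/user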
Hi!
What's the status of the Thrift gateway for HDFS
(http://wiki.apache.org/hadoop/HDFS-APIs)? My google karma seems to be
bad because I'm not getting many hits from people using it.
Is it stable? How's the performance?
Assuming we're building a web interface that needs to read some files
fro