Dear all,
I fixed the problem from my previous email by doing it on Ubuntu 10
instead of RedHat 9. RedHat 9 might just be too old?
Thanks so much!
Bing
On Tue, Feb 14, 2012 at 1:00 PM, Bing Li wrote:
> Dear all,
>
> I am a new user of HDFS. The default Data/Name directory is /tmp. I would
> like t
Thanks Patrick,
The concept is clear to me now. As a first step, I would like to configure LDAP
with Hadoop.
I am using Apache Hadoop 1.0.0 but am not able to find the configuration steps in
this version's documentation.
It would be really helpful if someone could point me to the relevant documentation of
conf
Hi folks,
I'm trying to compile Hadoop v1.0.0 libhdfs on Ubuntu 11.10 64-bit. All
Java classes compile fine, except when I specify -Dislibhdfs=1. Here's
where the error starts:
[exec] In file included from
/usr/include/x86_64-linux-gnu/sys/select.h:46:0,
[exec] from
/u
Thanks, Harsh. I had not come across that discussion before; what you suggest
works.
Hao
On Mon, Feb 13, 2012 at 3:32 PM, Harsh J wrote:
> Hao,
>
> This only affects your startup; once it's up you should have no
> further issues. It was fixed via
> https://issues.apache.org/jira/browse/HDFS-1835.
Hao,
This only affects your startup; once it's up you should have no
further issues. It was fixed via
https://issues.apache.org/jira/browse/HDFS-1835.
This was also previously discussed at http://search-hadoop.com/m/0n3W12wbQRQ
As a workaround, you can try adding inside conf/hadoop-env.sh:
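(The original message is truncated at this point. Based on the linked JIRA, HDFS-1835, where DataNode startup stalls while reading entropy from /dev/random, the workaround is usually a JVM flag along these lines; treat the exact line as an assumption on my part, not a quote of the original mail:)

```shell
# Sketch (assumption): point the JVM at the non-blocking /dev/urandom
# as its entropy source so the DataNode does not stall on startup.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.security.egd=file:/dev/./urandom"
```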
Hi,
I am beginning with Hadoop. I have been having trouble with a long wait
before I can add (put) my input data to HDFS in Hadoop v0.20.2.
I followed the configuration steps and formatted HDFS. I am able to create a new
directory, but when I try to add the input data set to DFS, I am stuck
If you want to browse files, etc. in HDFS, you might look at Cloudera Hue.
It provides a web GUI with file-explorer capabilities for HDFS. It is
very handy.
John
On Mon, Feb 13, 2012 at 7:00 AM, Michael wrote:
> Hi harsh,
>
> I've been tasked to create a simple portal to obtain info from
>
LDAP and Kerberos are orthogonal in Hadoop, but both are often used
together. LDAP allows for centralized user/group management (sort of like
DNS for your users). Kerberos is for strong authentication of users.
When using Kerberos in Hadoop, you want to propagate user/group identities
to all your
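(A sketch of one common way to do this around Hadoop 1.x, under the assumption that the nodes run an LDAP NSS module such as nss_ldap or sssd: Hadoop's default group mapping shells out to the OS to resolve groups, so wiring each node's name service switch to LDAP makes the same users and groups visible to Hadoop on every host:)

```
# /etc/nsswitch.conf on every cluster node (sketch; assumes an
# LDAP NSS module such as nss_ldap or sssd is installed)
passwd: files ldap
group:  files ldap
```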
Hi harsh,
I've been tasked with creating a simple portal to obtain info from the
namenode/datanodes of a Hadoop installation.
I know this might be available from Cloudera and MapR, but the tech leads want to
roll their own simple implementation to reduce dependencies when deploying.
We'll be fixed to a
There is a member variable called dfs in the DistributedFileSystem
class; its type is DFSClient.
All of the file system operations in the DistributedFileSystem class
are delegated to the corresponding operations of dfs,
and dfs communicates with the NameNode server by means o
If you want to use the many methods of DFS, that is how you may check
for it and use it. However, using DistributedFileSystem directly
is discouraged as well (it's not an interface, and is not meant to be
used directly by users, at least).
What are you looking to do with it exactly, Michael, and
Hi folks,
I'm using the FileSystem class to connect to an HDFS installation. I'm also
checking whether the instance is a DistributedFileSystem.

FileSystem fs = FileSystem.get(uri, conf, "hadoop");

DistributedFileSystem dfs = null;

if (fs instanceof DistributedFileSystem) {
...
Was wond
Hi,
I am a bit confused about the security part of Hadoop. The cluster is behind a
firewall. I have read that Hadoop can also be configured with LDAP.
I want to know which is better: configuring Hadoop security with LDAP or with
Kerberos, as both provide authentication.
Please provide me more details on this as