Hi
I am looking for a way to fetch all Hadoop groups available on the cluster.
Is there any way to do this using Java?
Thanks
Shashi
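As far as the standard client API goes, group membership is resolved per user through the cluster's configured group mapping rather than listed cluster-wide (see the group-mapping reply further down). A minimal Java sketch, assuming the cluster configuration is on the classpath and using an illustrative user name:

import java.util.List;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.Groups;

public class ShowUserGroups {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Uses whatever group mapping the cluster is configured with
        // (hadoop.security.group.mapping: shell-based, LDAP, etc.).
        Groups groups = Groups.getUserToGroupsMappingService(conf);
        List<String> userGroups = groups.getGroups("someuser"); // illustrative user name
        for (String g : userGroups) {
            System.out.println(g);
        }
    }
}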
Acl java api for each.
> https://hadoop.apache.org/docs/r2.7.2/api/org/apache/hadoop/fs/FileSystem.html#setAcl(org.apache.hadoop.fs.Path, java.util.List)
>
> Rakesh
>
> On Mon, Sep 19, 2016 at 11:22 AM, Shashi Vishwakarma <
> shashi.vish...@gmail.com> wrote:
> The -R option can be used to apply operations to all files and
> directories recursively.
>
> Regards,
> Rakesh
>
> On Sun, Sep 18, 2016 at 8:53 PM, Shashi Vishwakarma <
> shashi.vish...@gmail.com> wrote:
>
I have the following scenario. There is a parent folder /user with five child
folders (test1, test2, test3, etc.) in HDFS.
/user/test1
/user/test2
/user/test3
I applied an ACL on the parent folder to make sure the user automatically has
access to the child folders.
hdfs dfs -setfacl -m
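A minimal Java sketch of the approach the replies above point at. It uses FileSystem.modifyAclEntries (a sibling of the setAcl method linked above) to add an access entry plus a DEFAULT-scoped entry so newly created children inherit it; the user name and permissions are illustrative:

import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.AclEntry;
import org.apache.hadoop.fs.permission.AclEntryScope;
import org.apache.hadoop.fs.permission.AclEntryType;
import org.apache.hadoop.fs.permission.FsAction;

public class ParentFolderAcl {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());

        // ACCESS entry: grants the user r-x on /user itself.
        AclEntry access = new AclEntry.Builder()
                .setScope(AclEntryScope.ACCESS)
                .setType(AclEntryType.USER)
                .setName("someuser")                  // illustrative user
                .setPermission(FsAction.READ_EXECUTE)
                .build();

        // DEFAULT entry: inherited by children created under /user from now on.
        // Existing children (test1, test2, ...) still need their own call, which
        // is why the reply suggests iterating per path (or using
        // "hdfs dfs -setfacl -R" from the shell).
        AclEntry defaults = new AclEntry.Builder()
                .setScope(AclEntryScope.DEFAULT)
                .setType(AclEntryType.USER)
                .setName("someuser")
                .setPermission(FsAction.READ_EXECUTE)
                .build();

        fs.modifyAclEntries(new Path("/user"), Arrays.asList(access, defaults));
        fs.close();
    }
}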
Hi
I want to create an encrypted zone using WebHDFS. Below is the command for which
I am looking for a REST implementation.
hdfs crypto -createZone -keyName myKey -path /zone
Can someone point me to documentation or an example for the same?
Thanks
Shashi
Hi
I want to invoke the below commands using Java.
hadoop key create myKey
hdfs crypto -createZone -keyName myKey -path /zone
Can someone point to Java API documentation for this, or any example of how to
implement this in Java?
Thanks in advance.
Shashi
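A minimal Java sketch of the equivalents, assuming a KMS key provider is configured for the cluster; the key name and zone path are the ones from the commands above, and the client classes relied on are KeyProviderFactory and HdfsAdmin:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.crypto.key.KeyProvider;
import org.apache.hadoop.crypto.key.KeyProviderFactory;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.client.HdfsAdmin;

public class CreateZoneExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // 1. Create the key (the Java side of "hadoop key create myKey").
        //    Assumes at least one key provider (e.g. KMS) is configured.
        KeyProvider provider = KeyProviderFactory.getProviders(conf).get(0);
        provider.createKey("myKey", new KeyProvider.Options(conf));
        provider.flush();

        // 2. Create the encryption zone (the Java side of
        //    "hdfs crypto -createZone -keyName myKey -path /zone").
        HdfsAdmin admin = new HdfsAdmin(FileSystem.getDefaultUri(conf), conf);
        admin.createEncryptionZone(new Path("/zone"), "myKey");
    }
}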
rship infrastructure is (i.e. run the "groups" command at
> the OS layer or query an LDAP server).
>
> I hope this helps.
>
> --Chris Nauroth
>
> [1]
> http://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/HdfsPermissionsGuide.html#Group_Mapping
> [2]
not files
> inside it) in class path of your client program.
>
>
>
> -Vinay
>
>
>
> *From:* Shashi Vishwakarma [mailto:shashi.vish...@gmail.com]
> *Sent:* Monday, November 02, 2015 10:47 PM
> *To:* user@hadoop.apache.org
> *Subject:* Re: Utility to push data into HDFS
which just requires an HTTP client on
> your platform of choice.
>
>
> http://hadoop.apache.org/docs/r2.7.1/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
>
> --Chris Nauroth
>
> From: Shashi Vishwakarma <shashi.vish...@gmail.com>
> Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>
> Date: Sunday, November 1, 2015 at 9:35 AM
> To: "user@hadoop.apache.org" <user@hadoop.apache.org>
> Subject: Re: Utility to push data into HDFS
>
> I am guessing t
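Following up on the WebHDFS suggestion above, a minimal Java sketch of the documented two-step file create (PUT to the NameNode, then follow the Location header to a DataNode) using only the JDK; host, port, user name, and paths are illustrative:

import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class WebHdfsUpload {
    public static void main(String[] args) throws Exception {
        // Step 1: PUT to the NameNode with op=CREATE; WebHDFS answers with a
        // 307 redirect whose Location header points at a DataNode.
        URL nameNode = new URL("http://namenode.example.com:50070/webhdfs/v1"
                + "/tmp/abc.txt?op=CREATE&overwrite=true&user.name=someuser");
        HttpURLConnection nn = (HttpURLConnection) nameNode.openConnection();
        nn.setRequestMethod("PUT");
        nn.setInstanceFollowRedirects(false);      // we want the Location header ourselves
        String dataNodeUrl = nn.getHeaderField("Location");
        nn.disconnect();

        // Step 2: stream the local file to the DataNode URL from step 1.
        HttpURLConnection dn = (HttpURLConnection) new URL(dataNodeUrl).openConnection();
        dn.setRequestMethod("PUT");
        dn.setDoOutput(true);
        try (OutputStream out = dn.getOutputStream();
             InputStream in = new FileInputStream("abc.txt")) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
        System.out.println("Response: " + dn.getResponseCode()); // 201 Created on success
        dn.disconnect();
    }
}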
Hi
I need to build a common utility for Unix/Windows-based systems to push data
into the Hadoop system. A user should be able to run that utility from any
platform and push data into HDFS.
Any suggestions ?
Thanks
Shashi
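For the Java-client route hinted at in the classpath reply above, a minimal sketch using FileSystem.copyFromLocalFile; it assumes the Hadoop client jars and the cluster's core-site.xml/hdfs-site.xml are on the classpath, and the argument layout is illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPush {
    public static void main(String[] args) throws Exception {
        // args[0] = local file, args[1] = HDFS destination (illustrative layout)
        Configuration conf = new Configuration(); // picks up *-site.xml from the classpath
        try (FileSystem fs = FileSystem.get(conf)) {
            fs.copyFromLocalFile(new Path(args[0]), new Path(args[1]));
        }
    }
}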
Hi
I have been using the curl command for ingesting data into HDFS via WebHDFS.
Admins are saying that resource utilization for curl is very high, which is
hampering server performance.
I would like to know whether curl can really use so many resources that it
leads to a performance issue.
Just to give
AM, Gera Shegalov g...@shegalov.com wrote:
I filed https://issues.apache.org/jira/browse/HADOOP-12326 to do that,
you can take a look at the patch. Your understanding is correct: md5 of crc
in each block, then md5 of those block md5s.
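A minimal Java sketch that fetches that composite checksum from the client side (the same value the "hdfs dfs -checksum" command prints); the path is the one from this thread and default configuration on the classpath is assumed:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileChecksum;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsChecksum {
    public static void main(String[] args) throws Exception {
        try (FileSystem fs = FileSystem.get(new Configuration())) {
            // MD5 of the per-block CRC MD5s, as described above. Note it is not
            // directly comparable to a plain local md5sum of the file contents.
            FileChecksum checksum = fs.getFileChecksum(new Path("/abc.txt"));
            System.out.println(checksum);
        }
    }
}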
On Sun, Aug 9, 2015 at 7:35 AM Shashi Vishwakarma
formatting or byte order you
could use md5sum for the remote file as well if the file is reasonably small
hadoop fs -cat /abc.txt | md5sum
On Fri, Aug 7, 2015 at 3:35 AM Shashi Vishwakarma
shashi.vish...@gmail.com wrote:
Hi
I have a small confusion regarding checksum verification. Let's say
Hi
I have a small confusion regarding checksum verification. Let's say I have a
file abc.txt and I transferred this file to HDFS. How do I ensure data
integrity?
I followed the below steps to check that the file was correctly transferred.
*On Local File System:*
md5sum abc.txt
Hi
I have a MapReduce job for an HBase bulk load. The job converts data into
HFiles and loads them into HBase, but after a certain map %, the job fails.
Below is the exception that I am getting.
Error: java.io.FileNotFoundException: