Hi,
In Hadoop-2.7.2, I am getting the same error reported here:
https://issues.apache.org/jira/browse/HDFS-12985
Is there a patch available for the hadoop-2.7.2 version? How can I restart
the namenode without the null pointer exception?
Thanks,
Kumar
, 2016 at 12:28 PM, <wget.n...@gmail.com> wrote:
> Looks like the strong encryption policy file for Java (Oracle) isn’t
> installed. Or you don’t have a valid Kerberos ticket in your cache (klist).
>
>
>
> --
> B: mapredit.blogspot.com
>
>
>
> *From: *kumar r
Hi,
I am trying to configure a Hadoop pseudo-node secure cluster (to verify it
works correctly) in Azure using Azure Domain Service.
OS - Windows Server 2012 R2 Datacenter
Hadoop Version - 2.7.2
I am able to run
*hadoop fs -ls /*
Example MapReduce job works fine
*yarn jar
Hi,
I have configured hadoop-2.7.2 and oozie-4.2.0 with Kerberos security
enabled.
A DistCp Oozie action is submitted as a workflow job. When the Oozie
launcher runs, I get the following exception:
2016-07-29 12:39:04,394 ERROR [uber-SubtaskRunner]
org.apache.hadoop.tools.DistCp: Exception
then the command would still display the mask,
> because the application knows to apply that logic regardless of the
> FileSystem implementation.
>
> [1] http://www.vanemery.com/Linux/ACL/POSIX_ACL_on_Linux.html
>
> --Chris Nauroth
>
> From: kumar r <kumarc...@gmail.com>
Hi,
In Hadoop, I have enabled authorization and set a few ACLs for a directory.
When I execute the getfacl command in the hadoop bin, I can see a mask value in the output.
hadoop fs -getfacl /Kumar
# file: /Kumar
# owner: Kumar
# group: Hadoop
user::rwx
user:Babu:rwx
group::r-x
mask::rwx
other::r-x
If I run
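To illustrate the mask semantics discussed in the reply above (and in the linked POSIX ACL guide): for named-user, named-group, and owning-group entries, the effective permission is the entry's permission ANDed with the mask. A small Python sketch of that logic, illustrative only and not Hadoop code:

```python
def perm_bits(s):
    """Convert an rwx string like 'r-x' to a 3-bit integer."""
    return sum(1 << (2 - i) for i, c in enumerate(s) if c != "-")

def effective(entry_perm, mask):
    """Effective permission string after applying the ACL mask."""
    bits = perm_bits(entry_perm) & perm_bits(mask)
    return "".join(c if bits & (1 << (2 - i)) else "-"
                   for i, c in enumerate("rwx"))
```

So with the output above, `user:Babu:rwx` under `mask::rwx` stays fully effective, but if the mask were tightened to `r-x`, Babu's effective access would drop to `r-x` even though the entry still says `rwx`.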
Hi,
Yes, Hadoop will work fine on Windows in all modes, including fully
distributed mode. The same applies to Spark and YARN.
On Mon, May 9, 2016 at 11:27 PM, Abi wrote:
> Does Hadoop work on multiple Windows machines out of the box?
>
> 1. Keyword is "out
Hi,
Is there HDFS REST API support available for transparent encryption
("crypto") in HDFS?
https://hadoop.apache.org/docs/r2.7.2/hadoop-project-dist/hadoop-hdfs/TransparentEncryption.html#crypto_command-line_interface
Thanks,
Kumar
Hi,
I have configured a hadoop-2.7.2 pseudo-node cluster on Windows. When I
submit an MR job, it works fine. But if I submit multiple MR jobs, only
one job runs at a time.
The first job is in the RUNNING state and all other jobs stay in the ACCEPTED
state, even though YARN has enough memory (3 GB free out of 6 GB).
But
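One common cause of this symptom with the Capacity Scheduler (a guess, not a confirmed diagnosis for this cluster) is the ApplicationMaster resource limit: each job's AM reserves memory, and `yarn.scheduler.capacity.maximum-am-resource-percent` (default 0.1) caps how much of the cluster AMs may use, so additional jobs sit in ACCEPTED with no AM started. Raising it in capacity-scheduler.xml is one thing to try:

```xml
<property>
  <name>yarn.scheduler.capacity.maximum-am-resource-percent</name>
  <!-- default is 0.1; allow up to half the cluster memory for ApplicationMasters -->
  <value>0.5</value>
</property>
```

After editing, refresh the queues with `yarn rmadmin -refreshQueues` (or restart the ResourceManager).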
Working on Hadoop-2.6.0 with HDFS ACLs enabled. When trying through the
command line, the recursive -R option works correctly, but when using the
REST API, it is not working.
*hadoop fs -setfacl -x -R default:group:HadoopUsers /test1*
The above command works correctly, but when trying with the REST API,
recursion is not
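As far as I can tell, WebHDFS in these releases has no recursive flag for ACL operations, so one workaround is to walk the tree client-side and issue REMOVEACLENTRIES per path. A hedged Python sketch; the host name and port are placeholders, and `list_children` is left abstract so it can be backed by LISTSTATUS calls against the real cluster:

```python
from urllib.parse import urlencode

def remove_acl_entries_url(host, path, aclspec, port=50070):
    """Build the WebHDFS URL for REMOVEACLENTRIES on a single path."""
    query = urlencode({"op": "REMOVEACLENTRIES", "aclspec": aclspec})
    return "http://%s:%d/webhdfs/v1%s?%s" % (host, port, path, query)

def walk_paths(path, list_children):
    """Yield `path` and every descendant.

    `list_children(path)` returns (child_path, is_dir) tuples; in practice
    it would be implemented with a WebHDFS LISTSTATUS request.
    """
    yield path
    for child, is_dir in list_children(path):
        if is_dir:
            yield from walk_paths(child, list_children)
        else:
            yield child
```

Each generated URL would then be sent as an HTTP PUT (e.g. with `urllib.request`) by an authenticated user, emulating what `-setfacl -x -R` does in one shot on the command line.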
I am a Windows user and have configured Hadoop-2.6.0 secured with Kerberos.
I am trying to set the ACL for a directory using the command below:
*hadoop fs -setfacl --set user::rwx,user:user1:---,group::rwx,other::rwx /test1*
It gives
*-setfacl: Too many arguments
Usage: hadoop fs [generic options] -setfacl [-R]
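One possible cause on Windows (a guess, not verified against this setup): cmd.exe treats commas as argument delimiters, so an unquoted aclspec gets split into several arguments, which would produce exactly this "Too many arguments" error. Quoting the aclspec may help:

```
hadoop fs -setfacl --set "user::rwx,user:user1:---,group::rwx,other::rwx" /test1
```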
Hi,
I am using hadoop-2.6.0 with Kerberos enabled and LDAP with Active
Directory on Windows. I have tested some ACL properties. The properties
below are not working for me:
security.job.client.protocol.acl
security.admin.operations.protocol.acl
I have set the following properties to true, but still
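For reference, service-level authorization needs `hadoop.security.authorization` set to true in core-site.xml, and the protocol ACLs live in hadoop-policy.xml in a "users groups" format. A minimal sketch; the user and group names here are placeholders, not values from the original setup:

```xml
<!-- core-site.xml -->
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>

<!-- hadoop-policy.xml: space-separated users, then groups;
     "*" means everyone is allowed -->
<property>
  <name>security.job.client.protocol.acl</name>
  <value>kumar hadoopadmins</value>
</property>
```

Changes to hadoop-policy.xml can be applied without a restart via `hdfs dfsadmin -refreshServiceAcl` (and `yarn rmadmin -refreshServiceAcl` for the ResourceManager).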