I found it works with the following.
You need to run the code as a PrivilegedExceptionAction:
final String user = "My Identity"; // placeholder for your remote user name
UserGroupInformation uig = UserGroupInformation.createRemoteUser(user);
try {
    return uig.doAs(new PrivilegedExceptionAction<ReturnType>() {
I am trying to move our jobs from a 0.20 Hadoop cluster to a 1.0.3 cluster.
One issue I am running into is that files, and specifically the Hadoop
output files, are created with permission
rw-------
whereas on our 0.20 cluster the permissions are
rw-r--r--
The problem is that the web hdfs
Hi Steve,
You can use fs.permissions.umask-mode to set the appropriate umask.
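A minimal client-side sketch of that setting; the value 022 is an assumption, chosen so new files come out rw-r--r-- as on the old cluster:

```xml
<!-- core-site.xml on the client -->
<property>
  <name>fs.permissions.umask-mode</name>
  <!-- 022 masks off group/other write, yielding rw-r--r-- for new files -->
  <value>022</value>
</property>
```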
From: Steve Lewis lordjoe2...@gmail.com
To: common-user common-user@hadoop.apache.org
Sent: Monday, May 20, 2013 9:33 AM
Subject: Default permissions for Hadoop output files
I am
Has anybody used pluggable block placement policy?
If yes, please give me some direction on how to use this feature in Hadoop
2.0.3-alpha. I also need the code of a pluggable block placement policy.
Thanks in advance.
--
*With regards ---*
*Mohammad Mustaqeem*,
M.Tech (CSE)
MNNIT Allahabad
9026604270
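For reference, in the 2.x branches the placement policy class is selected through a NameNode setting; a minimal sketch, assuming the `dfs.block.replicator.classname` key and a hypothetical custom class that extends BlockPlacementPolicy:

```xml
<!-- hdfs-site.xml on the NameNode -->
<property>
  <name>dfs.block.replicator.classname</name>
  <!-- hypothetical example class; it must extend BlockPlacementPolicy -->
  <value>com.example.MyBlockPlacementPolicy</value>
</property>
```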
Hi Mohammad,
I believe we've already answered an almost identical question earlier:
http://search-hadoop.com/m/b8FAa2eI6kj1. Does that not suffice? Is there
something more specific you are looking for?
On Mon, May 20, 2013 at 12:30 PM, Mohammad Mustaqeem 3m.mustaq...@gmail.com
wrote:
Has
Hi
You can find good startup material for RHadoop here:
https://github.com/RevolutionAnalytics/rmr2/blob/master/docs/tutorial.md
http://bighadoop.wordpress.com/2013/02/25/r-and-hadoop-data-analysis-rhadoop/
I am also working on RHadoop; mail me if you have any difficulty with it.
On Mon, May 20, 2013
Not sure if this is the right group to ask questions about Flume:
I am getting an exception about being unable to open a port in Flume when
trying to create a remote agent; more details below:
---
13/05/20 04:55:30 ERROR avro.AvroCLIClient: Unable to open connection to Flume.
Sai, are you able to run the netcat Flume sample?
--
Lenin.
On May 20, 2013 5:40 PM, Sai Sai saigr...@yahoo.in wrote:
Not sure if this is the right group to ask questions about flume:
I am getting an exception about unable to open a port in flume when trying
to create a remote agent, more
Lenin
Thanks for your reply.
Here is the first sample, which works; I am not sure if you are referring to this:
---
agent1.sources = netsource
agent1.sinks = logsink
agent1.channels = memorychannel
agent1.sources.netsource.type = netcat
agent1.sources.netsource.bind =
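The quoted configuration is cut off above; for comparison, a minimal complete netcat-to-logger agent might look like the following (the bind address and port are assumptions, not values from the original message):

```properties
agent1.sources = netsource
agent1.sinks = logsink
agent1.channels = memorychannel

# netcat source: assumed bind address and port
agent1.sources.netsource.type = netcat
agent1.sources.netsource.bind = localhost
agent1.sources.netsource.port = 3000

# logger sink writes received events to the agent's log
agent1.sinks.logsink.type = logger

# in-memory channel connecting source to sink
agent1.channels.memorychannel.type = memory
agent1.channels.memorychannel.capacity = 1000

agent1.sources.netsource.channels = memorychannel
agent1.sinks.logsink.channel = memorychannel
```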
Hi,
I have a basic question on HDFS. I was reading that HDFS doesn't work well with
low-latency data access; rather, it is designed for high throughput
of data. Can you please explain in simple words the difference between
low-latency data access and high throughput of data?
Thanks,
Raj
I'll take a swing at this one.
Low latency data access: I hit the enter key (or submit button) and I
expect results within seconds at most. My database query time should be
sub-second.
High throughput of data: I want to scan millions of rows of data and count
or sum some subset. I expect this
Hi Chris,
Thanks for the explanation.
Regards,
Raj
From: Chris Embree cemb...@gmail.com
To: user@hadoop.apache.org; Raj Hadoop hadoop...@yahoo.com
Sent: Monday, May 20, 2013 1:51 PM
Subject: Re: Low latency data access Vs High throughput of data
I'll
Hi,
I was not able to stop the Thrift Server after performing the following steps.
$ bin/hive --service hiveserver
Starting Hive Thrift Server
$ netstat -nl | grep 1
tcp 0 0 :::1 :::* LISTEN
I tried the following to stop it, but it is not working.
hive --service hiveserver --action stop 1
Hi Sanjay,
I am using 0.9 version.
I do not have sudo access. Is there any other command to stop the service?
thanks,
raj
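One sudo-free option is simply to signal the Thrift server process yourself, since it runs under your own user; a minimal sketch, where the pgrep pattern is an assumption about how bin/hive names the process on the command line:

```shell
# Minimal sketch: stop a process you own by matching its command line.
# No sudo is needed for processes running under your own user.
stop_by_pattern() {
  pid=$(pgrep -u "$(id -un)" -f "$1" | head -n 1)
  if [ -n "$pid" ]; then
    kill "$pid" && echo "stopped $pid"   # SIGTERM lets it shut down cleanly
  else
    echo "no matching process"
  fi
}

# Assumed pattern: the HiveServer command line contains "hiveserver".
stop_by_pattern "hiveserver"
```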
From: Sanjay Subramanian sanjay.subraman...@wizecommerce.com
To: u...@hive.apache.org u...@hive.apache.org; Raj Hadoop