Hadoop Priviledged User Error!

2013-08-16 Thread Pavan Sudheendra
Hi, can anyone give me details on why I'm getting the error below? 06:08:34,313 ERROR UserGroupInformation:1411 - PriviledgedActionException as:root (auth:SIMPLE) cause:org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE,

Re: Hadoop Priviledged User Error!

2013-08-16 Thread Nitin Pawar
Read the error: denied: user=root, access=WRITE, inode=/user:hdfs:supergroup:drwxr-xr-x. You are executing the command as user root (remember, the Linux user root has no root-level access to HDFS). Your user (root) does not have write permission to /user; either create a directory as user hdfs and
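
A minimal sketch of that fix through the FileSystem API, assuming it is run as the HDFS superuser (e.g. hdfs) with fs.defaultFS pointing at the cluster; the /user/root path and group name are illustrative, not taken from the thread:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CreateUserHome {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path home = new Path("/user/root");   // directory the job needs to write under
    fs.mkdirs(home);                      // created as the HDFS superuser
    fs.setOwner(home, "root", "root");    // hand ownership to the Linux user "root"
    fs.close();
  }
}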

No FileSystem for scheme: hdfs in namenode HA

2013-08-16 Thread ch huang
Hi all, I set up a NameNode HA Hadoop cluster and wrote some demo code: import java.io.FileNotFoundException; import java.io.IOException; import java.net.URI; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.FSDataOutputStream; import org.apache.hadoop.fs.FileStatus; import
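
For context, a minimal sketch of such demo code; the hdfs://mycluster nameservice URI and the file path are illustrative assumptions, not taken from the original post:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HaWriteDemo {
  public static void main(String[] args) throws Exception {
    // Picks up core-site.xml / hdfs-site.xml from the classpath, including
    // the HA nameservice settings (dfs.nameservices, failover proxy provider).
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create("hdfs://mycluster"), conf);
    FSDataOutputStream out = fs.create(new Path("/tmp/demo.txt"));
    out.writeBytes("hello from the HA client\n");
    out.close();
    fs.close();
  }
}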

question about data block replicator number

2013-08-16 Thread ch huang
Hi all: if I have three data nodes and the data block replication number is 2, and one of the data nodes fails, the data blocks on it will be moved to another live DN and the replication will still be 3; what if the failed DN recovers, will the replication become 4?

Re: question about data block replicator number

2013-08-16 Thread Jitendra Yadav
Yup, it is the responsibility of the NameNode to control under- and over-replicated blocks automatically; however, you can run the balancer script any time. Thanks. On Fri, Aug 16, 2013 at 2:09 PM, bharath vissapragada bharathvissapragada1...@gmail.com wrote: No, namenode deletes over-replicated blocks
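
As a rough illustration of the point that replication is a per-file target the NameNode converges to (it re-replicates when under the target and deletes excess replicas when over it), a small sketch; the path and target factor are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationCheck {
  public static void main(String[] args) throws Exception {
    FileSystem fs = FileSystem.get(new Configuration());
    Path p = new Path("/tmp/demo.txt");
    FileStatus st = fs.getFileStatus(p);
    System.out.println("target replication: " + st.getReplication());
    // Request a different target; the NameNode re-replicates or
    // removes excess replicas until the file matches this number.
    fs.setReplication(p, (short) 2);
    fs.close();
  }
}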

RE: Calling a MATLAB library in map reduce program

2013-08-16 Thread Chandra Mohan, Ananda Vel Murugan
Thanks for all the suggestions. I will explore more and raise specific questions if needed. Regards, Anand.C From: Sandy Ryza [mailto:sandy.r...@cloudera.com] Sent: Thursday, August 15, 2013 1:30 AM To: user@hadoop.apache.org Subject: Re: Calling a MATLAB library in map reduce program To add to

how to cache a remote hadoop file

2013-08-16 Thread Visioner Sadak
Hello friends, I am using webhdfs to fetch a remote Hadoop file in my browser. Is there any caching mechanism that you know of to load this file faster? http://termin1:50070/webhdfs/v1/Name1Home/new_file_d561yht35-9a1a-4a7b-9n.jpg?op=OPEN
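
One simple client-side approach, not a built-in Hadoop or webhdfs feature, is to keep a local copy and only hit webhdfs on a cache miss; the cache directory and naming scheme below are illustrative assumptions:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class WebHdfsLocalCache {
  // Returns a local copy of the file, downloading it via webhdfs only
  // when no cached copy exists yet.
  public static Path fetch(String webHdfsUrl, String cacheDir, String name) throws Exception {
    Path cached = Paths.get(cacheDir, name);
    if (!Files.exists(cached)) {
      Files.createDirectories(cached.getParent());
      try (InputStream in = new URL(webHdfsUrl).openStream()) {
        Files.copy(in, cached, StandardCopyOption.REPLACE_EXISTING);
      }
    }
    return cached;
  }
}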

Re: how to cache a remote hadoop file

2013-08-16 Thread Jitendra Yadav
I think an in-memory Hadoop mechanism will fulfill your requirement. Thanks. On Fri, Aug 16, 2013 at 2:21 PM, Visioner Sadak visioner.sa...@gmail.com wrote: Hello friends i m using webhdfs to fetch a remote hadoop file in my browser is there any caching mechanism that you guys know to load this

Re: No FileSystem for scheme: hdfs in namenode HA

2013-08-16 Thread Harsh J
You require the hadoop-hdfs dependency for the HDFS FS to get initialized. Your issue lies in how you're running the application, not your code. If you use Maven, include the hadoop-client dependency to get all the required dependencies for a Hadoop client program. Otherwise, run your program with hadoop jar,
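
For reference, the Maven dependency Harsh refers to looks roughly like this; the version is a placeholder for whichever Hadoop release your cluster runs:

<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version><!-- your Hadoop version --></version>
</dependency>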

Re: Exceptions in Name node and Data node logs

2013-08-16 Thread t578384
Hi Vimal, could you elaborate on this: what do you mean by running 6 processes on a single node? How many nodes do you have in total, and are all of them used for Hadoop and HBase as well? Possibilities: - there might be a DN not listed in your slaves file but configured as a DN in your cluster. Chances are

Re: how to cache a remote hadoop file

2013-08-16 Thread Visioner Sadak
Thanks Jeetu, are there any configurations needed in order to implement it? On Fri, Aug 16, 2013 at 2:26 PM, Jitendra Yadav jeetuyadav200...@gmail.com wrote: I think an in-memory Hadoop mechanism will fulfill your requirement. Thanks On Fri, Aug 16, 2013 at 2:21 PM, Visioner Sadak

Re: how to cache a remote hadoop file

2013-08-16 Thread Jitendra Yadav
Frankly, I am not using Hadoop as an in-memory cache, but you can integrate it with other vendors' offerings. The link below might help you: http://www.gridgain.com/products/in-memory-hadoop-accelerator/ Thanks. On Fri, Aug 16, 2013 at 4:33 PM, Visioner Sadak visioner.sa...@gmail.com wrote: thanks

Automating Hadoop installation

2013-08-16 Thread Andrew Pennebaker
I think it would make Hadoop installation easier if we released standardized packages. What if Ubuntu users could simply apt-get install hadoop the same way they apt-get install apache2? Similarly, could we release a Chocolatey http://chocolatey.org/ package for Windows users? The easier the

Re: Automating Hadoop installation

2013-08-16 Thread John Meagher
That sounds like what Bigtop is doing, at least covering the Linux distros. http://bigtop.apache.org/ On Fri, Aug 16, 2013 at 11:23 AM, Andrew Pennebaker apenneba...@42six.com wrote: I think it would make Hadoop installation easier if we released standardized packages. What if Ubuntu users

Re: Automating Hadoop installation

2013-08-16 Thread Konstantin Boudnik
Yup, patches are always welcome! As for Windows support: the more the merrier! Although I doubt that many people here have such experience (say, my own stops with a C++ compiler on 3.11, and I have never touched it since). Considering the very low interest in a packaged stack from the Windows crowd, I personally would

Re: Automating Hadoop installation

2013-08-16 Thread Matthew Farrellee
And in the case of Fedora, there's work underway to truly use distro-standard packages, so yum install hadoop will be handled by the Fedora infrastructure. If you're interested, check out the Fedora Big Data SIG. Best, matt On 08/16/2013 02:06 PM, Konstantin Boudnik wrote: Yup, patches are

Things to keep in mind when writing to a db

2013-08-16 Thread jamal sasha
Hi, I am wondering if there is any tutorial to look at. What are the challenges of reading from and/or writing to a database? Is there a common flavor across all databases? For example, the DBs start a server on some host:port; you establish a connection to that host:port; can it be across a proxy?
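
One concrete angle on this within MapReduce itself is DBOutputFormat, which stores the JDBC connection details in the job configuration for you. A rough job-setup sketch; the driver class, connection URL, credentials, table, and column names below are hypothetical examples:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.db.DBConfiguration;
import org.apache.hadoop.mapreduce.lib.db.DBOutputFormat;

public class DbWriteJobSetup {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // JDBC driver, connection URL, and credentials go into the job conf.
    DBConfiguration.configureDB(conf, "com.mysql.jdbc.Driver",
        "jdbc:mysql://dbhost:3306/mydb", "dbuser", "dbpass");
    Job job = Job.getInstance(conf, "write-to-db");
    job.setJarByClass(DbWriteJobSetup.class);
    job.setOutputFormatClass(DBOutputFormat.class);
    // Target table and columns; the job's output key class must implement
    // DBWritable so DBOutputFormat can fill in the INSERT statement.
    DBOutputFormat.setOutput(job, "results", "id", "value");
    // ... set mapper/reducer classes and the input format here ...
  }
}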

e-Science app on Hadoop

2013-08-16 Thread Felipe Gutierrez
Hello, Does anybody know an e-Science application to run on Hadoop? Thanks. Felipe -- Felipe Oliveira Gutierrez -- felipe.o.gutier...@gmail.com -- https://sites.google.com/site/lipe82/Home/diaadia

Re: e-Science app on Hadoop

2013-08-16 Thread Jay Vyas
There are literally hundreds. Here is a great review article on how MapReduce is used in the bioinformatics algorithms space: http://www.biomedcentral.com/1471-2105/11/S12/S1 On Fri, Aug 16, 2013 at 3:38 PM, Felipe Gutierrez felipe.o.gutier...@gmail.com wrote: Hello, Does anybody know an

Re: how to cache a remote hadoop file

2013-08-16 Thread Visioner Sadak
Friends, is there any open source caching mechanism for Hadoop? On Fri, Aug 16, 2013 at 4:56 PM, Jitendra Yadav jeetuyadav200...@gmail.com wrote: Frankly, I am not using Hadoop as an in-memory cache, but you can integrate it with other vendors' offerings. The link below might help you.

Re: Decompression using LZO

2013-08-16 Thread Sanjay Subramanian
What do you want to do? View the .LZO file on HDFS? From: Sandeep Nemuri nhsande...@gmail.com Reply-To: user@hadoop.apache.org Date: Tuesday, August 6, 2013 12:08 AM To:
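
If viewing the file is the goal, a hedged sketch using the compression codec factory follows; it assumes an LZO codec (for example the one from the hadoop-lzo project) is on the classpath and registered in io.compression.codecs, and the input path is illustrative:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class LzoCat {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path in = new Path("/data/sample.lzo");
    // Selects the codec from the file extension via io.compression.codecs.
    CompressionCodec codec = new CompressionCodecFactory(conf).getCodec(in);
    if (codec == null) {
      System.err.println("No codec found for " + in + "; is an LZO codec installed?");
      return;
    }
    // Decompress to stdout and close the stream when done.
    IOUtils.copyBytes(codec.createInputStream(fs.open(in)), System.out, conf, true);
  }
}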