SequenceFile with one very large value

2011-10-27 Thread Florin P
Hello! Suppose this scenario: 1. The DFS block size is 64 MB. 2. We populate a SequenceFile with a binary value of 200 MB (representing a PDF file). Under the above scenario: 1. How many blocks will be created on HDFS? 2. Will the number of blocks be 200 MB / 64 MB, approx. 4 blocks? 3. How
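
The block arithmetic itself is simple: HDFS splits a file purely by size, so a ~200 MB file with a 64 MB block size occupies ceil(200/64) = 4 blocks, three full 64 MB blocks plus one ~8 MB block; a single SequenceFile record is allowed to span those block boundaries. Below is a minimal sketch, not taken from the thread, of writing one large binary value as a single SequenceFile record; the local path, HDFS path, and key name are hypothetical.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.BytesWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.Text;

    public class LargeValueWriter {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Read the whole PDF into memory -- unavoidable here, because a single
            // SequenceFile value has to be materialized as one Writable anyway.
            byte[] pdfBytes = Files.readAllBytes(Paths.get("/local/path/report.pdf"));

            Path out = new Path("/user/demo/pdfs.seq");   // hypothetical HDFS path
            SequenceFile.Writer writer = SequenceFile.createWriter(
                    fs, conf, out, Text.class, BytesWritable.class);
            try {
                // One record whose value is the entire 200 MB payload.
                writer.append(new Text("report.pdf"), new BytesWritable(pdfBytes));
            } finally {
                writer.close();
            }
        }
    }

Readers locate records via the SequenceFile sync markers, so a value that crosses block boundaries is still read back intact from whichever datanodes hold those blocks.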

Permission denied for normal users

2011-10-27 Thread Josu Lazkano
Hello list, I am new to Hadoop; I have configured a Hadoop cluster with 3 slaves and 1 master. The problem is that as a normal user I cannot execute anything: $ hadoop dfs -ls /user/josu.lazkano/gutenberg Found 7 items -rw-r--r-- 2 josu.lazkano supergroup 343695 2011-10-11 09:47
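
A common cause of "Permission denied" for non-superusers is that their HDFS home directory is owned by the user who started the cluster. A minimal sketch of the usual remedy follows, assuming it is run as the HDFS superuser; the group name and permission bits are assumptions, and the shell equivalents are hadoop fs -mkdir / -chown / -chmod.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.permission.FsPermission;

    public class ProvisionUserDir {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // User name taken from the thread; group and mode are assumptions.
            Path home = new Path("/user/josu.lazkano");
            fs.mkdirs(home);
            fs.setOwner(home, "josu.lazkano", "josu.lazkano");
            fs.setPermission(home, new FsPermission((short) 0755));
        }
    }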

topology.script.file update

2011-10-27 Thread Tom Hall
I was hoping that if I updated the file it would give new answers as datanodes were restarted and reconnected, but that does not seem to be the case. Surely I don't need to restart the namenode...
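
The namenode resolves racks through the class configured by topology.node.switch.mapping.impl (ScriptBasedMapping by default, which typically caches its answers), so whether edits to the script's data are picked up when a datanode re-registers depends on that cache and on the Hadoop version. One way to experiment, sketched against the 0.20/1.x DNSToSwitchMapping interface, is a mapping class that re-reads a host-to-rack file on every lookup; the file path and its host/rack line format are assumptions for this sketch.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;

    import org.apache.hadoop.net.DNSToSwitchMapping;
    import org.apache.hadoop.net.NetworkTopology;

    public class FileReloadingRackMapping implements DNSToSwitchMapping {
        private static final String MAP_FILE = "/etc/hadoop/rack.map"; // hypothetical path

        // Re-read the map file on every call, so edits take effect without
        // restarting anything. Each line is expected to be "host rack".
        public List<String> resolve(List<String> names) {
            Map<String, String> map = new HashMap<String, String>();
            try {
                BufferedReader in = new BufferedReader(new FileReader(MAP_FILE));
                try {
                    String line;
                    while ((line = in.readLine()) != null) {
                        String[] parts = line.trim().split("\\s+");
                        if (parts.length == 2) {
                            map.put(parts[0], parts[1]);
                        }
                    }
                } finally {
                    in.close();
                }
            } catch (IOException e) {
                // Fall through and use the default rack for everything.
            }
            List<String> racks = new ArrayList<String>(names.size());
            for (String name : names) {
                String rack = map.get(name);
                racks.add(rack != null ? rack : NetworkTopology.DEFAULT_RACK);
            }
            return racks;
        }
    }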

Re: Permission denied for normal users

2011-10-27 Thread Uma Maheswara Rao G 72686
- Original Message - From: Josu Lazkano josu.lazk...@barcelonamedia.org Date: Thursday, October 27, 2011 9:38 pm Subject: Permission denied for normal users To: hdfs-user@hadoop.apache.org Hello list, I am new to Hadoop; I have configured a Hadoop cluster with 3 slaves and 1 master. The problem

Loading and reading Snappy compressed files on Hadoop

2011-10-27 Thread Shantian Purkad
Hi, what is the best way to load Snappy-compressed files onto Hadoop and then read them through M/R? I see that Hadoop can uncompress .gz files (using a single map), but I am looking for a better way to utilize multiple mappers to read precompressed files. Thanks and Regards, Shantian
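
A raw .snappy file, like a .gz file, is not splittable, so a single mapper ends up reading it; the usual workaround is to store the data in a splittable container such as a block-compressed SequenceFile using the Snappy codec, which MapReduce can split on sync markers across many mappers. A minimal sketch with hypothetical paths and toy records follows; note that SnappyCodec requires the native snappy library on the client and cluster nodes.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.SequenceFile;
    import org.apache.hadoop.io.SequenceFile.CompressionType;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.io.compress.SnappyCodec;

    public class SnappySeqFileWriter {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            Path out = new Path("/user/demo/data.snappy.seq");   // hypothetical path
            // BLOCK compression groups many records per compressed block,
            // which keeps the file splittable and compresses well.
            SequenceFile.Writer writer = SequenceFile.createWriter(
                    fs, conf, out, LongWritable.class, Text.class,
                    CompressionType.BLOCK, new SnappyCodec());
            try {
                for (long i = 0; i < 1000; i++) {
                    writer.append(new LongWritable(i), new Text("record-" + i));
                }
            } finally {
                writer.close();
            }
        }
    }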

Re: Loading and reading Snappy compressed files on Hadoop

2011-10-27 Thread Shantian Purkad
Or is there any way that we can compress the files while they are being copied to Hadoop (either through hadoop fs -copyFromLocal or through Hive's load data local inpath '.')? From: Tim Broberg tim.brob...@exar.com To: hdfs-user@hadoop.apache.org
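
hadoop fs -copyFromLocal copies bytes as they are and does not compress. One way to compress in flight from a client, sketched below with hypothetical local and HDFS paths, is to stream the local file through a CompressionOutputStream straight into the destination file on HDFS.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    import org.apache.hadoop.io.compress.CompressionCodec;
    import org.apache.hadoop.io.compress.CompressionOutputStream;
    import org.apache.hadoop.io.compress.SnappyCodec;
    import org.apache.hadoop.util.ReflectionUtils;

    public class CompressingCopy {
        public static void main(String[] args) throws IOException {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);

            // Instantiate the codec with the cluster configuration applied.
            CompressionCodec codec = ReflectionUtils.newInstance(SnappyCodec.class, conf);

            InputStream in = new FileInputStream("/local/path/data.log");
            CompressionOutputStream out = codec.createOutputStream(
                    fs.create(new Path("/user/demo/data.log.snappy")));
            // copyBytes with a Configuration closes both streams when it finishes.
            IOUtils.copyBytes(in, out, conf);
        }
    }

A single raw .snappy file produced this way is still not splittable, so for M/R input a block-compressed SequenceFile (as in the earlier sketch) is what preserves the parallelism.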