Hello!
Suppose this scenario:
1. The DFS block size is 64MB.
2. We populate a SequenceFile with a 200MB binary value (representing a
PDF file).
Given the above scenario:
1. How many blocks will be created on HDFS?
2. Will the number of blocks be 200MB/64MB, i.e. approximately 4 blocks?
3. How
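A quick sketch of the arithmetic behind question 2 (only the 200MB and
64MB figures come from the scenario above; the class and variable names
are just illustrative):

public class BlockCount {
    public static void main(String[] args) {
        long fileSize  = 200L * 1024 * 1024; // the 200MB binary value
        long blockSize =  64L * 1024 * 1024; // dfs.block.size = 64MB
        // HDFS allocates whole blocks, so round the ratio up
        long blocks = (fileSize + blockSize - 1) / blockSize;
        System.out.println(blocks); // 4: three full 64MB blocks + one ~8MB block
    }
}

So yes: HDFS rounds up, giving 4 blocks, and the last block only occupies
the roughly 8MB it actually holds.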
Hello list, I am new to Hadoop. I have configured a Hadoop cluster with
3 slaves and 1 master.
The problem is that normal users cannot execute anything:
$ hadoop dfs -ls /user/josu.lazkano/gutenberg
Found 7 items
-rw-r--r-- 2 josu.lazkano supergroup 343695 2011-10-11 09:47
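In case it is the usual culprit: a frequent cause of "Permission denied"
for normal users is that the user's HDFS home directory is not owned by
that user. A minimal sketch of the fix through the FileSystem API, run
as the HDFS superuser (the path and mode here are assumptions, not taken
from the post above):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.permission.FsPermission;

public class FixHome {
    public static void main(String[] args) throws Exception {
        FileSystem fs = FileSystem.get(new Configuration());
        Path home = new Path("/user/josu.lazkano"); // assumed home directory
        // Hand the directory to the user and make it owner-writable
        fs.setOwner(home, "josu.lazkano", "supergroup");
        fs.setPermission(home, new FsPermission((short) 0755));
        fs.close();
    }
}

The equivalent from the shell would be hadoop fs -chown and hadoop fs
-chmod on the same path.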
I was hoping that if I updated the file it would give new answers as the
datanodes were restarted and reconnected, but that does not seem to be
the case.
Surely I don't need to restart the namenode...
- Original Message -
From: Josu Lazkano josu.lazk...@barcelonamedia.org
Date: Thursday, October 27, 2011 9:38 pm
Subject: Permission denied for normal users
To: hdfs-user@hadoop.apache.org
Hello list, I am new to Hadoop. I have configured a Hadoop cluster with
3 slaves and 1 master.
The problem
Hi,
What is the best way to load Snappy-compressed files onto Hadoop and
then read them through M/R?
I see that Hadoop can uncompress .gz files (using a single map),
but I am looking for a better way to utilize multiple mappers to read
precompressed files.
Thanks and Regards,
Shantian
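One hedged sketch of the usual answer: a raw .snappy file is not
splittable, but a block-compressed SequenceFile written with SnappyCodec
is, so M/R can run one mapper per HDFS block instead of one per file.
The paths and key/value choice below are illustrative, and SnappyCodec
assumes the native snappy library is installed on the cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.compress.SnappyCodec;

public class WriteSnappySeqFile {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // BLOCK compression compresses batches of records, keeping the
        // file splittable at sync points, unlike a raw .gz or .snappy file
        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, new Path("/user/shantian/data.seq"),
                Text.class, BytesWritable.class,
                SequenceFile.CompressionType.BLOCK, new SnappyCodec());
        byte[] payload = "example record".getBytes(); // stand-in for real data
        writer.append(new Text("record-1"), new BytesWritable(payload));
        writer.close();
    }
}

A job using SequenceFileInputFormat will then split the file at its sync
markers, which is what lets multiple mappers read it in parallel.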
Or is there any way that we can compress the files while they are being
copied to Hadoop (either through hadoop fs -copyFromLocal or through
Hive's load data local inpath '.')?
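hadoop fs -copyFromLocal does not compress on the way in, but the same
effect can be had by streaming the local file through a codec into HDFS.
A minimal sketch under the same assumptions as above (the paths are
illustrative; note that a bare .snappy file written this way is not
splittable, which is why the SequenceFile route sketched earlier is
usually preferred):

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.SnappyCodec;
import org.apache.hadoop.util.ReflectionUtils;

public class CompressOnCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        CompressionCodec codec =
                ReflectionUtils.newInstance(SnappyCodec.class, conf);
        InputStream in =
                new BufferedInputStream(new FileInputStream("local.pdf"));
        // Wrap the HDFS output stream in the codec so bytes are
        // compressed as they are written
        OutputStream out = codec.createOutputStream(
                fs.create(new Path("/user/shantian/local.pdf.snappy")));
        IOUtils.copyBytes(in, out, conf, true); // true = close both streams
    }
}

Passing true to IOUtils.copyBytes takes care of flushing the compressor
and closing both streams when the copy finishes.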
From: Tim Broberg tim.brob...@exar.com
To: hdfs-user@hadoop.apache.org