Hi Xavier,
thanks for replying. Your input is very much appreciated! This is
exactly what I need.
Thanks again,
Regards
Enthusiastic Hadoop newbie!! :-D
On 21/07/2010 17:42, Xavier Stevens wrote:

Hi Urckle,

A lot of the more "advanced" setups just record data directly to HDFS to
start with. You have to write some custom code using the HDFS API but
that way you don't need to import large masses of data. People also use
"distcp" to do large scale imports, but if you're hitting something l
Hi, I have a newbie question.

Scenario:
Hadoop version: 0.20.2
MR coding will be done in Java.

I am just starting out with my first Hadoop setup. I would like to know
whether there are any best-practice ways to load data into the DFS. I have
(obviously) manually put data files into HDFS using the shell commands.
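To be concrete, what I mean by the shell route is just the fs "put"
command, e.g. (the local and HDFS paths here are only examples):

    hadoop fs -put /local/data/file1.log /user/hadoop/input/

That is, copying files one at a time from the local filesystem into HDFS.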