I have understood Hadoop and the Hadoop ecosystem (Pig as ETL, Hive as data warehouse, Sqoop as an import tool). I have worked and learned on a single-node cluster with demo data.
As Hadoop suits the Unix platform best, please help me understand the requirements, from start to finish, for using Hadoop in production on Unix.
What would be the things needed to use Hadoop on a real-time project, like Hadoop automation on Unix and alerts for failed processes?
Please shed some light on using Hadoop in real time and what objectives are recommended.
Thanks & Regards,
Yogesh Kumar
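One concrete shape for the "automation and failure alerts" part of this question is a cron-friendly wrapper script. This is a minimal sketch; the job name, command, log path, and the alert route in the comment are hypothetical placeholders, not anything prescribed in this thread:

```shell
#!/bin/sh
# Hypothetical cron wrapper: run a Hadoop job, keep its log, and raise
# an alert when it exits non-zero.
run_with_alert() {
    name=$1; shift
    log=$(mktemp "/tmp/${name}.XXXXXX")
    if "$@" >"$log" 2>&1; then
        return 0
    fi
    # In production this would typically be mailx/sendmail to an
    # operator address, e.g.:
    #   mailx -s "Hadoop job $name FAILED" ops@example.com <"$log"
    echo "ALERT: job '$name' failed; log kept at $log"
    return 1
}

# Real usage from cron would look something like:
#   run_with_alert daily-etl "$HADOOP_HOME/bin/hadoop" jar /opt/jobs/etl.jar
run_with_alert demo-ok true
run_with_alert demo-fail false || :
```

cron drives the schedule; a monitor such as Nagios would typically watch the same logs. The point of the sketch is only the pattern: capture the job's exit status and output, and page someone on failure instead of discovering it the next morning.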
There is an important difference between real time and real fast. Real time means that system response must meet a fixed schedule; real fast just means sooner is better. Real-time systems always have hard schedules. The schedule could be in microseconds, to control a laser for making masks for ...
On Tue, Oct 2, 2012 at 7:05 PM, Hank Cohen hank.co...@altior.com wrote:
Good thought, but real-time can also include a ...
Cohen
From: Ted Dunning [mailto:tdunn...@maprtech.com]
Sent: Tuesday, October 02, 2012 4:13 PM
To: user@hadoop.apache.org
Subject: Re: HADOOP in Production
Subject: NEED HELP:: using Hadoop in Production
Priority: High
Hi Pavel,
It seems your team spent some time on performance and tuning issues. Just wondering whether an automatic Hadoop tuning tool like Starfish would be interesting to you. We'd like to exchange tuning experiences with you.
Thanks,
Jie
Starfish Group, Duke
Subject: Re: Experience with Hadoop in production
Just be sure you have that corporate card available 24x7 when you need
to call support ;)
Sent from my iPhone
On Feb 23, 2012, at 10:30, Serge Blazhievsky
serge.blazhiyevs...@nice.com wrote:
What I have seen companies do often ...
Hi,
We are going into 24x7 production soon and we are considering whether we need vendor support or not. We use a free vendor distribution of cluster provisioning + Hadoop + HBase and looked at their Enterprise version, but it is very expensive for the value it provides (additional functionality ...
A lot of it depends on your staff and their experience.
Maybe they don't have Hadoop experience, but if they were involved with large databases, data warehouses, etc., they can apply those skills and provide a lot of help.
If you have Linux admins, system admins, and network admins with years of ...
What I have seen companies do often is use the free version from the commercial vendor and only buy support if there are major problems that they cannot solve on their own.
That way you get a free distribution plus insurance that you have support if something goes wrong.
Serge
On Wed, Jul 21, 2010 at 10:25 PM, Edward Capriolo edlinuxg...@gmail.com wrote:
On Wed, Jul 21, 2010 at 12:42 PM, Xavier Stevens xstev...@mozilla.com wrote:
Hi, I have a newbie question.
Scenario:
Hadoop version: 0.20.2
MR coding will be done in Java.
Just starting out with my first Hadoop setup. I would like to know: are there any best-practice ways to load data into the DFS? I have (obviously) manually put data files into HDFS using the shell.
Hi Urckle,
A lot of the more advanced setups just record data directly to HDFS to start with. You have to write some custom code using the HDFS API, but that way you don't need to import large masses of data. People also use distcp to do large-scale imports, but if you're hitting something like ...
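To make the load-side options concrete: the shell route the original poster was already using can be wrapped as a small reusable helper, sketched below with hypothetical paths; the distcp alternative for large copies is shown in a comment. (The "custom code" route means the Java FileSystem API, e.g. FileSystem.create, which lets an application stream records into HDFS as they are produced rather than importing them afterwards.)

```shell
#!/bin/sh
# Hypothetical bulk-load helper for an edge node: create a target
# directory in HDFS and push every file from a local directory into it.
bulk_load() {
    src=$1; dest=$2
    hadoop fs -mkdir "$dest" && hadoop fs -put "$src"/* "$dest"/
}

# Typical use:
#   bulk_load /var/log/app /data/raw/app
#
# For large copies between clusters, distcp runs the copy itself as a
# MapReduce job (namenode hosts/ports here are placeholders):
#   hadoop distcp hdfs://src-nn:8020/data hdfs://dst-nn:8020/data
```

`fs -put` streams through a single client machine, which is fine for moderate volumes; distcp parallelizes the copy across the cluster, which is why it is the usual choice for large-scale imports.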
Hi Xavier,
thanks for replying. Your input is very much appreciated! This is
exactly what I need.
Thanks again,
Regards
Enthusiastic Hadoop newbie!! :-D