Thanks. Hopefully you keep us informed via this thread.

Kylie McCormick wrote:
Hello:
I am actually working on this myself in my project Multisearch. The Map()
function uses clients to connect to services and collect their responses, and
the Reduce() function merges those responses together. I'm also working on
putting this into a Servlet so it can be deployed via Tomcat.
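For what it's worth, the pattern is roughly this (a minimal Python sketch, not
the actual Multisearch code; the service functions are stand-ins for real
clients that would call OGSA-DAI or Axis services over the network):

```python
# "Map" fans a query out to several back-end services concurrently;
# "Reduce" merges the per-service responses into a single result set.
from concurrent.futures import ThreadPoolExecutor

def service_a(query):
    # Stand-in for a real web-service client.
    return [f"a:{query}:1", f"a:{query}:2"]

def service_b(query):
    # Stand-in for a second back-end service.
    return [f"b:{query}:1"]

def map_query(query, services):
    """Send the query to every service in parallel and collect responses."""
    with ThreadPoolExecutor(max_workers=len(services)) as pool:
        return list(pool.map(lambda svc: svc(query), services))

def reduce_responses(responses):
    """Merge the per-service result lists, dropping duplicates."""
    merged, seen = [], set()
    for result_list in responses:
        for item in result_list:
            if item not in seen:
                seen.add(item)
                merged.append(item)
    return merged

results = reduce_responses(map_query("hadoop", [service_a, service_b]))
```

Running this inside Hadoop (or a Servlet) mostly changes where the map tasks
execute, not the shape of the merge step.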

I've worked with a number of different web services, including OGSA-DAI and
Axis Web Services. My experience with Hadoop (which I haven't fully benchmarked
yet) is that it is faster than using these other methods alone. Hopefully by
the end of the summer I'll have more research results on this topic
(specifically on speed).

The other links posted here are really helpful...

Kylie


On Tue, Aug 5, 2008 at 10:11 AM, Mork0075 <[EMAIL PROTECTED]> wrote:

Hello,

I just discovered the Hadoop project and it looks really interesting to me.
As far as I can see, Hadoop is really useful for data-intensive computations.
Is there a Hadoop scenario for scaling web applications too? Normally web
applications are not that computation-heavy. The need to scale them arises
from a growing number of users, each of whom performs (within his own session)
simple operations like querying some data from the database.

So in a distributed version of this scenario, a Hadoop job would "map" the
requests to certain servers in the cluster and "reduce" the results. But this
is what load balancers normally do, so it doesn't solve the scalability
problem by itself.

So my question: is there a Hadoop scenario for "non computation heavy but
heavy load" web applications?

Thanks a lot
