This is somewhat of a noob question, I know, but after learning about Hadoop, testing it on a small cluster, and running MapReduce jobs on it, I'm still not sure whether HDFS is the right distributed file system for serving web requests. In other words: can you, and is it sensible to, serve images and data from HDFS, using something like FUSE to mount the filesystem so Apache can serve images out of it? We have huge images, hence the need for a distributed file system; they go in, get stored with lots of metadata, and are kept redundant by Hadoop/HDFS. But is that the right way to serve web content?
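To make the FUSE idea concrete, this is roughly the setup I'm imagining. I'm assuming the hadoop-fuse-dfs binary from the Hadoop contrib / CDH packaging here, and the hostnames, ports, and paths are all placeholders:

    # Mount HDFS via FUSE (hadoop-fuse-dfs from Hadoop contrib / CDH packaging;
    # namenode host/port and mount point are made up for illustration)
    hadoop-fuse-dfs dfs://namenode.example.com:8020 /mnt/hdfs

    # Then point an Apache vhost at a directory inside the mount:
    <VirtualHost *:80>
        ServerName images.example.com
        DocumentRoot /mnt/hdfs/images
        <Directory /mnt/hdfs/images>
            Options -Indexes
            # Apache 2.2 syntax; 2.4 would use "Require all granted"
            Order allow,deny
            Allow from all
        </Directory>
    </VirtualHost>

So Apache would treat the mounted HDFS directory like any local DocumentRoot, and every image request would go through FUSE to the cluster.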
I looked at GlusterFS before; it had an Apache module and a Lighttpd module that made this simple. Does HDFS have something like that, do people just use the FUSE option I described, or is this simply not a good use of Hadoop? Thanks, P