Hello Sir,

I have some doubts; please help me.
We have a requirement for a scalable storage system. We have developed an
agro-advisory system in which farmers send crop pictures, typically 6-7
photos of 3-4 KB each, uploaded in sequence. These photos are stored on a
storage server and later read back sequentially by scientists to diagnose
the problem; the images are never modified after being written.

For storing these images we are using the Hadoop file system (HDFS). Is it
feasible to use HDFS for this purpose?

Also, since the images are only 3-4 KB each while HDFS uses a default block
size of 64 MB, how can we improve performance? What tricks and tweaks
should be applied to make Hadoop work for this kind of workload?

The next problem is that Hadoop stores all file metadata in the NameNode's
main memory. Can we use some mechanism to pack the files into larger
blocks? Because the files are so small, HDFS will accumulate a lot of
metadata and exhaust main memory.
Please suggest what could be done.
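(One common workaround for the small-files problem is to pack many small images into a single large container file plus an index, so HDFS tracks one big file instead of thousands of tiny ones; this is the idea behind Hadoop SequenceFiles and HAR archives. Below is a minimal, Hadoop-independent sketch of the concept using a local file; the helper names `pack_images` and `read_image` are hypothetical, for illustration only.)

```python
def pack_images(image_blobs, container_path):
    """Append each small image to one container file and
    return an index mapping name -> (offset, length).
    With HDFS, the container would be the one large file stored."""
    index = {}
    with open(container_path, "wb") as out:
        for name, blob in image_blobs.items():
            index[name] = (out.tell(), len(blob))  # record where this image starts
            out.write(blob)
    return index

def read_image(container_path, index, name):
    """Fetch one image back via the index: a single seek + read,
    which maps well onto sequential reads of a packed file."""
    offset, length = index[name]
    with open(container_path, "rb") as f:
        f.seek(offset)
        return f.read(length)
```

In Hadoop itself the equivalent would be writing the photos as key/value records into a SequenceFile (filename as key, image bytes as value), which the scientists' jobs can then scan sequentially.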


regards,
Snehal
