The previous thread turned into a bit of a confusing set of replies. I'll do my best to answer the issue at hand; please reference past posts if my reply is not clear.

On May 28, 2009, at 6:04 AM, PJ wrote:

Could you clarify/expand on this a bit? I am setting up a site where I
expect to have a lot of images, both stills and FLVs, and a lot of
recipes (including ingredients, procedures, text & images).

I would put some thought into how you store the recipes. You mention XML below; this will all depend on the structure of your recipe data. In a simple case, each recipe is just a list of quantity/item pairs. A parent recipes table holding the title, and maybe a description and instructions, linked to a second table with quantity and item columns, would work fine.

That is also fairly rigid, and you never know where your data needs are going to deviate from that form. Put some thought into this: get a good sample of your recipe data to make sure you build this out in a way that stays flexible going forward.
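
As a rough illustration, here is a minimal sketch of that parent/child layout, assuming mysql-connector-python; the connection settings, table names, and column sizes are placeholders, not a recommendation for your exact schema.

# Minimal sketch of a parent/child recipe schema; connection settings,
# table names, and column sizes are placeholders.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="app", password="secret", database="cookbook"
)
cur = conn.cursor()

# Parent table: one row per recipe.
cur.execute("""
    CREATE TABLE IF NOT EXISTS recipes (
        id           INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        title        VARCHAR(255) NOT NULL,
        description  TEXT,
        instructions TEXT
    ) ENGINE=InnoDB
""")

# Child table: one row per ingredient line, linked back to its recipe.
cur.execute("""
    CREATE TABLE IF NOT EXISTS recipe_ingredients (
        id        INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
        recipe_id INT UNSIGNED NOT NULL,
        quantity  VARCHAR(64),
        item      VARCHAR(255) NOT NULL,
        FOREIGN KEY (recipe_id) REFERENCES recipes(id)
    ) ENGINE=InnoDB
""")

conn.commit()
conn.close()

Because each ingredient is its own row, a recipe can have any number of them, which keeps things a little less rigid than fixed ingredient columns.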

I am storing the images in the /images directory. If the number of images gets
rather large, you are suggesting storing them on another server, right?

I would first start by researching how your OS deals with large quantities of images in a single directory. Write a script to copy one image 100,000 times over into the same directory, then test how fast you can grab a random image and how fast you can delete a random image.
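
Something along these lines would give you rough numbers on your particular filesystem (a sketch; the source image, directory name, and count are placeholders):

# Rough benchmark sketch: fill one directory with many copies of a single
# image, then time a random read and a random delete. Paths are placeholders.
import os
import random
import shutil
import time

SRC = "sample.jpg"        # any image you have on hand
DEST_DIR = "stress_test"
COUNT = 100000

os.makedirs(DEST_DIR, exist_ok=True)
for i in range(COUNT):
    shutil.copyfile(SRC, os.path.join(DEST_DIR, "img_%06d.jpg" % i))

names = os.listdir(DEST_DIR)

# Time grabbing a random image.
start = time.time()
with open(os.path.join(DEST_DIR, random.choice(names)), "rb") as f:
    f.read()
print("random read:   %.4f seconds" % (time.time() - start))

# Time deleting a random image.
start = time.time()
os.remove(os.path.join(DEST_DIR, random.choice(names)))
print("random delete: %.4f seconds" % (time.time() - start))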

Different OSes will behave differently under different file counts.

Even from a pure CLI management perspective, a simple `ls -la` will take some time to finish on 100,000 images.

I generally do something along the lines of:
images/$user-id/$year/$month/$day/$image-name-$random.ext
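
In Python terms, building that kind of path might look something like this (a sketch; the base directory and the random-suffix scheme are just examples):

# Sketch of the user/date-sharded layout above; base directory and
# naming scheme are examples only.
import os
import uuid
from datetime import date

def image_path(user_id, original_name, base="images"):
    today = date.today()
    name, ext = os.path.splitext(original_name)
    filename = "%s-%s%s" % (name, uuid.uuid4().hex[:8], ext)
    return os.path.join(
        base, str(user_id),
        "%04d" % today.year, "%02d" % today.month, "%02d" % today.day,
        filename,
    )

# e.g. image_path(42, "sunset.jpg") -> images/42/2009/05/28/sunset-3fa9c1d2.jpg

Spreading files across per-user and per-date directories keeps any single directory from growing into the hundreds of thousands of entries.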

Back to your question: just because the number of images gets large does not mean you need multiple servers. It is only when the request load for those images exceeds what the server can handle that you may want to look into distributing that load.

Distributing image load over HTTP, if that is what you are doing, is almost trivial: options range from round-robin DNS plus rsync up to more advanced load balancing, or a CDN like the one Amazon offers. Solutions abound in this area.

Now, with a lot of recipes, I understand that I should also be storing them
on another server, and perhaps using XML to store the recipes. Does
that sound like I have understood your advice?

I am not so sure. Recipes are text only and take very little space. A simple phpBB forum may hold gigabytes of data on a shared database server. You may want a second database server for replication, as a "hot backup", and then have other measures in place for "cold backups".
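
On the "cold backup" side, even something as simple as a scheduled mysqldump run can cover you (a sketch; the credentials, database name, and backup directory are placeholders):

# Sketch of a scheduled "cold backup" via mysqldump; credentials, database
# name, and backup path are placeholders.
import subprocess
from datetime import datetime

dumpfile = "/backups/cookbook-%s.sql" % datetime.now().strftime("%Y%m%d-%H%M")
with open(dumpfile, "wb") as out:
    subprocess.check_call(
        ["mysqldump", "--user=backup", "--password=secret",
         "--single-transaction", "cookbook"],
        stdout=out,
    )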

This all depends on a lot of factors about your end plans which have not yet been shared.
--
Scott * If you contact me off list replace talklists@ with scott@ *

