The URLs never repeat, and it's a very active table, so I don't want to compress right now.

And you're right, most URLs are < 255 chars, but some are bigger, so I can't use VARCHAR. I guess I'll just use TEXT :)

Thanks!
Peter

On 9/12/06, Mike Wexler <[EMAIL PROTECTED]> wrote:
> Peter Van Dijck wrote:
> > Hi,
> > URLs have a practical limit of 2083 characters, it seems. To store
> > these in a space-efficient way (I have millions of URL rows), what's
> > the best approach? VARCHAR has a 255 maximum, right? Should I just
> > use TEXT? I'm not searching *in* the URLs; I am selecting like this:
> > "where url = 'xxx'".
>
> Do the URLs occur multiple times? If so, I would create a URL table
> that has the URL and a primary auto_increment key, then just reference
> the key for each instance. The other thing you could do is use the
> COMPRESS function.
>
> Note that while URLs can be 2083 characters, they generally aren't. So
> if you use a TEXT field and have 1,000,000 URLs, and the average URL
> is x characters long, you need (x + overhead) * 1,000,000 bytes. I
> would guess that for most situations (x + overhead) is less than 200,
> so that is only about 200 MB, not particularly huge. You can probably
> save a factor of 2 or 3 with COMPRESS. If the URLs repeat a lot, you
> can probably save a lot more than that with a separate URL table.
>
> > Thanks,
> > Peter
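To make the suggestions and the sizing arithmetic above concrete, here is a minimal sketch using Python's sqlite3 and zlib modules as stand-ins for MySQL (MySQL's COMPRESS() is zlib-based); the table and column names are made up for illustration:

```python
import sqlite3
import zlib

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Mike's first suggestion: a separate URL table with an auto-increment
# primary key, referenced from the busy table. This only saves space
# when URLs repeat.
cur.execute("CREATE TABLE urls (id INTEGER PRIMARY KEY AUTOINCREMENT, url TEXT UNIQUE)")
cur.execute("CREATE TABLE hits (url_id INTEGER REFERENCES urls(id))")

def url_id(url):
    # Insert the URL once; later occurrences reuse the existing key.
    cur.execute("INSERT OR IGNORE INTO urls (url) VALUES (?)", (url,))
    cur.execute("SELECT id FROM urls WHERE url = ?", (url,))
    return cur.fetchone()[0]

long_url = "http://example.com/?q=" + "x" * 2000   # longer than VARCHAR(255) allows
assert url_id(long_url) == url_id(long_url)        # repeats map to one row

# Exact-match lookup, as in "where url = 'xxx'". (In MySQL, an index on
# a TEXT column needs a prefix length, e.g. INDEX (url(255)).)
cur.execute("SELECT id FROM urls WHERE url = ?", (long_url,))
print(cur.fetchone())

# Mike's back-of-the-envelope sizing: (avg length + overhead) * rows.
rows = 1_000_000
avg_len_plus_overhead = 200          # assumed, as in the post
print(rows * avg_len_plus_overhead)  # 200000000 bytes, about 200 MB

# COMPRESS()-style saving: URLs compress well, being repetitive text.
url = "http://example.com/some/long/path?session=abc123&tracking=456" * 3
packed = zlib.compress(url.encode())
print(len(url.encode()), len(packed))  # compressed form is smaller
```

Since the URLs in this thread never repeat, the separate-table idea buys nothing here; a plain TEXT column, optionally with COMPRESS(), is the simpler fit.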
--
find videoblogs: http://mefeedia.com
my blog: http://poorbuthappy.com/ease/
my job: http://petervandijck.net

--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]