Hi there,

I have a MySQL table with about 40,000 entries. I also have a Rev 2.5 CGI script that sends (very) complex SELECT requests, in which the WHERE clause can feature as many as 50 nested booleans with "and", "or", "binary", etc.

For quite some time I've noticed that requests to MySQL slow down my scripts terribly, both because of the complexity of the requests and also because of (imho) a buffer size problem when the amount of data returned is huge. I've tried several tricks to improve the speed of my scripts (like "one big request" vs. "several small requests in a loop"), but with no luck.

Finally, I tried the following: I dumped the content of myTable as a text file, opened it in the Rev script, and did the selection of records inside a "repeat for each line" loop. To my surprise, the speed of the script improved by almost 40% (which is a lot for a script that used to take 5 to 6 seconds and now takes 3.5 to 4 seconds).

Well, I don't know what conclusion to draw from this, besides the obvious superiority of Transcript... I guess some wise and experienced folks will tell me that for sophisticated DB processing I should have switched to a better product (like Valentina) long ago. But nevertheless, I'm curious to know whether anyone else has already faced the need to (almost) completely drop SQL in favor of Transcript for DB data searches.

Best,
JB

_______________________________________________
use-revolution mailing list
use-revolution@lists.runrev.com
Please visit this url to subscribe, unsubscribe and manage your subscription preferences:
http://lists.runrev.com/mailman/listinfo/use-revolution
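In case it helps, here is roughly what the "repeat for each line" filtering looks like in Transcript (a simplified sketch only -- the file name "myTable.txt", the tab-delimited layout, and the item numbers are placeholders, not my actual schema or conditions):

```
-- read the dumped table: one record per line, tab-delimited columns
put URL "file:myTable.txt" into theData
set the itemDelimiter to tab

-- keep only the records matching the conditions
-- (item 3 and item 5 stand in for real column positions)
put empty into theResults
repeat for each line theLine in theData
  if item 3 of theLine is "someValue" and item 5 of theLine > 100 then
    put theLine & return after theResults
  end if
end repeat
delete the last char of theResults -- drop the trailing return
```

The "repeat for each" form is important here: it walks the text in a single pass, whereas "repeat with i = 1 to the number of lines" re-scans the variable on every iteration and gets much slower on large data.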