There is one commonly used alternative that does prevent robot
visitors from harvesting data, and that's to limit what a robot can
accomplish. You can do this the way Google and Yahoo do: count
transactions and cut off access for any particular IP once its
allotment of transactions per time period is used up. Granted, it's
a pain in the butt to implement such features, but it does keep those
robots at bay. If all your data is contained in a single XML file,
then this method is useless; if it takes a lot of calls to your
program to retrieve your database, then you can use it.
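A minimal sketch of that transaction-counting idea, assuming a simple fixed-window counter kept in memory and keyed by client IP (the class name, limits, and window length here are all made up for illustration, not Google's or Yahoo's actual mechanism):

```python
import time
from collections import defaultdict


class IPRateLimiter:
    """Fixed-window transaction counter, one counter per client IP."""

    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        # ip -> [window_start_time, request_count]
        self.counters = defaultdict(lambda: [0.0, 0])

    def allow(self, ip, now=None):
        """Return True if this IP still has allotment left, else False."""
        now = time.time() if now is None else now
        start, count = self.counters[ip]
        if now - start >= self.window:
            # Time period expired: start a fresh window for this IP.
            self.counters[ip] = [now, 1]
            return True
        if count < self.max_requests:
            self.counters[ip][1] = count + 1
            return True
        # Allotment used up; deny until the window rolls over.
        return False
```

In a real server you would call `allow()` at the top of each request handler and return an error (e.g. HTTP 429) when it comes back False; a production version would also need shared storage if you run more than one server process.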

-John Coryat

http://maps.huge.info

http://www.usnaviguide.com
Posted to the Google Groups "Google Maps API" group.