I have an ongoing problem with Google's spider as well. The pages on some sites I maintain have dynamically inserted <base href=...> elements (a handy trick: you can edit pages in a subdirectory on a test machine and all of the links still work).
The pages also link to separate JavaScript files (to save bandwidth) via <script src=...> elements. Googlebot doesn't understand that the base URL also applies to script URLs. So when I'm at page http://www.example.doc/something, even though the base points to "http://www.example.com/" and the script src is "scripts/widget.js", Googlebot looks for the file at, you guessed it, "http://www.example.doc/something/scripts/widget.js". I wrote to Google's technical people about this. No response.
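For anyone who wants to see the difference concretely, here's a small sketch using Python's urllib.parse.urljoin (which follows the standard RFC 3986 resolution rules). The first call shows what a conforming client should do when <base> is honored; the second shows what you get if a crawler ignores <base> and resolves the relative src against the page's own URL instead. (The trailing slash on the page URL is my assumption, to reproduce the exact path Googlebot requested in my case.)

```python
from urllib.parse import urljoin

# What should happen: the relative script src is resolved
# against the <base href> value, per the HTML spec.
correct = urljoin("http://www.example.com/", "scripts/widget.js")
print(correct)  # http://www.example.com/scripts/widget.js

# What a <base>-ignoring crawler does: it resolves the same
# relative src against the page URL on the test domain.
wrong = urljoin("http://www.example.doc/something/", "scripts/widget.js")
print(wrong)    # http://www.example.doc/something/scripts/widget.js
```

Both results are "correct" relative-URL resolutions; the bug is simply in which base URL the crawler picks, which is why the broken requests look so plausible in the server logs.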