I've been having issues with scripts that ask IE to visit a large number
of URLs: eventually IE dies and the script fails.  In these cases,
ruby.exe sits at around 30 MB of memory, Excel (used to import
data) is around 5 MB, and IE works its way up from 100 MB to 1.5 GB.

At its most basic, I have a script that gathers around 2,000
strings into a single array.  The script then iterates through the
array, formatting a URL for IE from each string, and IE visits each
page with goto.  For instance, I boiled a script down to the code
below and see the same memory problem:

array = [blah] # around 2,000 entries

def test(ie, array)
  array.each do |city|
    url = "http://www.google.com/search?q=#{city}"
    ie.goto url

    if ie.contains_text(city)
      puts "yay"
    else
      puts "nay"
    end # if
  end # array.each
end # test

I've omitted the assignment of the array, but since Excel and Ruby
aren't hoarding memory, I don't believe it's related.  So in this
case, why is IE eating so much memory?  I also tried adding
ie = Watir::IE.new and ie.close inside the function, but
opening and closing IE for each URL used even more memory.
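[Editor's note: one middle ground between never restarting IE and restarting it per URL is to recycle the browser every N URLs with Enumerable#each_slice. The sketch below shows only the batching logic and runs anywhere; the Watir calls (Watir::IE.new, goto, contains_text, close) are left as comments, and BATCH_SIZE is an untested assumption, not a verified number.]

```ruby
# Batching sketch: split ~2,000 cities into batches, opening a fresh IE
# per batch and closing it afterwards so any memory IE leaked is released.
BATCH_SIZE = 100  # assumed value; tune against how long IE survives

cities = (1..2000).map { |i| "city#{i}" }  # stand-in for the real array

batches = 0
cities.each_slice(BATCH_SIZE) do |batch|
  # ie = Watir::IE.new              # fresh browser per batch
  batch.each do |city|
    url = "http://www.google.com/search?q=#{city}"
    # ie.goto url
    # puts ie.contains_text(city) ? "yay" : "nay"
  end
  # ie.close                        # drop whatever IE accumulated
  batches += 1
end

puts batches  # 2000 cities / 100 per batch = 20 batches
```

If IE dies around 400 URLs today, a batch size safely under that (the 100 assumed here) would let the run finish at the cost of ~20 browser restarts.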

Is there another way I should be doing this?  Right now IE
survives for about 400 URLs and then dies.  I'm running it on an XP
laptop with 4 GB RAM and dual 2.2 GHz processors.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Watir General" group.
To post to this group, send email to watir-general@googlegroups.com
Before posting, please read the following guidelines: 
http://wiki.openqa.org/display/WTR/Support
To unsubscribe from this group, send email to 
watir-general-unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/watir-general
-~----------~----~----~----~------~----~------~--~---