On 2007-05-22, Robert M Robinson <[EMAIL PROTECTED]> wrote:

>  That brings me to my question.  I have noticed that when editing large files 
>  (millions of lines), deleting a large number of lines (say, hundreds of 
>  thousands to millions) takes an unbelievably long time in VIM--at least on 
>  my systems.  This struck me as so odd, I looked you up (for the first time 
>  in all my years of use) so I could ask why!
> 
>  Seriously, going to line 1 million of a 2 million line file and typing the 
>  command ":.,$d" takes _minutes_ on my system (Red Hat Linux on a 2GHz Athlon 
>  processor (i686), 512 KB cache, 3 GB memory), far longer than searching the 
>  entire 2 million line file for a single word (":g/MyQueryName/p").  Doing it 
>  this way fits way better into my usual workflow than using "head -n 
>  1000000", because of course I'm using a regular expression search to 
>  determine that I want to truncate my file at line 1000000 in the first 
>  place.
> 
>  I looked in the archive, and couldn't see that this issue had been raised 
>  before.  Is there any chance it can get added to the list of performance 
>  enhancement requests?

Do you have syntax highlighting enabled?  That can really slow vim 
down.
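
One quick way to rule that out (a sketch; substitute your own file 
name) is to turn highlighting off before the delete:

   :syntax off

or to start vim with no configuration at all:

   vim -u NONE two_million_lines

With -u NONE, vim skips your vimrc and plugins entirely, so any 
remaining slowness isn't coming from highlighting or other settings.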

I created and opened a file as follows:

   # emit the same 60-character line forever; head truncates the
   # stream at 2,000,000 lines
   while true
   do
      echo '123456789012345678901234567890123456789012345678901234567890'
   done | head -2000000 > two_million_lines
   vim two_million_lines

Then within vim executed:

   50%
   :.,$d
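
If you want to reproduce the delete non-interactively (a sketch; the 
line number is just the 50% point of this particular file, and note 
that the "wq" writes the truncated file back), the same operation can 
be timed from the shell:

   time vim -u NONE -c '1000000,$d' -c 'wq' two_million_lines

The -c options run ":1000000,$d" and ":wq" as ex commands at startup, 
and -u NONE again keeps vimrc settings out of the measurement.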

Using vim 7.1 under Cygwin and Windows XP on a 3.6 GHz Pentium with 
2 GB of RAM:  9 seconds.

Using vim 7.1 under Red Hat Enterprise Linux WS release 4 on a 2.8 
GHz Pentium with 500 MB RAM:  16 seconds.
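
If it's still slow with highlighting off, the other usual costs of a 
delete this size are the undo history and the swap file, both of 
which record every removed line.  As an experiment (only on a file 
you can regenerate, since u won't bring the lines back):

   vim -n two_million_lines
   :set undolevels=-1
   :.,$d

Here -n keeps vim from using a swap file and 'undolevels=-1' disables 
undo.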

Regards,
Gary

-- 
Gary Johnson                 | Agilent Technologies
[EMAIL PROTECTED]     | Mobile Broadband Division
                             | Spokane, Washington, USA
