"Armel Asselin" <[EMAIL PROTECTED]> wrote: > >> Hello, > >> > >> first of all, thanks for the great work! I've been absent for a long time > >> of > >> the scintilla list: it works too well ;-) > >> Nonetheless, I'm in front of this problem (I'm currently stucked with > >> 1.67 > >> but i'd like to know if it's already fixed, I checked history but I could > >> not see anything): > >> - I have a file of 1.5MB all in one word-wrapped line of XML showing > >> correctly > >> - I apply a filter of my own which pretty-prints XML (in a undo-able > >> manner), the XML is now 131000 lines high > >> - Scintilla re-set scrollbars after each line insertion (leading to > >> quadratic update time) > > > > Well, if you instead replace the one line with all 131,000 lines at once > > (SetSelection(), ReplaceSelection()), you shouldn't have that problem. > > > > - Josiah > thank you for the idea, but here i'm in front of predicitibility problem... > I do not know at all in advance if the file that I treat is heavily modified > or in very scarse manner (I mean small modifications spread everywhere), and > i'd like to avoid a 1.5 MB undo action if it can be done with a dozen of > 'few bytes' undo actions.
Unless your operation is itself slow, take a pass over your data first to
determine how many new lines would be created. If newlines > f(len(line)),
for some function f, then replace the whole line at once; otherwise do the
iterative thing. My suggestion for f is the square root: as long as you
are creating fewer than sqrt(len(content)) new lines, the quadratic
behavior of the mechanism you are using will stay below O(n) overall
running time. When you would exceed O(n) overall running time, switch to
the 'replace everything' mechanism. If you are more concerned about
undo/redo space, maybe use some other function of n.

- Josiah

_______________________________________________
Scintilla-interest mailing list
[EMAIL PROTECTED]
http://mailman.lyra.org/mailman/listinfo/scintilla-interest
