Re: [Tutor] String concatenation too slow

2008-07-01 Thread Kent Johnson
On Tue, Jul 1, 2008 at 5:25 AM, W W <[EMAIL PROTECTED]> wrote:

> You might find this study and the links in it helpful! We just had a
> discussion on this very concept; my guess is that you'll find the
> results informative and especially helpful.

There is also an interesting recent thread on comp.lang.python:
http://groups.google.com/group/comp.lang.python/browse_thread/thread/cdef678dd995c54f/2af9e9eed46bf18c?lnk=gst

One thing I didn't know is that the optimization for string += occurs
only under certain specific circumstances.
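For instance (a sketch, not from the thread; function names are
illustrative), CPython's in-place resize for "s += t" only applies when
"s" is the sole reference to the string, so merely holding a second
reference can push the loop back toward quadratic copying:

```python
# Sketch: CPython can resize a str in place on "s += t", but only when
# "s" holds the only reference; an alias forces a full copy each step.
def concat_sole_reference(n):
    s = ''
    for _ in range(n):
        s += 'x'          # refcount-1 fast path may apply here
    return s

def concat_with_alias(n):
    s = ''
    for _ in range(n):
        alias = s         # second reference defeats the in-place resize
        s += 'x'          # now every += copies the whole string
    return s
```

Both functions produce the same string; only the second one loses the
fast path, which is why relying on this optimization is fragile.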

Kent
___
Tutor maillist  -  Tutor@python.org
http://mail.python.org/mailman/listinfo/tutor


Re: [Tutor] String concatenation too slow

2008-07-01 Thread W W
> On Tuesday 01 July 2008 00:12, Shrutarshi Basu wrote:

>> I've been
>> using string concatenation to read through the string, and then create
>> the new one based on a dictionary lookup. However it becomes very slow
>> once the string gets very long (several thousand characters). Part of
>> it is undoubtedly due to the fact that the algorithm is quadratic (I'm
>> trying to find a better way) but I was wondering if there might be a
>> faster alternative to string concatenation. Would appending to a list
>> of strings be faster? I'm going to be doing thousands of these
>> appends, so even a small boost would be helpful.
>> Thanks,
>> Basu

Basu,

You might find this study and the links in it helpful! We just had a
discussion on this very concept; my guess is that you'll find the
results informative and especially helpful.

At 04:28 AM 6/27/2008, Kent Johnson wrote:

On Fri, Jun 27, 2008 at 6:48 AM, Dick Moores <[EMAIL PROTECTED]> wrote:

> Instead I've tried to find out if it's true what Alex Martelli
> writes on p. 484 in the section, "Building up a string from pieces",
> in his _Python in a Nutshell_, 2nd ed., which covers Python 2.4x.

You might be interested in this, complete with a picture:
http://personalpages.tds.net/~kent37/blog/arch_m1_2004_08.html#e55

and this followup:
http://personalpages.tds.net/~kent37/blog/arch_m1_2004_08.html#e56

HTH,
Wayne


Re: [Tutor] String concatenation too slow

2008-06-30 Thread Chris Fuller
You could try creating a list of strings, and then using a ''.join(list) to 
concatenate, but you are probably going to be best off using the cStringIO 
module (http://docs.python.org/lib/module-cStringIO.html).  Try timing all 
three and see how they compare when joining lots of strings.  The length of 
the strings probably won't matter as much as the number of them.
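One way to run that three-way timing (a sketch; cStringIO is Python 2
only, so this uses io.StringIO, its closest Python 3 analogue, and the
piece count is arbitrary):

```python
import io
import timeit

PIECES = ['abc'] * 10000   # many short strings, as in the L-system case

def build_concat():
    s = ''
    for piece in PIECES:
        s = s + piece      # repeated concatenation: potentially quadratic
    return s

def build_join():
    return ''.join(PIECES) # single pass over all pieces

def build_stringio():
    buf = io.StringIO()    # in-memory file-like buffer (cStringIO in Py2)
    for piece in PIECES:
        buf.write(piece)
    return buf.getvalue()

if __name__ == '__main__':
    for fn in (build_concat, build_join, build_stringio):
        print(fn.__name__, timeit.timeit(fn, number=10))
```

All three build the identical string; the printed timings are what
tell you which approach wins on your interpreter and workload.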

Cheers

On Tuesday 01 July 2008 00:12, Shrutarshi Basu wrote:
> I'm working on a program to create Lindenmayer systems. These systems
> depend on heavy string rewriting to form complex patterns. I've been
> using string concatenation to read through the string, and then create
> the new one based on a dictionary lookup. However it becomes very slow
> once the string gets very long (several thousand characters). Part of
> it is undoubtedly due to the fact that the algorithm is quadratic (I'm
> trying to find a better way) but I was wondering if there might be a
> faster alternative to string concatenation. Would appending to a list
> of strings be faster? I'm going to be doing thousands of these
> appends, so even a small boost would be helpful.
> Thanks,
> Basu


Re: [Tutor] String concatenation too slow

2008-06-30 Thread Shrutarshi Basu
My bad, should have included some code. Here's the function which does
the grunt work. self.axiom is a string, where each character gets
replaced by its counterpart from self.rules. output then goes back to
the calling function. That's the end of one generation of the string.
The next generation happens when generate() is called again after
axiom has been replaced (usually by output plus some additions).

def generate(self):
    output = ''
    for element in self.axiom:
        if element in self.rules:   # test the dict directly, not .keys()
            output = output + self.rules[element]
        else:
            output = output + element
    return output

Looking at Wesley's example, the function should then yield each
character replacement (a dict lookup), and join() would put
everything together. I think that would be faster. Will try soon.
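A minimal sketch of that rewrite (the class and the rule set here are
illustrative stand-ins, not the original program): dict.get supplies
the per-character fallback, and a single ''.join assembles the result:

```python
class LSystem:
    """Illustrative stand-in for the original class."""
    def __init__(self, axiom, rules):
        self.axiom = axiom
        self.rules = rules

    def generate(self):
        # dict.get(element, element) returns the replacement, or the
        # character itself when no rule applies -- no if/else needed
        return ''.join(self.rules.get(element, element)
                       for element in self.axiom)

# Lindenmayer's algae system as a small usage example
algae = LSystem('A', {'A': 'AB', 'B': 'A'})
print(algae.generate())   # 'AB'
```

This does one dict lookup per character and one join per generation,
avoiding the quadratic growth of repeated concatenation.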
Thanks,
Basu

On Tue, Jul 1, 2008 at 1:37 AM, wesley chun <[EMAIL PROTECTED]> wrote:
>> I've been
>>  using string concatenation to read through the string, and then create
>>  the new one based on a dictionary lookup. However it becomes very slow
>>  once the string gets very long (several thousand characters). [...]
>> I was wondering if there might be a
>>  faster alternative to string concatenation. Would appending to a list
>>  of strings be faster?
>
> without knowing more about your application, my 1st inclination would
> be to turn your code that "appends" each successive addition to the
> string into a generator function.  then when you need the final
> massively large string, i'd use a generator expression inside the call
> to the delimiter's join() method.
>
> for example:
>
> def nextLsystem(...):
>     :
>     for n in range(XXX):
>         # blah-blah stuff in a loop
>         yield nextStr
>
> final = ''.join(x for x in nextLsystem(XXX))
>
> i like this code because it doesn't keep building up a data structure
> like continuously concatenating strings nor continually appending to a
> list, both of which are memory-intensive.
>
> i'm using a generator to create each successive string, without saving
> previous result necessarily.  then the generator expression -- unlike
> a list comprehension which must build an entire list -- passes each
> string to join(), which then creates the final string.
>
> i'm sure others have better ideas, but like I said, it's just a gut
> shot from me here.
>
> good luck!
> -- wesley



-- 
The ByteBaker :
http://www.bytebaker.com


Re: [Tutor] String concatenation too slow

2008-06-30 Thread wesley chun
> I've been
>  using string concatenation to read through the string, and then create
>  the new one based on a dictionary lookup. However it becomes very slow
>  once the string gets very long (several thousand characters). [...]
> I was wondering if there might be a
>  faster alternative to string concatenation. Would appending to a list
>  of strings be faster?

without knowing more about your application, my 1st inclination would
be to turn your code that "appends" each successive addition to the
string into a generator function.  then when you need the final
massively large string, i'd use a generator expression inside the call
to the delimiter's join() method.

for example:

def nextLsystem(...):
    :
    for n in range(XXX):
        # blah-blah stuff in a loop
        yield nextStr

final = ''.join(x for x in nextLsystem(XXX))

i like this code because it doesn't keep building up a data structure
like continuously concatenating strings nor continually appending to a
list, both of which are memory-intensive.

i'm using a generator to create each successive string, without saving
previous result necessarily.  then the generator expression -- unlike
a list comprehension which must build an entire list -- passes each
string to join(), which then creates the final string.
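One hedged way to fill in the sketch above for the L-system case (all
names and the sample rule set are illustrative): the generator yields
one finished generation per step, each built with a single join:

```python
def lsystem_generations(axiom, rules, depth):
    """Yield successive rewritten strings, one generation at a time."""
    current = axiom
    for _ in range(depth):
        # one join per generation; each character is looked up once
        current = ''.join(rules.get(c, c) for c in current)
        yield current

# usage: successive generations come straight out of the generator
gens = list(lsystem_generations('A', {'A': 'AB', 'B': 'A'}, 3))
print(gens)   # ['AB', 'ABA', 'ABAAB']
```

Only the current generation is held in memory at any moment, which is
the point of the generator-based approach.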

i'm sure others have better ideas, but like I said, it's just a gut
shot from me here.

good luck!
-- wesley
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
"Core Python Programming", Prentice Hall, (c)2007,2001
http://corepython.com

wesley.j.chun :: wescpy-at-gmail.com
python training and technical consulting
cyberweb.consulting : silicon valley, ca
http://cyberwebconsulting.com


[Tutor] String concatenation too slow

2008-06-30 Thread Shrutarshi Basu
I'm working on a program to create Lindenmayer systems. These systems
depend on heavy string rewriting to form complex patterns. I've been
using string concatenation to read through the string, and then create
the new one based on a dictionary lookup. However it becomes very slow
once the string gets very long (several thousand characters). Part of
it is undoubtedly due to the fact that the algorithm is quadratic (I'm
trying to find a better way) but I was wondering if there might be a
faster alternative to string concatenation. Would appending to a list
of strings be faster? I'm going to be doing thousands of these
appends, so even a small boost would be helpful.
Thanks,
Basu

-- 
The ByteBaker :
http://www.bytebaker.com