Pro tip. I've seen both push-based systems and pull-based systems at work. The push-based systems tend to break whenever the thing that you're pushing to has problems. Pull-based systems tend to be much more reliable in my experience.
You have described a push-based system. I would therefore
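The pull-based pattern described above can be sketched as a worker that claims work at its own pace; the queue and job names here are hypothetical, just to show the shape:

```perl
use strict;
use warnings;

# Sketch of a pull-based worker: the consumer asks for work when it is
# ready, so a slow or crashed consumer never breaks the producer.
my @queue = map { "job-$_" } 1 .. 3;   # stands in for a shared queue

while (1) {
    my $job = shift @queue;            # "pull": claim the next unit of work
    last unless defined $job;
    print "processing $job\n";
    # if this worker dies here, the remaining jobs simply wait to be pulled
}
```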
Ben Tilly emitted:
Pro tip. I've seen both push-based systems and pull-based systems at work. [...]
If you disregard this tip,
Queuing systems aren't really new or 'technofrippery'. In-memory FIFO stacks are ridiculously fast compared to transaction-safe RDBMSes for this simple purpose. Databases incur a lot of overhead for wonderful things that don't aid this cause.
This isn't magic, sometimes it's just the right tool
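A plain Perl array already gives the in-memory FIFO behavior being described (push on one end, shift off the other); a minimal sketch:

```perl
use strict;
use warnings;

# In-memory FIFO queue using a plain array: no transaction or disk
# overhead, which is why it is so much faster than an RDBMS here.
my @fifo;

push @fifo, $_ for qw(first second third);   # producer end

while (my $item = shift @fifo) {             # consumer end
    print "$item\n";                         # dequeues in insertion order
}
```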
On Fri, Apr 5, 2013 at 12:04 PM, John Redford eire...@hotmail.com wrote:
Ben Tilly emitted:
[...]
Anthony Caravello writes:
Queuing systems aren't really new or 'technofrippery'. [...]
No one said queuing
Ben Tilly expands:
On Fri, Apr 5, 2013 at 12:04 PM, John Redford eire...@hotmail.com wrote:
Your writing is FUD.
Are you reading something into what I wrote that wasn't there?
Because I'm pretty sure that what I wrote isn't FUD.
It was. Ask anyone. I'm not your English tutor.
A
I bow to you. I've been on this list for a long time and figured my 20 years of development and engineering experience might be of assistance, and for the first time I offered it. From now on, you should answer all the questions.
-unsubscribe
On Apr 5, 2013 6:05 PM, John Redford
Approved for list. - bill
-- Forwarded message --
From: Greg London em...@greglondon.com
Date: Fri, Apr 5, 2013 at 12:22 AM
Subject: RE: intern?
To: r...@tamias.net, bob.cla...@verizon.net, bill.n1...@gmail.com
Cc: Greg
I am currently in the midst of implementing a fairly non-trivial recursive algorithm in Perl. The depth of the recursion is quite large, so much so that I have set no warnings 'recursion', which silences the warning Perl emits at a depth over 100. This seems pretty small to me! If the default is to warn at a depth of 100
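The pragma in question can be scoped to just the recursive sub; a minimal sketch of silencing the depth-100 warning:

```perl
use strict;
use warnings;

sub count_down {
    my ($n) = @_;
    # Lexically disable the "Deep recursion on subroutine" warning,
    # which fires once a sub's call depth exceeds 100.
    no warnings 'recursion';
    return 0 if $n == 0;
    return 1 + count_down($n - 1);
}

print count_down(500), "\n";   # recurses far past the default threshold
```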
On Apr 5, 2013 8:24 PM, Uri Guttman u...@stemsystems.com wrote:
as for your ram usage, all recursions can be unrolled into plain loops by
managing your own stack. this is a classic way to save ram and sub call
overhead. with perl it would be a fairly trivial thing to do. use an array
for the
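The unrolling Uri describes might look like this sketch, which walks a nested structure with an explicit array as the stack instead of recursive sub calls:

```perl
use strict;
use warnings;

# Recursion unrolled into a loop: sum a nested tree of array refs
# while managing our own stack, avoiding deep sub-call overhead.
my $tree = [ 1, [ 2, 3 ], [ 4, [ 5, 6 ] ] ];

my @stack = ($tree);
my $sum   = 0;
while (@stack) {
    my $node = pop @stack;
    if (ref $node eq 'ARRAY') {
        push @stack, @$node;   # children go onto our own stack
    }
    else {
        $sum += $node;
    }
}
print "$sum\n";   # 21
```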
On Fri, Apr 5, 2013 at 6:03 PM, Conor Walsh c...@adverb.ly wrote:
On Apr 5, 2013 8:24 PM, Uri Guttman u...@stemsystems.com wrote:
as for your ram usage, all recursions can be unrolled into plain loops by managing your own stack. [...]
Detailed? What's kept beyond "a called b(arguments...)"? That's not a lot of bytes, unless it's complete deep copies of structures.
-C.
On Apr 5, 2013 9:08 PM, Ben Tilly bti...@gmail.com wrote:
On Fri, Apr 5, 2013 at 6:03 PM, Conor Walsh c...@adverb.ly wrote:
Detailed? What's kept beyond "a called b(arguments...)"? That's not a lot of bytes, unless it's complete deep copies of structures.
perldoc -f caller
package, line number, etc.
Regardless, my understanding was that although perl's sub calls are somewhat expensive compared to some other languages that
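What each call frame carries can be seen directly with caller; a small sketch:

```perl
use strict;
use warnings;

sub inner {
    # caller(0) describes the current frame: package, file, line of the
    # call site, and the fully qualified name of the running sub.
    my ($pkg, $file, $line, $subname) = caller(0);
    return "$subname called from package $pkg at line $line";
}

sub outer { return inner() }

print outer(), "\n";
```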
John, I don't know you, and it's quite possible that I am misinterpreting
your normal modes of communication, but your responses seem quite heated,
intemperate, and rather personal. There are certainly places for that on the
internet,
but the collegiality of this list is one of the reasons I, for
THEORY
Every general computer science over-simplification has a BUT that is very important.
Recursion is as efficient as iteration ...
... IF AND ONLY IF Tail Recursion Optimization is in effect.
When Tail Recursion Optimization is in effect, you do NOT have all that call stack; you're only one level down
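Perl does not perform this optimization automatically, but a tail call can be made explicit with goto &sub, which replaces the current stack frame instead of growing the stack; a sketch with an accumulator:

```perl
use strict;
use warnings;

# Tail-recursive sum: goto &sum_to reuses the current call frame,
# so the call depth never grows past one level.
sub sum_to {
    my ($n, $acc) = @_;
    $acc //= 0;
    return $acc if $n == 0;
    @_ = ($n - 1, $acc + $n);   # arguments for the "next" call
    goto &sum_to;               # tail call: replaces this frame
}

print sum_to(100_000), "\n";    # 5000050000, with no deep-recursion warning
```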
On Fri, Apr 5, 2013 at 6:39 PM, Bill Ricker bill.n1...@gmail.com wrote:
THEORY [...]
On Fri, Apr 5, 2013 at 9:51 PM, Ben Tilly bti...@gmail.com wrote:
When in doubt, benchmark.
I always doubt my benchmarks.
I also doubt benchmarks, but doubt extrapolating from theory
to reality even more.
--
Bill
@n1vux bill.n1...@gmail.com
On Fri, Apr 05, 2013 at 09:03:13PM -0400, Conor Walsh wrote:
why is this that much faster than actual recursion? That speaks
poorly of lowercase-p perl.
This is not a perl specific issue (for the most part). Most languages that support function calls need to maintain an activation record for
function calls are relatively expensive. Certainly more so than iteration or array operations.
Maybe we could get a new pragma:
no overhead 'subs';
;/
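The relative cost of sub calls versus inline iteration is easy to measure with the core Benchmark module (a sketch; exact ratios vary by perl build):

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

sub add_one { return $_[0] + 1 }

# Compare a tight loop that pays sub-call overhead per iteration
# against the same arithmetic done inline.
cmpthese(-1, {
    sub_call => sub { my $x = 0; $x = add_one($x) for 1 .. 1000 },
    inline   => sub { my $x = 0; $x = $x + 1     for 1 .. 1000 },
});
```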
I've been fighting a perl script problem for a while now and just recently figured out a potential solution that just happens to involve a
Date: Fri, 05 Apr 2013 20:23:30 -0400
From: Uri Guttman u...@stemsystems.com
To: boston-pm@mail.pm.org
Subject: Re: [Boston.pm] Perl and recursion
Message-ID: 515f6b02.70...@stemsystems.com
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
On 04/05/2013 08:13 PM, Adam Russell wrote:
I
at each level of recursion. What seems to be the case though is that when we start going back up the stack, that memory doesn't seem to be released at each pop. If, say, at max depth 500mb of ram has been allocated, I don't see that released at any point except for when perl exits, and then of
On 04/05/2013 11:22 PM, Jerrad Pierce wrote:
at each level of recursion. What seems to be the case though is that when we start going back up the stack that memory doesn't seem to be released at each pop. [...]
On 4/6/2013 12:01 AM, Adam Russell wrote:
Ah! Ok, so maybe I was confused about this. Even if I set the last reference to an object to undef, perl will keep the memory until exit? The high-water mark for memory usage never goes down? Well, that is fine I suppose; it isn't like this process will
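The behavior being asked about is that perl normally returns freed memory to its own allocator pool for reuse, not to the operating system, so the process high-water mark stays put; a sketch of why that is usually harmless (platform-dependent):

```perl
use strict;
use warnings;

my $big = [ (0) x 5_000_000 ];   # allocate a large array; RSS jumps

undef $big;                      # refcount hits 0; memory returns to
                                 # perl's pool, not to the OS

my $big2 = [ (0) x 5_000_000 ];  # typically reuses the freed pool,
                                 # so the footprint barely grows again
```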