On Wed, Sep 19, 2012 at 10:36 AM, Peter Hickman
<[email protected]> wrote:
> On 19 September 2012 09:24, Robert Klemme <[email protected]> wrote:
>> On Mon, Sep 17, 2012 at 8:20 PM, Peter Zotov <[email protected]> 
>> wrote:
>>> That being said, I wouldn't write number-crunching algorithms in Ruby, or
>>> work with gigabyte-sized datasets.
>>
>> Well, even that depends.  Sifting through logfiles of that volume is
>> almost certainly IO-bound, so the processing speed can be negligible.
>> That, of course, depends on the processing.
>
> Being a Ruby shop, we try to use Ruby everywhere, but when it comes
> to grinding through logfiles (some several gigs in size) we had to go
> with Perl. The string processing and regex matching were so much
> faster. We couldn't get Ruby even close :( Speed was essential; we
> couldn't just sit around and wait for the process to complete.

Was that 1.8.* or did you try that with 1.9.* MRI?  What kind of
processing did you do and how big were the differences?
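
For concreteness, the kind of scan I have in mind is something like the
sketch below: stream the file line by line so memory use stays flat and
the run is dominated by IO, not Ruby-level work. The log format, the
ERROR pattern, and the helper name are all hypothetical, just to make
the shape of the loop concrete:

```ruby
require "stringio"

# Count lines matching a pattern by streaming the input one line at a
# time; each_line never loads the whole file, so a multi-gigabyte log
# costs constant memory.
def count_matches(io, pattern = /ERROR/)
  count = 0
  io.each_line do |line|
    count += 1 if line =~ pattern
  end
  count
end

# Stand-in for File.open("huge.log") { |f| count_matches(f) }:
log = StringIO.new("INFO ok\nERROR boom\nERROR again\n")
puts count_matches(log)  # => 2
```

With a scan shaped like this, the interesting question is how much of
the wall-clock time is the disk and how much is the regexp engine.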

> Right tool for the job, I guess. Ruby is fine for everything else, so
> switching it all over to Perl would not really gain us anything.
> Besides, if you are a programmer, knowing more than one language is a
> must.

+1

Cheers

robert

-- 
remember.guy do |as, often| as.you_can - without end
http://blog.rubybestpractices.com/
