Hi,

I rewrote the csv parser in order to learn how to use Parslet. I have no
problems using the built-in csv gem. However, as it stands now, with an
execution time of over 30 seconds for a decently large csv file, I could not
use my own parser even if there were no alternative.

So, what I am interested in is the reason why my parser takes that long.
Did I make a mistake, or will Parslet always be relatively slow on larger
files? As I could not find any bottleneck in my code, I was hoping that
someone on this list could help me out.

Stefan

PS. Code of the csv parser benchmark:  https://gist.github.com/1192215
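For reference, the stdlib side of the comparison can be reproduced with just the csv and benchmark libraries (the Parslet parser itself is in the gist above). This is a minimal sketch using synthetic data, not the exact benchmark code from the gist:

```ruby
require 'csv'
require 'benchmark'

# Synthetic stand-in for the large test file used in the real benchmark.
data = ([%w[one two three four five].join(',')] * 10_000).join("\n")

elapsed = Benchmark.realtime do
  rows = CSV.parse(data)
  puts "parsed #{rows.length} rows"
end
puts format('csv gem: %.3fs', elapsed)
```

Wrapping the Parslet parser's `parse` call in the same `Benchmark.realtime` block gives a directly comparable number.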


On Mon, Sep 5, 2011 at 05:04, Jonathan Rochkind <[email protected]> wrote:

> So... why not just use the other, faster csv parser? What leads you to want
> to rewrite a parser if you already have a good solution?
>
> I don't see anything obviously in need of improvement in your parslet
> parser, but I'm no expert.
> ________________________________
> From: [email protected] [[email protected]] on behalf of
> Stefan Rohlfing [[email protected]]
> Sent: Sunday, September 04, 2011 12:02 AM
> To: [email protected]
> Subject: [ruby.parslet] Problem with Slow Parser
>
> Hi,
>
> I wrote a simple benchmark comparing the execution times of my csv parser
> based on Parslet and Ruby's csv gem.
>
> While the csv gem returned the result almost instantly, my parser took over
> 33 seconds on a large test file
> <https://github.com/circle/fastercsv/blob/master/test/test_data.csv>.
>
> Here is all the code: https://gist.github.com/1192215
>
> As I plan to use my parser in a real application, but cannot do so if it is
> too slow, I took a look at the code but could not find anything that might
> explain the long execution time.
>
> Therefore I would be glad if someone could help me find this bottleneck.
>
> Thanks in advance!
>
> Stefan
>
>
>
