>>>>> Dan Sully <[EMAIL PROTECTED]> writes:

> It may be a puny amount of data, but we do a lot to make sure the data we
> gather is correct. Every time someone wants a new feature, it just adds that
> much more.

I guess I just don't believe any of this sanity checking/fixing up of
such a small amount of data should take any appreciable time at all on
a reasonably fast CPU.

So I got motivated enough to poke around a little and figure out where
the time is really going. For my 8253-track library of MP3s, I've
determined the following breakdown based on --d_import: the total scan
time of 341 sec is composed of:

  304 - MusicFolderScan Scan
   18 - mergeVariousAlbums
    9 - findArtwork scan
    4 - dbOptimize scan
    6 - other
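As a quick sanity check that no phase is unaccounted for, the per-phase
times do add up to the reported total (trivial arithmetic, shown here in
Python just for the record):

```python
# Verify the per-phase scan times sum to the reported 341 sec total.
phases = {
    "MusicFolderScan": 304,
    "mergeVariousAlbums": 18,
    "findArtwork": 9,
    "dbOptimize": 4,
    "other": 6,
}
total = sum(phases.values())
print(total)  # 341
```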

Next I went into Slim::Schema::newTrack and broke out the total time
spent in each of the major sections of that function:

   41.4576 - Slim::Formats->readTags
    3.5431 - $self->_preCheckAttributes
    1.2019 - walk list of attributes
   35.2437 - Slim::Schema->resultset($source)->create
  215.0250 - $self->_postCheckAttributes

so that accounts for all but about 7 seconds of the 304 seconds of
MusicFolderScan time.
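For what it's worth, the instrumentation is nothing fancy: just
accumulating wall-clock deltas around each section and dumping the
totals at the end. Here's a rough sketch of the idea in Python (the
server code is of course Perl, and the section names and sleeps below
are purely illustrative stand-ins, not the real newTrack internals):

```python
import time
from collections import defaultdict

# Cumulative wall-clock seconds, keyed by section name.
section_totals = defaultdict(float)

class timed:
    """Context manager that adds the elapsed wall-clock time of its
    block to a named accumulator, so per-section totals can be
    reported after the whole scan finishes."""
    def __init__(self, name):
        self.name = name
    def __enter__(self):
        self.start = time.perf_counter()
    def __exit__(self, *exc):
        section_totals[self.name] += time.perf_counter() - self.start

def new_track(path):
    # Hypothetical stand-ins for the major sections of newTrack;
    # the sleeps simulate work of different costs.
    with timed("readTags"):
        time.sleep(0.001)
    with timed("postCheckAttributes"):
        time.sleep(0.002)

for _ in range(10):
    new_track("x.mp3")

# Report, most expensive section first.
for name, secs in sorted(section_totals.items(), key=lambda kv: -kv[1]):
    print(f"{secs:9.4f} - {name}")
```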

I'm a little surprised the tag reading is so slow (remember, my shell
utility takes 20 seconds to dump all the tags); I expect it could be
2-3 times faster, but for now it's nowhere close to being the
bottleneck. The next step is to look deeper into _postCheckAttributes,
but I haven't done that yet.

Going in a somewhat orthogonal direction, I also instrumented the
database code, and have these stats to report:

  operation  number  total time (sec)
  =========  ======  ================
   SELECT    60633    50.3012
   INSERT    27852    24.5248
   UPDATE    10085    10.2031
   DELETE       48     0.0389

So the db operations account for about 85 seconds, which is not so bad.
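Double-checking that arithmetic, and noting in passing that SELECTs
dominate the db time:

```python
# Sum the measured per-operation db times and see how much of the
# total is spent in SELECTs.
db_time = {"SELECT": 50.3012, "INSERT": 24.5248,
           "UPDATE": 10.2031, "DELETE": 0.0389}
total = sum(db_time.values())
print(round(total, 1))                           # 85.1
print(round(100 * db_time["SELECT"] / total))    # 59  (% of db time)
```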

cheers,
Greg
_______________________________________________
beta mailing list
beta@lists.slimdevices.com
http://lists.slimdevices.com/lists/listinfo/beta
