#248: Database dump failure
-----------------------+------------------------------
  Reporter:  tbrooks   |      Owner:  simko
      Type:  defect    |     Status:  assigned
  Priority:  major     |  Milestone:
 Component:  MiscUtil  |    Version:
Resolution:            |   Keywords:  INSPIRE OpenAIRE
-----------------------+------------------------------
Changes (by simko):

 * status:  new => assigned


Comment:

 Replying to [comment:13 lmarian]:
 > I modified my.cnf (as suggested above) but I am not convinced this was
 > causing the problem... it might be AFS playing tricks on us :(

 I had increased `max_allowed_packet` on the INSPIRE boxes because of the
 big citation dictionaries, where ~1GB of blob data is stored in a single
 row.  This definitely removed the dump blocker at the time, although the
 dump error re-appeared occasionally afterwards.
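
 For reference, a minimal sketch of the kind of my.cnf change I mean; the
 1G value is illustrative rather than the exact setting used on the
 INSPIRE boxes:

     # my.cnf -- illustrative values only
     [mysqld]
     max_allowed_packet = 1G    # server side: max size of a single packet/row

     [mysqldump]
     max_allowed_packet = 1G    # client side, read by mysqldump itself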

 On CDS, the dumps were working fine in the past even with a smaller
 `max_allowed_packet` value, because there are no big blobs such as the
 citation dictionaries.  So it should not be necessary to tweak this
 particular parameter.  Still, some of the tables are huge, so we may need
 to tune other buffer parameters.
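
 If it comes to that, the knobs would more likely be on the dump side; a
 hedged example of what I have in mind, with placeholder database and
 table names:

     # --quick streams rows one at a time instead of buffering whole tables
     mysqldump --quick --max_allowed_packet=1G dbname tablename > tablename.sql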

 I'll try to look at this problem more closely now that it recurs more
 frequently.  FWIW, dumping to the local file system also caused trouble
 on INSPIRE when I tested it a year ago, although that may have been
 before the `max_allowed_packet` and buffer tweaks.  I'll have a look.

-- 
Ticket URL: <http://invenio-software.org/ticket/248#comment:14>
Invenio <http://invenio-software.org>
