Dale has not replied to this yet, but I don't think we have kept the interim files due to file space (big database).
Are there scripts needed to regenerate that table from what is in the db now? What is in that table anyhow?

Frances McNamara
University of Chicago

-----Original Message-----
From: [email protected] [mailto:[email protected]] On Behalf Of Mike Rylander
Sent: Friday, June 12, 2009 7:12 AM
To: Evergreen Development Discussion List
Subject: Re: [OPEN-ILS-DEV] No rows in metabib.rec_descriptor after loading bibs.

On Thu, Jun 11, 2009 at 7:23 PM, Dale Arntson <[email protected]> wrote:
> Hi All,
>
> After loading our bib records into evergreen, I got zero hits on searches in
> the evergreen client. I traced the problem back to the fact that there are
> no records in the metabib.rec_descriptor table. The other metabib tables
> seem fully populated. Here are the flags I used in parallel_pg_loader.
>
> perl parallel_pg_loader.pl -order bre -order mrd -order mfr -order mtfe
> -order mafe -order msfe -order mkfe -order msefe -autoprimary mrd
> -autoprimary mfr -autoprimary mtfe -autoprimary mafe -autoprimary msfe
> -autoprimary mkfe -autoprimary msefe
>
> Any ideas what I did wrong? Any ideas how to fix it? The bibs took a long
> time to load. I would rather not redo it, if I don't have to.

If you have the intermediate bib processing files, particularly the output of direct_ingest, make sure that file contains 'mrd' rows. For OpenSRF 0.9 (Evergreen 1.2.x), those will start with '/*--S mrd--*/', and in OpenSRF 1.0.x they will start with '[{"__c":"mrd","__p"'. If you can find that file and it contains 'mrd' lines, then we can simply regenerate that table's worth of data. Otherwise you'll need to reprocess the bib records in order to get that data, but you won't need to reload the entire dataset.

--
Mike Rylander
 | VP, Research and Design
 | Equinox Software, Inc. / The Evergreen Experts
 | phone:  1-877-OPEN-ILS (673-6457)
 | email:  [email protected]
 | web:  http://www.esilibrary.com
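Mike's check above can be scripted. A minimal sketch, assuming the direct_ingest output was saved to a file (the name sample.ingest is hypothetical; the demo lines below stand in for real serialized rows):

```shell
# Create a tiny stand-in file with one row in each serialization style,
# so the greps below have something to match against:
printf '%s\n%s\n' '/*--S mrd--*/ ...' '[{"__c":"mrd","__p":[...]}]' > sample.ingest

# Count 'mrd' rows, OpenSRF 0.9 (Evergreen 1.2.x) style.
# -F treats the pattern as a fixed string, -c prints the match count:
grep -c -F '/*--S mrd--*/' sample.ingest        # prints 1

# Count 'mrd' rows, OpenSRF 1.0.x style:
grep -c -F '[{"__c":"mrd","__p"' sample.ingest  # prints 1
```

If either count is nonzero for your real file, the mrd data exists and the table can be regenerated from it; a count of 0 for both patterns means the records must be reprocessed.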
