Re: [OPEN-ILS-GENERAL] How to have multiple CPU's used, importing a million bib records in EG 2.0?

2011-05-03 Thread Repke de Vries
Hi Dan

eight piped processes are running fine right now - as per your suggestion.

However, on tying the steps together with a Unix pipe:

 So there's no delay when you pipe the
 commands; the import into the database begins immediately.


the evidence is somewhat circumstantial, but our observation is that importing 
does *not* begin immediately, and pg_loader seems to be the culprit: it does 
*not* pass records on to psql but keeps piling them up until marc2bre has 
finished, and only then starts feeding the database through psql.

The evidence: although the pg_loader counter starts writing to the screen 
right away, it is only at the end of the MARC bib data chunk that (what we 
assume to be) the pg_loader message "SET .. BEGIN .. Writing file" appears. 
Judging by CPU activity on the database server, only after that does the stream 
of COPY statements start getting the data imported into the db.  //Times eight 
because of running in parallel. For the moment we don't fine-tune any of the 
flags you mention.//

Question: are we overlooking some extra parameter to pg_loader that would have 
it start passing data on to / importing into the database right away? 
Here is how we are calling it [1] - and thanks,

Repke, IISH

[1]
PERLLIB=/openils/lib/perl5 perl /usr/src/Evergreen-ILS-2.0.3/Open-ILS/src/extras/import/marc2bre.pl \
  --marctype XML --db_host xx --db_name xx --db_user xx --db_pw xx bibliographical.xml.1.xml \
| perl /usr/src/Evergreen-ILS-2.0.3/Open-ILS/src/extras/import/pg_loader.pl -or bre -a bre \
| psql -h xx -d xx -U xx
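
For completeness, the eight parallel pipelines are launched from one shell 
roughly along these lines (a sketch only; the chunk file names 
bibliographical.xml.1.xml through bibliographical.xml.8.xml are an assumption):

# Launch one marc2bre | pg_loader | psql pipeline per chunk, in the background,
# then wait for all eight to finish.
for i in 1 2 3 4 5 6 7 8; do
  PERLLIB=/openils/lib/perl5 perl /usr/src/Evergreen-ILS-2.0.3/Open-ILS/src/extras/import/marc2bre.pl \
    --marctype XML --db_host xx --db_name xx --db_user xx --db_pw xx bibliographical.xml.$i.xml \
  | perl /usr/src/Evergreen-ILS-2.0.3/Open-ILS/src/extras/import/pg_loader.pl -or bre -a bre \
  | psql -h xx -d xx -U xx &
done
wait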


On 3 May 2011, at 01:37, Dan Scott wrote:

 Hi Repke:
 
 On Mon, May 2, 2011 at 3:40 PM, Repke de Vries re...@xs4all.nl wrote:
 Hi,
 
 though we worked out our combination of calling marc2bre followed by 
 pg_loader followed by psql as successive, separate steps - we are dead in 
 the water 'cause PostgreSQL alone (last step) takes four hours for 100K 
 records: meaning 40 hours for all of our one million bib records.
 
 Right, you don't want to do each of those steps separately, that will
 elongate the process.
 
 snip
 
 Would connecting the steps with UNIX pipes and feeding it the big chunk of 
 one million records, do it?
 So: marc2bre [our calling parameters] [input = the one million bib records] 
 | pg_loader [our calling parameters] | psql
 
 We had that Unix pipes advice a couple of times but it seems 
 counter-intuitive: isn't the net result still one large file that goes into 
 PostgreSQL and therefore using one single instead of multiple CPUs?
 
 Well - rather than having to finish creating each big file at each
 step of the process, psql can import each record immediately as it
 works its way through the pipe. So there's no delay when you pipe the
 commands; the import into the database begins immediately.
 
 To make it parallel, if you have 4 CPUs available, open 4 terminal
 sessions and run marc2*re against distinct subsets of the records in
 each terminal session. 
snip



[OPEN-ILS-GENERAL] How to have multiple CPU's used, importing a million bib records in EG 2.0?

2011-05-02 Thread Repke de Vries
Hi,

though we worked out our combination of calling marc2bre followed by pg_loader 
followed by psql as successive, separate steps - we are dead in the water 
'cause PostgreSQL alone (last step) takes four hours for 100K records: meaning 
40 hours for all of our one million bib records.

Also, this approach seems to use only one CPU, while we have 4 or even 10 x 4: 
our CentOS setup for the moment sits on top of ten 4-CPU machines. Memory: 
plenty of GB.

** What can we do better? **  We need to get the 40+ hours down. 

The million bib records are available as one big chunk and as 8 smaller chunks. 

I studied parallel_pg_loader as an alternative, assuming that "parallel" means 
spewing out SQL files for psql in such a way that PostgreSQL starts parallel 
processes and thus uses all those extra CPUs automatically. 

But a) I can't find out how to call parallel_pg_loader so that PostgreSQL 
works on my 8 smaller chunks simultaneously, and b) it seems to be designed to 
solve a different problem: limited working memory (and we have plenty).

Would connecting the steps with UNIX pipes and feeding it the big chunk of one 
million records do it?
So: marc2bre [our calling parameters] [input = the one million bib records] | 
pg_loader [our calling parameters] | psql

We have had that Unix pipes advice a couple of times, but it seems 
counter-intuitive: isn't the net result still one large file that goes into 
PostgreSQL, and therefore one single CPU used instead of multiple CPUs?

Those experienced: please assist!  Colleagues of mine have been shopping around 
with this practical problem at last week's EG conference (yeah!) but there was 
so much else too. An excellent conference, I heard. 

Thanks, IISH - Amsterdam, Repke 

Re: [OPEN-ILS-GENERAL] EG 2.0: problem Marc2bre - Global Flags Maintain 001-003-035 according MARC and TCN=Internal ID

2011-03-05 Thread Repke de Vries
Exactly right Dan !

And there is a lesson or two here: 

- you are explaining what some of the calling parameters of the importing 
scripts actually do; this is much needed: even reading these scripts' code, it 
is not always clear or evident, and the scripts don't have a --help parameter 
either (I know: building in help information means even more work);
without adequate information, people like me who are busy migrating, or 
newcomers evaluating Evergreen, blindly copy existing examples of calling 
parameter values - and easily get incomprehensible results or draw wrong 
conclusions about Evergreen's strengths and weaknesses;
I guess this type of "what does this script actually do" documenting, 
interviewing the developers if need be, is already going on in the Evergreen 
DIG context - and the results are showing for both EG 1.6 and 2.0: 
http://docs.evergreen-ils.org/

- however, your explanations also reveal something else: a kind of scenario of 
use in which decisions upstream, calling the importing scripts for bibliographic 
and holdings data in certain ways (like the --idfield), influence everything 
that happens downstream big time - even to the extent that Global Flag settings 
are seemingly ignored; 
another example is holdings data, where the 852 $b and $c would be ignored 
unless, downstream, the Evergreen Organisational Structure definition and the 
Copy Location definitions are set correctly;

and these scenarios of use (which are close to what you call behaviour that is 
expected from Evergreen, and to your question "is this what you are looking 
for?") we presently do *not* have in our EG DIG documentation: we document 
*per* script, *per* list of settings, *per* whatever, but don't explain how it 
all fits together - how what we want to happen downstream needs very particular 
measures to be taken upstream. 

The Staff Client (the GUI, as you call it) has such scenarios of use, such 
assumptions, built in - like when importing bib records and the way this works 
together with Global Flags and other settings; this Staff Client behaviour is 
not explicitly documented, however (yet).

So: yes - this is what IISH is looking for, given its scenario of use, but I 
also realise that there are other such scenarios for migrating with particular 
bibliographic and holdings data, and that the official EG documentation would 
be helped very much by explicit scenarios: if you want this, and these are your 
data, take these measures and call your scripts in such and such ways. 
We will contribute IISH's. 

Is this "scenario of use" approach appealing to others who are migrating or 
evaluating as well? 

Thanks ! Repke 

On 5 March 2011, at 04:57, Dan Scott wrote:

 On Fri, Mar 04, 2011 at 10:00:02PM +0100, Repke de Vries wrote:
 Apologies Dan for the long gap discussing this - and thanks for your fix 
 (part of the 2.0.2 release) that makes the Global Flag "Cat: Use internal ID 
 for TCN value", when set to TRUE, have the same effect for command line 
 importing of bib records as it already had for importing through the GUI. 
 
 However: the other Global Flag, "Maintain 001-003-035 according MARC", when 
 set to TRUE, also behaves differently when importing bib records from the 
 command line than through the GUI:
 
 - through the GUI the Control Number in the imported bib record's 001 field 
 gets pushed to the 035 as expected, and the 001 receives the Evergreen 
 Internal Record ID etc.; and because of the Global Flag "Cat: Use Internal 
 ID for TCN Value" the TCN follows this Internal Record ID 
 - importing from the command line, the Global Flag setting is ignored, 
 however, and the Control Number in the imported bib record's 001 field 
 becomes the new Evergreen Record ID; the TCN also ignores its Global Flag 
 setting - this latter behaviour has been fixed now (as I understand it)
 
 I think there's a misunderstanding here. The --idfield parameter says
 "set the internal record ID to the value of (whatever field --idfield
 points to)" - which describes the behaviour you're seeing, I think.
 
 If you want to disregard the existing 001 value in the records that
 you're importing, and just use whatever Evergreen gives you, you can
 import your records as follows:
 
 marc2bre.pl input.mrc | pg_loader.pl -or bre -a bre
 
 If you're importing from MARC21XML, you'll need to add the --marctype
 XML flag to marc2bre.pl, but you have that part sorted out already.
 Just drop the --idfield (and --idsubfield) flags entirely if you don't
 care about them. The part to focus on is the -a bre flag from
 pg_loader.pl; it says "let the database assign a value for this object".
 So pg_loader.pl doesn't assign any value to the id column of
 biblio.record_entry, in this case; the column isn't specified in the
 COPY command and the database automatically assigns the next value in
 its sequence for the biblio.record_entry.id column. And that, in turn,
 will ensure that the 001 gets populated with that record ID and the old
 001 gets

Re: [OPEN-ILS-GENERAL] EG 2.0: problem Marc2bre - Global Flags Maintain 001-003-035 according MARC and TCN=Internal ID

2011-03-04 Thread Repke de Vries
Apologies Dan for the long gap discussing this - and thanks for your fix (part 
of the 2.0.2 release) that makes the Global Flag "Cat: Use internal ID for TCN 
value", when set to TRUE, have the same effect for command line importing of 
bib records as it already had for importing through the GUI. 

However: the other Global Flag, "Maintain 001-003-035 according MARC", when set 
to TRUE, also behaves differently when importing bib records from the command 
line than through the GUI:

- through the GUI the Control Number in the imported bib record's 001 field 
gets pushed to the 035 as expected, and the 001 receives the Evergreen Internal 
Record ID etc.; and because of the Global Flag "Cat: Use Internal ID for TCN 
Value" the TCN follows this Internal Record ID 
- importing from the command line, the Global Flag setting is ignored, however, 
and the Control Number in the imported bib record's 001 field becomes the new 
Evergreen Record ID; the TCN also ignores its Global Flag setting - this 
latter behaviour has been fixed now (as I understand it)

Question though: does your fix also remedy the other expected behaviour, 
"Maintain 001-003-035 according MARC", when importing bib records from the 
command line?

And if not: what can we do about it?

A further comment on your  
 
 This is extremely useful  [setting the record ID to whatever number is in 
 001] if you have separate records for holdings, etc, that point to an 
 existing record ID that you need to sync them up with
 during your migration.

True, with the Global Flag "Maintain 001-003-035 according MARC" set to TRUE, 
life gets a bit more complicated linking MARC Holding and Bib Records, but it 
is still doable: IISH (Chris Roosendaal) discussed this over IRC a while back 
with this outcome, while using a staging table approach [quote from the IISH 
documentation]: 
..In our Evergreen 2.0 database this JOIN did not work, because the bibkeys 
were replaced by an internal auto-increment key when the bibliographic records 
were inserted. Fortunately there is a mapping between the original bibkeys and 
the new keys in the metabib.real_full_rec table, and therefore an extra JOIN is 
needed.
.. after which bib and holding record can be successfully linked. 
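
For illustration, such an extra JOIN could look roughly like this (a sketch 
only: the staging table staging_holdings, its column orig_bibkey, and the tag 
the original bibkey ends up in are assumptions, not our actual schema):

psql -h xx -d xx -U xx <<'SQL'
-- Map each staged holding's original bibkey to the new biblio.record_entry id,
-- via the indexed field values in metabib.real_full_rec.
SELECT h.*, frr.record AS new_bib_id
  FROM staging_holdings h
  JOIN metabib.real_full_rec frr
    ON frr.value = h.orig_bibkey
   AND frr.tag IN ('001', '035');  -- wherever the original bibkey ended up
SQL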
 
And one odd observation: doing a command line import for authority records, 
the command line behaviour *is* the same as through the GUI.

Repke de Vries, IISH


On 17 February 2011, at 00:36, Dan Scott wrote:

 On Fri, Feb 11, 2011 at 04:36:04PM +0100, Repke de Vries wrote:
 
 Command-line importing with marc2bre + pg_loader [1] in EG RC2
 Virtual Image, the import went fine but the Global Flags settings
 "Maintain 001-003-035 according MARC" = TRUE and "Cat: Use Internal
 ID for TCN Value" = TRUE are being ignored.
 
 In other words:  I am getting a Record ID and TCN that are
 derivatives of the control number in the imported 001 field: wrong.
 
 Well, the results for the record ID are what I would expect and want. By
 saying "give me the value of 001 for my record IDs" in marc2bre.pl with
 --idfield 001, that sets the record ID to whatever number is in 001.
 This is extremely useful if you have separate records for holdings, etc,
 that point to an existing record ID that you need to sync them up with
 during your migration.
 
 The TCN though... I took a look at where cat.bib.use_id_for_tcn actually
 gets used, and it's only in the Perl O:A:Cat::BibCommon module. Which
 doesn't get called by any of the in-database ingest code that is used in
 2.0, it would only be called when you're working with the GUI.
 
 A simple/simplistic way to fix this in the short term would be to run
 (after importing all of your bib records):
 
 UPDATE biblio.record_entry SET tcn_value = id;
 
 I would consider this a bug, though, as the expectation when you set
 "Cat: Use internal ID for TCN value" to TRUE is for that to be the case
 no matter how you get the records into the system. In which case, a
 minor tweak to the maintain_901 or maintain_control_number triggers
 should suffice (maintain_901 because it occurs before
 maintain_control_number, although perhaps that order should be
 reversed anyway to ensure that maintain_901 inserts the correct values
 in the correct spots...)



[OPEN-ILS-GENERAL] EG 2.0: problem Marc2bre - Global Flags Maintain 001-003-035 according MARC and TCN=Internal ID

2011-02-11 Thread Repke de Vries


Command-line importing with marc2bre + pg_loader [1] in the EG RC2  
Virtual Image: the import went fine, but the Global Flags settings  
"Maintain 001-003-035 according MARC" = TRUE and "Cat: Use Internal  
ID for TCN Value" = TRUE are being ignored.


In other words:  I am getting a Record ID and TCN that are  
derivatives of the control number in the imported 001 field: wrong.


Can this be a result of calling marc2bre with --idfield=001 rather  
than --tcnfield=001 ?


As per my Global Flag settings, I don't want TCNs to follow from my  
imported record - but maybe this is a way to kick the Global Flag  
settings into action when doing the command-line import?
Or should one of the pg_loader calling parameters be different to get  
the desired Global Flag settings behaviour?


The odd thing is: using marc2are for importing authority records  
[2], everything works as expected, with the Global Flag settings being  
followed.


Help appreciated !

Repke de Vries, IISH

[1] according to 
http://svn.open-ils.org/trac/ILS/browser/trunk/Open-ILS/tests/datasets/README

and as follows:
PERL5LIB=/openils/lib/perl5/ perl /home/opensrf/Evergreen-2.0/Open-ILS/src/extras/import/marc2bre.pl \
  --idfield 001 --idsubfield b --marctype XML \
  --db_host localhost --db_name evergreen --db_user evergreen --db_pw evergreen bibliographical.xml \
| perl /home/opensrf/Evergreen-2.0/Open-ILS/src/extras/import/pg_loader.pl \
  -or bre -or mrd -or mfr -or mtfe -or mafe -or msfe -or mkfe -or msefe \
  -a mrd -a mfr -a mtfe -a mafe -a msfe -a mkfe -a msefe \
| psql -d evergreen -h localhost -U evergreen -h localhost


The --idsubfield b is a mistake 'cause the records are straight  
MARC with the ID in 001; marc2bre gracefully ignored this mistake


[2]
PERL5LIB=/openils/lib/perl5/ perl /home/opensrf/Evergreen-2.0/Open-ILS/src/extras/import/marc2are.pl \
  --marctype XML --startid 200 bibliographical_authorities.xml > bibauth.are

followed by the pg_loader and psql steps.
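
(For the record, those follow-up steps are presumably something along these 
lines - a sketch using the same paths and credentials as [1], not the exact 
command we ran:

perl /home/opensrf/Evergreen-2.0/Open-ILS/src/extras/import/pg_loader.pl -or are < bibauth.are \
| psql -d evergreen -h localhost -U evergreen
)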




[OPEN-ILS-GENERAL] Serials in 2.0: anyone working on import script for existing MFHD records

2011-01-05 Thread Repke de Vries

Hi all

with Serials in 2.0  I mean the new database setup introduced by  
Dan Wells + two sets of functionalities on top of that: Serial  
Control View and Alternate Serial Control (by two different  
programming teams).


And my question goes back to an early november 2010 Evergreen  2.0  
alpha4 discussion [1]  where we had to conclude that an importing  
script for MARC MFHD records to populate all the Serials 2.0  
database tables, did *not* yet exist - though several libraries (ours  
among them) expressed the need for it [2].


I wonder where we are now: anyone working on such a MFHD MARC records  
importing script or approach?


Our library might be able to contribute but one would expect that at  
least some script exists by now that could be further refined.


How else could we seriously introduce these new Evergreen 2.0 Serials  
features if we can't get the data in (not to mention: export them  
again) :-)


Yours, Repke de Vries, IISH

[1]  http://markmail.org/message/75tlejvni3mit5ke
[2]  I do *not* mean the MFHD options  + importing scripts introduced  
by Dan Scott in EG 1.6


Re: [OPEN-ILS-GENERAL] How can we delete Org Units with SQL for EG 2.0 beta3

2010-12-02 Thread Repke de Vries
Thanks Jason: please see my comments - there is a good chance we have a  
catch-22, and deleting Org Units is *not* possible with the Staff Client  
alone, but neither with a combination of SQL and the Staff Client: preparing  
for an eventual CONS Workstation Registration, you have to apply "Can  
Have Users" to CONS, which probably means referencing  
actor.org_unit ... argh


See below - should we move the discussion to the DEV-LIST?

Repke

On 2 December 2010, at 19:21, Jason Stephenson wrote:


Repke,

What I sent you is meant to be run from the command line like so:

psql -f file.sql

where the contents of file.sql are the commands that I sent.
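
(Spelled out - assuming the connection details match the psql invocation shown 
further down in this message - that would be:

psql -U evergreen -h localhost -d evergreen -f file.sql

with file.sql containing the statements quoted at the bottom of this thread:

BEGIN;
DELETE FROM actor.org_unit WHERE id > 1;
DELETE FROM actor.org_address WHERE id > 1;
COMMIT;
)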

Sorry: should have guessed that from the BEGIN and COMMIT


Looks like you're having a problem with triggers. Have you put  
anything in serial.record_entry?


I might have: I not only had to do a "Can Have Users" on the CONS  
Org Unit Type but also did a "Can Have Volumes and Copies"; I now  
unchecked that + did another Autogen + Stop and Start, but maybe this  
isn't thorough enough?


The commands won't work if you've done anything in the database  
that references actor.org_unit. The code
Well: strictly speaking I didn't, 'cause I only touched CONS and only try  
to delete things other than / below the Consortium (your "id > 1") - but  
maybe there is an effect?
that I sent appears originally at the top of a longer script that I  
use to set up the org_units for our consortium when I do a  
migration*. If you've gone in and messed with anything before  
running those commands, I guarantee nothing.
CATCH-22:  yeah: if you try a combination of Staff Client and SQL  
('cause the Staff Client can't handle the delete) you *have* to do  
some messing to guarantee later access at Consortium level .. which  
probably makes the SQL operation impossible;
which would mean: in EG 2.0 deleting anything in the Org Unit  
Structure and setting up a new Structure can only be done through a  
series of SQL statements ?!  Which means SQL knowledge 'cause there  
is no 2.0 How To on that


I suggest reloading the database schema from scratch, and then  
running the commands. They should work. They work for me in rel_2_0  
and trunk.


Yeah: you got a point, but that means I destroy my "Can Have Users"  
for the Consortium level - which means no more Staff Client access  
ever, and therefore no setting up a new structure through the Staff  
Client ..


*We haven't migrated, yet, but I've done many test runs along the  
way as I figure out how to migrate more and more data.


We are re-planning our migration (early next year) and this Org  
Structure issue is one of those things :-) going through the same  
tests as you do

Jason





Quoting Repke de Vries re...@xs4all.nl:


Jason, I managed Workstation Registration at the Consortium Level

but following your advice literally with psql in a straight  
out-of-the-box EG 2.0 beta3 Virtual Image (different approach to  
maintaining Org Structure than 1.6 and before), I have zero success.


Maybe take this offline?  My time zone btw is six hours later or  
more than yours.


This is what psql [1] throws back at me:
..
evergreen=# BEGIN;
BEGIN
evergreen=# DELETE FROM actor.org_unit WHERE id > 1;
NOTICE:  Use of uninitialized value $self in pattern match (m//)  
at /usr/local/share/perl5/MARC/File/XML.pm line 432.


CONTEXT:  SQL statement "UPDATE ONLY serial.record_entry SET  
owning_lib = NULL WHERE $1 OPERATOR(pg_catalog.=) owning_lib"
ERROR:  error from Perl function "maintain_control_numbers": No  
_parse_* routine defined on this driver (If it is a filter, remember  
to set the Parent property. If you call the parse() method, make sure  
to set a Source. You may want to call parse_uri, parse_string or  
parse_file instead.)  
[XML::LibXML::SAX=HASH(0x9bed980)] at /usr/share/perl5/XML/SAX/Base.pm line 2616.
CONTEXT:  SQL statement "UPDATE ONLY serial.record_entry SET  
owning_lib = NULL WHERE $1 OPERATOR(pg_catalog.=) owning_lib"

evergreen=# DELETE FROM actor.org_address WHERE id > 1;
ERROR:  current transaction is aborted, commands ignored until end  
of  transaction block

evergreen=# COMMIT;
ROLLBACK
evergreen=#
..

Repke de Vries, IISH

[1]
..
[open...@localhost run]$ psql -U evergreen -h localhost evergreen
Password for user evergreen:
psql (8.4.5)
Type "help" for help.

evergreen=#
..


On 1 December 2010, at 19:42, Jason Stephenson wrote:



Repke,

Try this:

BEGIN;
DELETE FROM actor.org_unit WHERE id > 1;
DELETE FROM actor.org_address WHERE id > 1;
COMMIT;

The above gets rid of everything but the sample consortium.

HtH,
Jason


Quoting Repke de Vries re...@xs4all.nl:


Hi all

this is a repost of a previous thread [1] now focussing on the
following:


deleting Organisational Units is different in 2.0 from 1.6 and  
cannot be achieved through the Staff Client.


The only information on how to do it directly in the database with  
SQL comes from the 1.6 documentation [2] and does indeed not work  
for 2.0

[OPEN-ILS-GENERAL] How can we delete Org Units with SQL for EG 2.0 beta3

2010-12-01 Thread Repke de Vries

Hi all

this is a repost of a previous thread [1] now focussing on the  
following:


deleting Organisational Units is different in 2.0 from 1.6 and cannot  
be achieved through the Staff Client.


The only information on how to do it directly in the database with SQL  
comes from the 1.6 documentation [2] and does indeed not work for 2.0:  
this is what I get back in psql:

..
evergreen=# delete from actor.org_unit where shortname = 'BR4' ;
ERROR:  update or delete on table "org_unit" violates foreign key  
constraint "org_address_org_unit_fkey" on table "org_address"

DETAIL:  Key (id)=(7) is still referenced from table "org_address".
evergreen=#
..

Can someone tell me what I should do (this will also help 2.0 DIG  
documenting)?


Either exclusively SQL or in some combination with the Staff Client  
Administrator functionality?


Thanks, Repke, IISH

[1]
http://georgialibraries.markmail.org/search/?q=Unexpected+results+changing+Organisational+Units+through+Staff+Client+in+EG+2.0+alpha4#query:Unexpected%20results%20changing%20Organisational%20Units%20through%20Staff%20Client%20in%20EG%202.0%20alpha4+page:1+mid:t3qerg3qbws34hat+state:results

[2]
http://www.open-ils.org/dokuwiki/doku.php?id=evergreen-admin:policies:ou



Re: [OPEN-ILS-GENERAL] Serials in alpha4

2010-11-07 Thread Repke de Vries
Exactly the scenario at IISH Tim, at least for part of our serials  
collection.


And Dan Scott is all too right for both of us:
..

looking for a way to populate the new
serials tables with legacy MFHD data. And I'm not sure there  
currently
is a way to get there from here, where here =  
serial.record_entry or

your previous ILS's export of MFHD records.

..

And I am not a (PostgreSQL) database expert, but it seems attractive  
to create some sort of in-database solution that parses the rows of  
MFHD Holding Record information in serial.record_entry and uses that  
to populate those new serials tables, and that way build on the  
already existing marc2bre + marc2sre tools rather than create an  
external tool from scratch - probably based on a staging table  
approach?


I would also like to point out and stress the following:

an EXPORTING command line tool for these new serials tables is also  
missing: it would rebuild a full MFHD record from those tables.

That way you can:
- be sure that the new Serials Management functionalities do the  
right thing :-) with your MFHD (legacy) data
- avoid data lock-in: if at some point you wish to move from the  
Evergreen ILS to "Futuregreen", you can be sure of MARC-compliant  
MFHD to do so


Repke

On 6 November 2010, at 23:04, Tim Spindler wrote:


Specifically we are looking at exporting Marc with 853, 863 , etc and
importing into Evergreen. If I understand serials module correctly, it
builds prediction schedule from mtgs publication pattern.

Sent from my iPhone

On Nov 6, 2010, at 2:02 AM, Dan Scott d...@coffeecode.net wrote:


On Fri, Nov 05, 2010 at 03:36:06PM -0400, Mike Rylander wrote:
On Fri, Nov 5, 2010 at 2:15 PM, Tim Spindler  
tjspind...@gmail.com wrote:

Regarding the command line and the MFHD, will/does Vandelay support
importing the MFHD fields? We are hoping this will be our  
path to migrating

serials data when we go live.



There is, indeed, such a command line tool.  The indomitable Dan  
Scott

added Open-ILS/src/extras/import/marc2sre.pl a while back for the
purpose of priming MFHD records.  It requires that you have the
internal ID of your Evergreen bib records embedded in the MFHD  
records

that link to said.  Beyond that restriction, you simply point it at
your MFHD MARC and it will generate a file full of 'sre' records
suitable for loading with pg_loader, just like 'bre' and 'are'
records.


Aw, that's nice of you Mike, but I think the question was more  
about the

new serials support (check-in UIs and circulating serials) and less
about the non-circulating serials MFHD-based support. If I read  
Tim and
Repke's questions right, they're looking for a way to populate the  
new
serials tables with legacy MFHD data. And I'm not sure there  
currently
is a way to get there from here, where here =  
serial.record_entry or

your previous ILS's export of MFHD records.






Re: [OPEN-ILS-GENERAL] 852 broken in 1.6 Staff Client Batch Import too ? Stuck with 2.0 alpha4 Staff Client importing 852 Item Information

2010-11-07 Thread Repke de Vries

Any help appreciated!

Did additional testing in 1.6.0.1, and what used to work for us in  
that version a number of months back doesn't anymore.


Here is an example [1]: BroGroepCanadaBR1Combi.mrc in  
http://www.xs4all.nl/~repke/Evergreen/

adapted for out-of-the-box Evergreen

And these are the steps in the Staff Client:
1) Cataloging - MARC Batch Import/Export
2) create an upload queue; mark 'auto import non-colliding records',  
mark 'import attached holdings'; choose file to upload

3) click Upload: the record gets imported
4) search the catalog for the record

Copy information should now be shown for BR1, but it isn't anymore  
in 1.6.0.1, and neither is it in 2.0 alpha4. The latter could be  
caused by additional MARC Batch Importer / Exporter settings that we  
maybe don't understand yet (see below).


Can anyone reproduce the situation or in any way help with the right  
additional settings in 2.0 for MARC Batch Importer / Exporter?


Especially for quick tests, the Importer is very convenient.

Thanks a lot, Repke, IISH

[1]
=LDR  00569npm  2200181   45 0
=001  IISGb11016444
=003  IISG
=005  20081119170854.0
=008  070221s\
=040  \\$aNe$dAmISG
=041  0\$akh
=044  \\$aeng
=245  10$aDemocratic Kampuchea / Kampuchea Democratique : collection  
of brochures$f1975-1978

=300  \\$a0.1 m
=500  \\$aGroepsbeschrijving
=520  8\$aA collection of 33 brochures and 4 illustrated magazines  
from Democratic Kampuchea (1975-1978)

=605  \0$aSEA$eCambodia
=852  00$aIISG$bBR1$bBR1$hBro 2541 fol$pN10745146


On 5 November 2010, at 17:42, Repke de Vries wrote:


Hi all

trying to MARC Batch Import / Export import a single record +  
attached holding [3] that used to import fine in EG 1.6


and stuck: the 852 item information keeps being ignored:
- because we changed the Org Units Structure [2] I went back to a  
fresh alpha4 Virtual Image and tried the usual 852 $bBR1 etc.: no  
result
- given the changed  Org Units Structure I tried an import with  
$bBR2 'cause that branch still hangs around: no result
- blush: a few hours ago I learned from this list (Dan Scott's answers)  
that in 2.0 we now have an "Edit Import Item Attributes" screen: a very  
welcome tool; I noticed the attribute "Keep" being "False" and made  
it "True" (see [1] for the Item Attributes), started a new staff  
client session: still no result


I now have the feeling I miss some finer point in 2.0 that wasn't  
there in the more blunt 1.6:  any pointers ?
Maybe changing Item Attributes in the Staff Client needs an  
additional  command line Autogen or xx to complete?


Stuck !!

Thanks, Repke

[1] the attributes in the Edit Import Item Attributes screen:
Part 1: http://screencast.com/t/vrkLGsiCZ
Part 2: http://screencast.com/t/S7lFazlwx6e

[2] our Org Units Structure:  http://screencast.com/t/lByuAYLlbAIK
and the (lowest) IISG level can have Items and Copies
[3] we always import  in .mrc  but for mortals the record looks  
like this and with BR1 instead of IISG imported fine in EG 1.4 and  
1.6:


=LDR  00734nav  2200217 a 45 0
=001  IISGb10923000
=003  IISG
=005  20040426114843.0
=008  040426s\
=040  \\$aNe$dAmISG
=245  10$kBeelddocument = Visual document$f1985-1992
=603  \0$aAffiche$bPoster
=604  \0$aXX-4
=605  \0$aCAN$eCanada
=710  0\$aPublic Service Alliance of Candada
=852  00$bIISG$bIISG$cNEHA$hBG D32/958$p30051001607255
=852  00$aIISG$bIISG$bIISG$hBG D32/959$p30051001607644
=852  00$aIISG$bIISG$bIISG$hBG D32/960$p30051001607594
=852  00$aIISG$bIISG$bIISG$hBG D32/961$p30051001607545
=852  00$aIISG$bIISG$bIISG$hBG D32/962$p30051001607495
=852  00$aIISG$bIISG$bIISG$hBG D32/963$p30051001607446




[OPEN-ILS-GENERAL] Stuck with 2.0 alpha4 Staff Client importing 852 Item Information

2010-11-05 Thread Repke de Vries

Hi all

trying to MARC Batch Import / Export import a single record +  
attached holding [3] that used to import fine in EG 1.6


and stuck: the 852 item information keeps being ignored:
- because we changed the Org Units Structure [2] I went back to a  
fresh alpha4 Virtual Image and tried the usual 852 $bBR1 etc.: no result
- given the changed  Org Units Structure I tried an import with $bBR2  
'cause that branch still hangs around: no result
- blush: a few hours ago I learned from this list (Dan Scott's answers)  
that in 2.0 we now have an "Edit Import Item Attributes" screen: a very welcome  
tool; I noticed the attribute "Keep" being "False" and made it "True"  
(see [1] for the Item Attributes), started a new staff client session:  
still no result


I now have the feeling I miss some finer point in 2.0 that wasn't  
there in the more blunt 1.6:  any pointers ?
Maybe changing Item Attributes in the Staff Client needs an  
additional  command line Autogen or xx to complete?


Stuck !!

Thanks, Repke

[1] the attributes in the Edit Import Item Attributes screen:
Part 1: http://screencast.com/t/vrkLGsiCZ
Part 2: http://screencast.com/t/S7lFazlwx6e

[2] our Org Units Structure:  http://screencast.com/t/lByuAYLlbAIK
and the (lowest) IISG level can have Items and Copies
[3] we always import  in .mrc  but for mortals the record looks  
like this and with BR1 instead of IISG imported fine in EG 1.4 and 1.6:


=LDR  00734nav  2200217 a 45 0
=001  IISGb10923000
=003  IISG
=005  20040426114843.0
=008  040426s\
=040  \\$aNe$dAmISG
=245  10$kBeelddocument = Visual document$f1985-1992
=603  \0$aAffiche$bPoster
=604  \0$aXX-4
=605  \0$aCAN$eCanada
=710  0\$aPublic Service Alliance of Candada
=852  00$bIISG$bIISG$cNEHA$hBG D32/958$p30051001607255
=852  00$aIISG$bIISG$bIISG$hBG D32/959$p30051001607644
=852  00$aIISG$bIISG$bIISG$hBG D32/960$p30051001607594
=852  00$aIISG$bIISG$bIISG$hBG D32/961$p30051001607545
=852  00$aIISG$bIISG$bIISG$hBG D32/962$p30051001607495
=852  00$aIISG$bIISG$bIISG$hBG D32/963$p30051001607446


Re: [OPEN-ILS-GENERAL] Serials in alpha4

2010-11-05 Thread Repke de Vries

Dear Kathy

here at IISH (http://www.iisg.nl) a team of testers has been sinking  
its teeth into alpha4 Serials Management as well - even without the  
help of the release notes you mention: thanks for the pointer !!


Like you, we hit stumbling blocks, and we will contact you off-list this  
coming Monday [1] to see if we can work together on getting a basic grip on  
the functionality: right now it is complete trial and error here in  
Amsterdam.  This "documentation would help testing would help  
feedback to the [Serials, Acquisitions] developers" issue was also  
raised at the EG IRC Developers meeting last Tuesday.


Right now I wish to bring up a related issue and would like to ask  
Dan Wells or Lebbeous (sorry: don't know your first name) the following:


is there a command line MFHD importing script yet that will populate  
all the Serials Management related database tables ?


That would fast track functionality testing and down the road I guess  
your library Dan (Wells) would need to import legacy MFHD data just  
as much as IISH needs to do so for that part of our collection where  
the raw MFHD record editing [2] approach is not enough.


Thanks, Repke Eduard de Vries, IISH

[1] also impressed with your work here Kathy:  
http://masslnc.cwmars.org/node/2039  (to the extent that we have access)
[2] we are well aware of Dan Scott's work here:  
http://svn.open-ils.org/trac/ILS/browser/trunk/Open-ILS/tests/datasets/README ;

works fine but not for all of our serials collection

On 5 November 2010, at 16:19, Kathy Lussier wrote:


Hi all,

I've been digging into serials on a 2.0 alpha4 installation by  
following the notes at  
http://open-ils.org/dokuwiki/doku.php?id=acq:serials:release_notes:initial_trunk_merge
but I've run into various stumbling blocks


[OPEN-ILS-GENERAL] Unexpected results changing Organisational Units through Staff Client in EG 2.0 alpha4

2010-10-22 Thread Repke de Vries
Your help appreciated: setting up our library structure by changing  
Organisational Unit Types and Organisational Units through Server  
Administration in the Staff Client, gives unexpected and incomplete  
results.


We are talking the EG 2.0 alpha4 Virtual Image and we are prototyping  
the organisational structure in anticipation of 2.0 beta and going  
live later this year.


After some changes to text labels in Organisational Unit Types,  we  
made changes to the hierarchy of Organisational Units + the Units  
themselves.
Here is a screenshot where for example you see the out-of-the-box BR1  
changed: http://screencast.com/t/fWRTdaCnxy


First of all: actually *deleting* anything not needed proved  
impossible: sometimes the Org Units admin interface turns red and  
nothing happens; sometimes deleting did seem to happen only to have  
the out-of-the-box structure return later; this is irrespective of  
having any holdings or users or workstations attached to, say: a branch.


And testing the consequences of such changes :
- all the OPAC and staff client screens persistently keep the *old*  
text labels, like for example "Example Branch 1" - also after stopping  
and restarting Evergreen + Apache; and of course we started new staff  
client sessions; here is a screenshot:  
http://screencast.com/t/NR9Od8oMh96
- Holding Maintenance screens show a mixed result:  changes to Unit  
Types come through, changes to the Units themselves don't come  
through:   http://screencast.com/t/XqdW9S9jr
- Exporting records picks up the new Organisation Unit Policy Code  
IISG  (instead of BR1) and uses it on the 852

- Importing records does *not* recognise this new code IISG in 852 $b
- the new MARC 001/035 + TCN =  MARC 001 feature (see Server Admin - 
Global Flags) however *does* pick up the new Policy Code defined in  
the Consortium Org Unit and uses it for issuing authority
- there seems to be a time delay ?  After a few days now for example  
the Holding Maintenance screen + pick lists *do* seem to have picked  
up more of the changes: is that in any way likely?


What are we missing ? Which other steps than Server Admin in the  
Staff Client are necessary to have changes in the Organisation Units  
fully take effect?


Thanks, Repke de Vries, IISH








[OPEN-ILS-GENERAL] Com IRC meeting's agenda item 5 and IISH planning for going live with 2.0 beta Re: Community IRC Meeting (was: Reminder: Developer IRC Meeting, Tuesday October 19)

2010-10-19 Thread Repke de Vries

Dear all

the Community Meeting's starting time is not too bad for Europeans (8  
p.m. Amsterdam time [IISH's base], right?) but unfortunately I am  
away for the evening.


The following is to inform on agenda item 5:
 .. 2.0 alpha4 to 2.0 gold - works in progress, discuss of what will  
make it in for general release, hash out schedule ..


At IISH we are steadily working towards going live with 2.0 beta late  
November or early December - this schedule being based on the  
Evergreen developers' planning so far, i.e. having the beta out in  
the course of November, and on discussions with Dan Scott concluding  
that it would be very helpful to have a library go live with the 2.0  
beta after KCLS has gone into production with one of the 2.0 alphas.


My question to the meeting: are you still aiming at November for the  
beta ?


On our side it is still too early to really tell whether we can make it  
by, say, late November, or whether mid-December is more realistic: we are  
in the middle of rounding off feedback on all the new authorities  
functionality commissioned by IISH [1], have work going on (in-house)  
on some of the staff client functionality, and only start testing data  
import at the beginning of November.  The alpha4 Virtual Image is our  
yardstick right now and all preparations and tests are based on it.
But we are trying hard to keep our planning for end of November,  
beginning of December (DIG list colleagues [and Robert Soulierre]: not  
forgetting about DIG contributions, be it 2.0 based):

thanks, the IISH team, Repke de Vries

[1] http://coffeecode.net/archives/229-Authorities-in-Evergreen-an-Amsterdam-trip-report.html  
is basically done [thanks Dan !!]



On 18 October 2010, at 23:03, Dan Scott wrote:


Just a heads-up that many of us on IRC on a day-in, day-out basis
generally think this should really be called the Community IRC
Meeting due to the focus on communication about the project,
including an agenda with items calling for reports from:

  * Documentation Interest Group
  * Reports Taskforce
  * Web Site Committee (and Communications Committee?)
  * Governance Committee
  * Developers, with a particular focus on release status

One of the goals of these meetings is to communicate the state of the
project across committee/group boundaries and to find problems that
need to be solved / resources to help solve identified problems.

Dan

On 15 October 2010 10:33, Ben Shum bs...@biblio.org wrote:
 Time for another developer meeting!  The next Evergreen  
development IRC

meeting will be held at:

 * 11:00:00 a.m. Tuesday October 19, 2010 in America/Los_Angeles

 * 02:00:00 p.m. Tuesday October 19, 2010 in Canada/Eastern

 * 06:00:00 p.m. Tuesday October 19, 2010 in UTC

This is a public meeting for Evergreen developers that will be  
held on

the #evergreen channel on the Freenode IRC network
(http://open-ils.org/irc.php). All members of the community with an
interest in contributing to the development of Evergreen are  
welcome to
attend.  If you are unable to attend at the designated time,  
please feel

free to submit comments for any of the agenda items in advance to the
Evergreen development mailing list.

The agenda is evolving at
http://www.open-ils.org/dokuwiki/doku.php?id=dev:meetings:2010-10 -
please extend and amend to ensure that it meets the immediate  
concerns

of the project.

For agenda items that have the potential to be too long to express
during a single IRC meeting, it would probably make sense to post  
more

considered opinions in advance on this mailing list. Examples of such
agenda items might include major release process changes or  
drastically
revising our bug tracking processes. If a given discussion item  
starts

eating up too much meeting time and a decision is not immediately
necessary, we can also delegate the responsibility to a volunteer
sub-team for investigating alternatives and coming up with a proposal
for adoption at the next meeting.







[OPEN-ILS-GENERAL] ***SPAM*** Re: ***SPAM*** Re: Why is Placing a Hold on Volume Level not possible in the OPAC ?

2010-06-07 Thread Repke de Vries
Thanks Bill for the hint to make the Volume Hold conditional by  
taking a cue from record and/or call number -  the call number idea  
in particular.


Jason, your idea  to

2) have the main Place Hold link for the record prompt the user  
for a specific volume


is very attractive from a usability point of view (with tens and tens  
of  bound volumes, the Copy Information list quickly grows too long)  
and ties in well with experimental work Laurentian University did on  
their Search Results page which has the Place Hold button straight in  
the middle of the screen:
   http://laurentian-test.concat.ca/opac/en-CA/skin/btresult/xml/rresult.xml?rt=keywordtp=keywordt=Journal%20unescol=105d=1f=av=


however:  our library's scenario of use is more complicated than  
prompting the user for a specific volume if Place Hold is clicked:


evaluating Evergreen we are struggling with a situation where we have  
printed journals with bound volumes (highest level in the hierarchy)  
but also article level bibliographic descriptions (lowest level of  
the hierarchy); at the article level a MARC 773  $g points to the  
highest level - like Vol. 13, No. 2, p. 273-287


It is well possible to make that 773 a clickable link and help the  
user jump to the printed journal where he can place the Volume Hold,  
but both in Bill's approach and in your single-action Place Hold With  
Dialogue, the user has to remember from the previous step which  
Volume to pick: "Oh yes, I believe it was in Volume 13 .. or was it  
twelve?"


Somehow you want the user to
- either arrive at the level where he can place a Volume Hold with  
Evergreen remembering the right Volume (would probably need  
additional information in the 773 at article level)
- or stay at the level of the search result and have a Place Volume  
Hold right there; the physical copy however is not at that article level


While at it: to have a journal article bibliographic description show  
up at all in the OPAC, we need an 852 with a dummy barcode and other  
fake information. It would be very interesting to extend the  
non-barcoded resource discussion in the Evergreen community to resources  
like journal articles (or Twin Peaks reviews, as in Bill Ott's example)  
and, instead of a 773 / 852 combo, use the 856, which gives you both OPAC  
visibility *and* a clickable link for jumping to the right level for  
Place Volume Hold - be it with the usability issues explained above.


Appreciating your or anyone's comments,

Repke de Vries, PPL


On 7 June 2010, at 1:29, Bill Ott wrote:


On 6/4/10 12:06 PM, Jason Etheridge wrote:
On Fri, Jun 4, 2010 at 11:28 AM, Repke de Vries re...@xs4all.nl  
wrote:


while it is possible from the Staff Client. Or did we miss  
something ?



I believe this may be a case of overprotectiveness.  However, IMO,
ideally, if were to allow patrons to place volume level holds, we
would:

1) only do it for volumes where it made sense (where the items are
indeed different), and not for any volume on any record.  How to
designate this, I don't know.
2) have the main Place Hold link for the record prompt the user for a
specific volume (based on the same designations mentioned above)



Any suggestions or advice ? Change the OPAC code ?


It appears you can do the following without adverse effects:

1) add the VOLUME_HOLDS permission to the User (or equivalent)  
permission group

2) modify /openils/var/web/opac/skin/default/js/rdetail.js (even if
you're using the craftsman skin) by searching the file for  
'hold_div',

and changing the preceding if (isXUL()) { line to if (true) {

Better would be to replace the if condition with a permission check
against the owning library of the volume, and again, it would be nice
if we could somehow allow this for some volumes and not others.




We do allow patrons to place volume holds.  Just as Jason notes, in  
many cases there's no reason to, so I take a cue from the record and/or  
call number.  If there is series data or a call number ending in  
a 'v.#', I turn on the link in the opac.


Here's an example:
http://grpl.michiganevergreen.org/opac/en-US/skin/default/xml/rdetail.xml?r=5235487ol=9t=twin%20peakstp=keywordl=9d=1hc=13rt=keyword






[OPEN-ILS-GENERAL] How to delete imported authority records

2010-03-18 Thread Repke de Vries
folks, if there is no overlay mechanism for outdated authority  
records in Evergreen [1]


how can we delete them: has someone done that, say from the command  
line with psql ?


thanks a lot,

Repke de Vries

[1] http://markmail.org/message/77br5rlelhlhoyzd




[OPEN-ILS-GENERAL] Can a second import update already imported authority records ?

2010-03-17 Thread Repke de Vries
This is 1.6.1 and we have staff client imported authority records but  
want to make changes to some of them.


The MARC authority records standard seems to facilitate such a scenario:
- in the leader, in position 5, you would have "c" for "corrected"
- you keep the record ID or Control Number in field 001 the same, of  
course, for matching with the outdated authority record already in  
Evergreen
- in field 005 you would have a more recent date than the one in the  
outdated record


Or, fulfilling our need to delete obsolete authority records in  
Evergreen, by importing a second time with leader position 5 set to  
"d" for "deleted".


QUESTION:  trying these scenarios does not work: the staff client  
batch import simply refuses to do the import second time round and  
that seems to be it.


Can anyone help:
- is there an automatic overlay (and delete) mechanism for authority  
records via re-importing, and what should the record look like?

- if not: how can we update or delete outdated authority records


Thanks a lot,

Repke
IISG and PPL libraries, the Netherlands


[OPEN-ILS-GENERAL] MFHD import problem with marcsre.pl etc.

2010-02-23 Thread Repke de Vries
Following Dan Scott's README instructions  [1]  (hi Dan) we are  
importing a Bibliographic + corresponding MFHD records  example  
(attached).


Steps for bibliographic record import and metarecord map generation  
went fine (as far as I can see).


The step for the MFHD records (in this case: 2) has a problem:
- marc2sre.pl --startid 004 --password open-ils GramsciMFHDPart.mrc, run  
separately, works OK
- piping the marc2sre result to pg_loader.pl -or sre > GramsciMfhd21.sql  
produces the SQL table; there are two \N for each record and we changed the  
last one into 4 (the numeric ID of our [test] BR1) in both cases; result  
attached


- psql -d evergreen1 -h localhost -U evergreen -h localhost -f  
GramsciMfhd21.sql
has a problem: duplicate key value violates unique constraint  
"record_entry_pkey"; full message: [2]


Where did we go wrong ?

Thanks, Repke, IISG

[1] http://svn.open-ils.org/trac/ILS/browser/trunk/Open-ILS/tests/datasets/README


[2]
r...@devevergreen:~/SerialsImport$ psql -d evergreen1 -h localhost -U  
evergreen -h localhost -f GramsciMfhd21.sql

Password for user evergreen:
SET
BEGIN
psql:GramsciMfhd21.sql:8: ERROR:  duplicate key value violates unique  
constraint "record_entry_pkey"
CONTEXT:  COPY record_entry, line 1: t 11081733now  
1   f  now  1   4   IMPORT-1266952292.02193  
record xmlns:xsi=http://www.w3.org/2001/XMLS...;
psql:GramsciMfhd21.sql:10: ERROR:  current transaction is aborted,  
commands ignored until end of transaction block

ROLLBACK
r...@devevergreen:~/SerialsImport$
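
(One thing worth checking here - a guess, not a confirmed diagnosis - is  
whether the --startid value collides with ids already present in  
serial.record_entry, e.g.:

psql -d evergreen1 -h localhost -U evergreen -c "SELECT max(id) FROM serial.record_entry;"

and then re-run marc2sre.pl with a --startid above that maximum.)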



GramsciBibPart.mrc
Description: Binary data


GramsciMFHDPart.mrc
Description: Binary data


GramsciMfhd21.sql
Description: Binary data


Re: [OPEN-ILS-GENERAL] Another try importing holdings using MARC importer/exporter (Vandelay)

2010-01-18 Thread Repke de Vries

Hi Francene, hi Bill

Any fresh ideas ? Would the Vandelay bug repairs in the forthcoming  
1.6.0.1 cover this (I couldn't find out) ?


After first using our own data (bib record *with* 852 and the 852  
matching Vandelay expectation) I have tried to do the same as you did  
Francene:

- exporting a record with Vandelay
- changing just enough that Evergreen sees it as a different one
- importing with Vandelay / MARC Batch Importer in the staff client -  
assuming that Vandelay *at least* reads back in what it just exported  
when you change enough to avoid collision detection with the one  
that is already in the database


Not so:  the 852 is not interpreted as holding information and goes  
straight into the database as part of the rest of the bib record and  
shows up along with it in MARC view.  This answers your question too  
Bill: bib imported OK but copy information (852) is not recognised as  
such.


I have given up (even for quick evaluation purposes) and started  
working with an excellent command line script by David Christensen,  
published on this list October 24th 2008. See here:  
http://markmail.org/message/ibb5zrkwikaydwf3


Repke
IISH

On 30 November 2009, at 19:10, Bill Erickson wrote:


Hi Francene,

Your data looks correct.  Are the bib records importing OK or do  
both bibs and copies fail?


-b


On Wed, Nov 25, 2009 at 3:17 PM, Francene Lewis fle...@calvin.edu  
wrote:
Is there any written documentation for the MARC importer/exporter  
tool (Vandelay)?  I've watched the webinar but it doesn't seem to  
cover the questions I have on holdings. I've tried importing  
holdings now using two different fields.  We exported a record and  
copied the 852 field.


For example:
852 4  |a gaaagpl |b GENERAL |c 5TH-FLOOR |j call number |g BOOK |p  
33108007755038 |x nonreference |x holdable |x circulating |x  
visible |z Available


999   |a call number |c 1 |m GENERAL |l GENERAL |i 33108004425087 | 
k Available |p 20.00 |t BOOK


I also tried importing using a 999 field for holdings.  Both of  
these fields are listed if you click the Import holdings button in  
Vandelay.  Am I missing something in creating these fields?  Do you  
need to create item attributes for the various fields and subfields  
for the import to work?  Is there something in the underlying  
coding that needs to be turned on to allow holdings to be imported  
using Vandelay?

Thanks for the help in advance,
Francene Lewis
Cataloging Librarian
Hekman Library
fle...@calvin.edu



--
Bill Erickson
| VP, Software Development  Integration
| Equinox Software, Inc. / The Evergreen Experts
| phone: 877-OPEN-ILS (673-6457)
| email: erick...@esilibrary.com
| web: http://esilibrary.com

Please join us for the Evergreen 2010 International Conference,  
April 20-23,
2010 at the Amway Grand Hotel and Convention Center, Grand Rapids,  
Michigan.

http://www.evergreen2010.org/




Re: [OPEN-ILS-GENERAL] introduction and quick database question

2009-12-08 Thread Repke de Vries
Acquisitions is one addition in 1.6, Serials and all holding  
information in *separate* records (following MFHD) another:


is there an update of  the 1.4 schema below  that covers the 1.6  
serials schema and everything related ?


Did a Google search and found bits and pieces but nothing systematic  
in one place - may have overlooked.


Thanks (thx for this 1.4 pointer),

Repke
IISH

On 8 December 2009, at 0:39, Don McMorris wrote:

snip



If you haven't yet seen it, this
(http://open-ils.org/documentation/evergreen-schema-1.4.0.2.html) can
be pretty helpful.  There are some additions in 1.6 (Ex: Acq), but I
find it quite helpful quite often.

Hope this helps!

--Don



snip


Re: [OPEN-ILS-GENERAL] importing authority records

2009-12-04 Thread Repke de Vries

Hi Chris, hi Jason

meanwhile Chris has answered further [1], but I am answering Jason's to keep  
the thread on this issue together.


My library (evaluating Evergreen) is also struggling with (Vandelay)  
authority batch import and I have emailed this list on exactly the  
same phenomenon in 1.4.0.6:  all fine 'till the actual import and a  
progress bar that sits still at 0 % and nothing getting imported.


Chris: here are some results from trying the same on Dan Scott's  
1.6.0.0 virtual image [2] on my iMac as a 1.6 implementation example,  
and with a small subset of our IISH authority records for the bib  
record MARC 100 field; this subset is available as a test set in the  
Evergreen SVN [3]:


1) at first, same story: after selecting just one of the authority  
records and asking for "Import Selected", the progress bar sits at 0 %,  
and just trying Validate or a right mouse click in the 100 field of a new  
bib record doesn't do anything


2) then I tried "Import All" and all of a sudden it did work: all  
17 records got reported as "Imported" (you have to leave the Batch  
Importer and go back to it again, and to "Inspect Queue" etc., to  
actually see the change), and trying it out in a new bib record worked  
fine too


3) next trials were first a large set of Authority MARC 400 records,  
which did upload but did *not* import, and then a small subset of the  
same (someone with SVN access could add that test set - let me know  
off list) which did *not* import either: the progress bar sitting  
forever at 0 % (and nothing reported as imported)


4) however: at some point I just tried to use it in a new MARC  
record, and there they were, the 400 authority data ??!!


So: a dead progress bar sometimes means nothing happened, and sometimes  
it means nothing at all, and the authority data do find their way into  
the database.


I have no clue what causes all this. I inspected the Postgres  
database log and see no errors (I think), but I lack the expertise to  
interpret what happens, or does not happen, at that level.


Vandelay / batch importing of authority records seems an unpredictable  
roller coaster. Anybody have ideas?

I can send you my Postgres log or any other of the EG logs.

Repke de Vries, IISH

[1] December first:
..
Hi Jason,

No, I haven't.  I'm just the cataloger at WJU, and Jeff, our IT guy who
did our data migration and Evergreen installation, just accepted a job
elsewhere.  Has anyone else had trouble with importing authority records
in 1.4.0.6?  Will upgrading to 1.6 solve this?

Thanks,
Chris
..

[2] Download page http://evergreen-ils.org/downloads.php under  
"Other versions of Evergreen software"


[3] called auth-subset100.mrc at  
http://svn.open-ils.org/trac/ILS/browser/trunk/Open-ILS/tests/datasets


On 1 December 2009, at 17:24, Jason Etheridge wrote:

On Tue, Nov 24, 2009 at 11:51 AM, Chris Caughey  
ccaug...@jessup.edu wrote:
We were having trouble using the authority batch import.  The  
whole process
would move along without incident until it came to the progress  
bar.  Then
it would sit at 0% and do nothing indefinitely.  Last week, we  
upgraded to
1.4.0.6, and now the progress bar moves a little bit, but then it  
stalls out

again.  Our IT department tells me that Evergreen is showing an error
related to "arn_value."  Has anyone else experienced this  
problem?  If so,

what did you do to fix it?


Hey Chris, did you make any headway on this?

--
Jason Etheridge
 | VP, Tactical Development
 | Equinox Software, Inc. / The Evergreen Experts
 | phone:  1-877-OPEN-ILS (673-6457)
 | email:  ja...@esilibrary.com
 | web:  http://www.esilibrary.com

Please join us for the Evergreen 2010 International Conference,  
April 20-23,
2010 at the Amway Grand Hotel and Convention Center, Grand Rapids,  
Michigan.

http://www.evergreen2010.org/





[OPEN-ILS-GENERAL] Error in 1.6.0.0 staff client Workstation Registration: g.aout_list is null

2009-11-24 Thread Repke de Vries

Anybody a suggestion how to solve this ?

We just did the 1.4.0.7 to 1.6.0.0 major upgrade without a glitch  
(thx for the upgrade instructions !)


Now I am getting this error message [1] in the staff client when
entering the (first time use) Workstation Registration:

..
The page at (my server's IP address) says:
TypeError: g.aout_list is null
..

This happens both with the official Windows 1.6.0.0 staff client
(installable from the EG Download Page) and on a Mac (Sitka binary [2]).

Must be our upgrade ?

Thanks, Repke, IISH

[1]the screenshot is here:
http://screencast.com/t/ZGIzNjYxMDYt
[2] http://sitka.bclibraries.ca/support/staff-client-executables/
The screenshot is for this MAC version



Re: [OPEN-ILS-GENERAL] Problem of importing the Gutenberg records

2009-10-29 Thread Repke de Vries

Hi Araik, Mike and list

regarding the wiki "How to Import [Bib] Records" page you refer to, Araik
[1]: we should be aware that this page needs updating !


a) it does not  mention what can and cannot be done with MARC Batch  
Importing in the Staff Client (under Cataloging; aka Vandelay)


b) it does not mention that in 1.6 we have a completely new situation
with MARC MFHD: Evergreen can import [2] and maintain separate
Holdings Records, instead of demanding single MARC records that keep
both the bibliographic and the (limited) holdings information.
To import MFHD (a bibliographic record followed by one or more
corresponding Holdings data records, or more broadly defined: Serials
data - the MARC data needed for Serials Management) you would need
to use marc2sre.pl, which is not yet explained on this wiki page
but elsewhere - see Dan Scott on marc2sre.pl here (and the pipeline
sketch after this list):
http://georgialibraries.markmail.org/search/?q=vandelay#query:vandelay+page:1+mid:6vt6yqpqtvdq7xc3+state:results


c) it needs tidying up and maybe needs 1.6-specific instructions for
the Gutenberg test set (single MARC records with no physical
collection holding information in the record, but pointing to an
electronic resource)
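
Purely as illustration of the MFHD route in b) above, here is a minimal
sketch of what the command line would presumably look like, mirroring the
bib import pipeline (marc2sre.pl piped through pg_loader.pl into psql).
The [options], the holdings.xml file name and the "sre" class hint are my
assumptions - check the script's own usage output and Dan Scott's post
above before relying on any of this:

# hypothetical MFHD / serials import pipeline (flags and file name are placeholders)
PERLLIB=/openils/lib/perl5 perl Open-ILS/src/extras/import/marc2sre.pl [options] holdings.xml \
  | perl Open-ILS/src/extras/import/pg_loader.pl -or sre -a sre \
  | psql -h xx -d xx -U xx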


Wishing you success, Repke (and work to do for the Evergreen  
Technical Side Documentation Initiative as introduced by Roma Mattot  
to this list October 27th)


[1]
http://www.open-ils.org/dokuwiki/doku.php?id=evergreen-admin:importing:bibrecords&s[]=import&s[]=record

[2] not sure if there are MFHD Export facilities yet

On 29 Oct 2009, at 8:13, Araik Manukyan wrote:


Thank you for your quick response.

We are sending the files you requested, which we received during the
conversion.

- Original Message - From: Mike Rylander mrylan...@gmail.com
To: Evergreen Discussion Group open-ils-gene...@list.georgialibraries.org

Sent: Wednesday, October 28, 2009 8:54 AM
Subject: Re: [OPEN-ILS-GENERAL] Problem of importing the Gutenberg  
records



Would you mind sending that file?

TIA,

--miker

2009/10/28 araik ar...@flib.sci.am:

Hello list,

We installed Evergreen 1.6 on an Ubuntu 8.10 server.
We tried importing the Project Gutenberg records to our server, as described
at the following link:
http://www.open-ils.org/dokuwiki/doku.php?id=evergreen-admin:importing:bibrecords&s[]=import&s[]=record

We passed the first 3 steps successfully and we have the file gutenberg.sql.
After trying to implement step 4, \i ~/gutenberg.sql,
we have the following error

*


[OPEN-ILS-GENERAL] How do authorities kick in when Validating

2009-10-23 Thread Repke de Vries

Can someone actually using authority data in Evergreen enlighten me ?

We are evaluating Evergreen and imported some authority records through
the staff client (Cataloging -- MARC Batch Import), but they do not
seem to take effect.


Is the following assumption correct - 'cause if it is, then evidently
something went wrong with the import, or something is wrong with the
authority records themselves:


one of  the imported authority records [1] has:
=100  0\$aMaloy, Eileen

I would expect, then, that creating a New MARC Record (in Cataloging),
typing Meloy instead of Maloy and then Validate (or even just the
letter M), would bring up this 100 field Maloy, Eileen as a
suggestion, or a list of suggestions all starting with M when just
typing an M - and that I would be able to just pick the right one and have
it automatically put in the 100 field ?


If it works completely different and you have a screenshot that you  
can send me off list: appreciated.


In our case the only response when Validating is "no authority
records available", which could mean anything: no match even though the
import was successful (but why ?), a failed import (even though the records
were signaled as imported) ...
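
A sanity check that might help pin this down - the table name is my
assumption, based on the stock Evergreen schema, where authority.full_rec
flattens authority records into per-field/subfield rows - is to ask the
database directly whether the imported heading is there at all:

psql -h xx -d xx -U xx -c "SELECT record, tag, subfield, value FROM authority.full_rec WHERE tag = '100' AND subfield = 'a' AND value ILIKE 'maloy%';"

No rows back would point at a failed or incomplete import rather than at a
matching problem in the Validate step.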


Thanks a lot, Repke (IISH, Amsterdam)

[1]
Or as Evergreen displays it in the Import Queue for Authority Data:
 LDR00208nz a2200097o 45 0
001 IISGa11554924
003 IISG
005 20091004142342.0
008 021207n| acannaabn |n aac d
040 .   ‡aIISG ‡cIISG
100 1   .   ‡aMaloy, Eileen

[OPEN-ILS-GENERAL] Original cataloging: the produced TCN does not seem to follow MARC

2009-09-22 Thread Repke de Vries


We are evaluating Evergreen 1.4 and testing "first" or
original cataloging, where our library / archive has unique material
and is the first in the world to catalogue it. Here, for example, is what
our template looks like for cataloging audio material:
http://screencast.com/t/mM0HJcEgY


In MARC field 040 [1] in the template we pass on to Evergreen  who  
we are as Cataloging Source: AmISG


Cataloging finished, we expected Evergreen to
- construct a Title Control Number (TCN) starting with AmISG, followed by
some unique sequential number;
- create a MARC 001 field [2] holding that concatenated string
AmISG + unique number


What we observe is:
- AUTO-GENERATED instead of AmISG, followed by a unique number
- no 001 field is created at all; instead - only visible after
exporting the record - a 9xx field of Evergreen's choice:

901  \\$aAUTOGENERATED-11$bAUTOGEN$c20

Is there any Evergreen setting we forgot to look at ? Should we add a
003 field [3] with AmISG to our cataloging templates ?


Surely Evergreen can do original cataloging according to MARC ?

(TCNs already present in MARC 001 when pre-cataloging or importing
existing records are handled fine.)
(I know from a question on this list by UPEI in Canada that the
string AUTO-GENERATED can be changed, but that still leaves the lack of
automatic 001 field generation.)
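
(For what it is worth, the TCN Evergreen actually stored can be inspected
directly in the database; the column names are my assumption from the stock
schema, where biblio.record_entry carries tcn_source and tcn_value:

psql -h xx -d xx -U xx -c "SELECT id, tcn_source, tcn_value FROM biblio.record_entry ORDER BY create_date DESC LIMIT 5;"

For our freshly catalogued records this would presumably show the AUTOGEN /
AUTOGENERATED values from the exported 901 field rather than anything
derived from our 040.)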


Thanks, Repke, IISH

[1] http://www.loc.gov/marc/bibliographic/bd040.html
[2] http://www.loc.gov/marc/bibliographic/bd001.html
[3] http://www.loc.gov/marc/bibliographic/bd003.html


[OPEN-ILS-GENERAL] Consequences of having to upgrade the staff client too Re: Any more recent doc on upgrading Re: Evergreen 1.4.0.6 released

2009-09-07 Thread Repke de Vries

Thanks Mike (and Karen for download page update)

Mike: your doc focusses on the server-side upgrade, but all staff PCs
will need the new 1.4.0.6 staff client before the combination works.


Question: will the new staff client install on the PC keep all data
on the PC intact ?
Like someone's workstation registration and everything else Evergreen
stored uniquely on your own PC ?


Thanks, Repke



On 7 Sep 2009, at 15:07, Mike Rylander wrote:


On Mon, Sep 7, 2009 at 2:57 AM, Repke de Vries re...@xs4all.nl wrote:

Great news indeed.

Question: is there any How To for upgrading from the previous version
1.4.0.4 to this new version 1.4.0.6 ?



Starting with the 1.2-1.4 upgrade docs, I created a trimmed set that
covers 1.4.0.0-1.4.0.6:

http://open-ils.org/dokuwiki/doku.php?id=upgrading:1_4_0_to_1_4_0_6

I'll see about getting this linked from the download page, too.

--
Mike Rylander
 | VP, Research and Design
 | Equinox Software, Inc. / The Evergreen Experts
 | phone:  1-877-OPEN-ILS (673-6457)
 | email:  mi...@esilibrary.com
 | web:  http://www.esilibrary.com