Importing and updating bibliographic records
Dear list,

My library wishes to use Koha, so we have to inject our existing catalogue into Koha. Our current catalogue is stored in a MySQL database, and we already have the necessary tools, written in Perl, to manipulate the records in the database. The strategy I have in mind for the migration is to convert all our bibliographic records to UNIMARC using the MARC::Record module and then import them into Koha. I have several questions.

0. Does this strategy look correct to you?

1. According to perldoc MARC::Record, the module seems to handle only the USMARC and MicroLIF styles. Is one of these two suitable for handling UNIMARC and, if yes, which one?

2. After the initial import has been done, we will certainly want to refine it with data from our national library, such as leaders for instance, or perhaps our initial import won't be complete and we will want to add things we left out at first to keep the initial import feasible. So, is incremental modification of records possible, and how? Will we have to export the records to MARC, modify them, and re-inject them, or is it recommended to do things another way?

Please bear in mind that I know nothing about the topic I am trying to address in this post, and don't hesitate to point me at things you think I should read.

Many thanks in advance for any help,
Sébastien.
Dear List Poster,
The strategy I have in mind for the migration is to convert all our bibliographic records to UNIMARC using the MARC::Record module and then import them into Koha.
0. Does this strategy look correct to you?
Yes.
1. According to perldoc MARC::Record, the module seems to handle only the USMARC and MicroLIF styles. Is one of these two suitable for handling UNIMARC and, if yes, which one?
USMARC ($record->as_usmarc) produces valid UNIMARC records. This way you can populate an ISO2709 MARC file that can be uploaded into Koha with the bulkmarcimport.pl script.
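The step above could be sketched roughly as follows in Perl. This is a hypothetical fragment, not code from the thread: the output filename, the field values, and the database query (not shown) are all placeholders, and the UNIMARC tags (200 for title, 700 for main author) should be checked against your cataloguing rules.

```perl
#!/usr/bin/perl
# Hypothetical sketch: build UNIMARC records with MARC::Record and
# serialise them as ISO2709 via as_usmarc().
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

open my $out, '>', 'catalogue.mrc' or die "catalogue.mrc: $!";

# In a real migration you would loop over titles from your MySQL
# database; one record is shown here with placeholder values.
my $record = MARC::Record->new();
$record->append_fields(
    MARC::Field->new('200', '1', ' ', a => 'Example title'),   # UNIMARC title proper
    MARC::Field->new('700', ' ', '1', a => 'Doe, Jane'),       # UNIMARC primary author
);
# as_usmarc() emits the ISO2709 transmission format, which is the
# same physical structure UNIMARC uses.
print {$out} $record->as_usmarc();
close $out;
```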
2. After the initial import has been done, we will certainly want to refine it with data from our national library, such as leaders for instance, or perhaps our initial import won't be complete and we will want to add things we left out at first to keep the initial import feasible. So, is incremental modification of records possible, and how? Will we have to export the records to MARC, modify them, and re-inject them, or is it recommended to do things another way?
It can work this way. You can also use the Koha API to modify biblio records directly. Take a look at Koha's Biblio.pm module. You will read all biblio records; for each biblio record, you will query your (our) national library, merge both records, and write the result into your Koha DB.

Cheers,
--
Frédéric DEMIANS
http://www.tamil.fr/u/fdemians.html
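The loop Frédéric describes might look like this. This is a hypothetical sketch: GetMarcBiblio and ModBiblio are C4::Biblio functions that should be checked against your Koha version, and get_all_biblionumbers(), query_national_library(), and merge_records() are placeholders for code you would write yourself.

```perl
#!/usr/bin/perl
# Hypothetical sketch of the read/query/merge/write loop, based on
# Koha's C4::Biblio API (Biblio.pm). Verify the function names and
# signatures in the Biblio.pm shipped with your Koha version.
use strict;
use warnings;
use C4::Biblio qw(GetMarcBiblio ModBiblio);

my @biblionumbers = get_all_biblionumbers();   # e.g. SELECT biblionumber FROM biblio

for my $biblionumber (@biblionumbers) {
    my $record   = GetMarcBiblio($biblionumber);        # read the record from Koha
    my $external = query_national_library($record);     # your Z39.50/SRU lookup (not shown)
    my $merged   = merge_records($record, $external);   # your merge policy (not shown)
    ModBiblio($merged, $biblionumber, '');              # write the result back
}
```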
Hi Frederic, Thanks a lot for your prompt and precise reply.
USMARC ($record->as_usmarc) produces valid UNIMARC records.
Ah, this is good news, thanks!
This way you can populate an ISO2709 MARC file that can be uploaded into Koha with the bulkmarcimport.pl script.
2. After the initial import has been done, we will certainly want to refine it with data from our national library, such as leaders for instance, or perhaps our initial import won't be complete and we will want to add things we left out at first to keep the initial import feasible. So, is incremental modification of records possible, and how? Will we have to export the records to MARC, modify them, and re-inject them, or is it recommended to do things another way?
It can work this way. You can also use the Koha API to modify biblio records directly. Take a look at Koha's Biblio.pm module. You will read all biblio records; for each biblio record, you will query your (our) national library, merge both records, and write the result into your Koha DB.
Okay, I see, thanks a lot for the pointer. In fact, could this Biblio.pm module also be used for the initial import, rather than using MARC directly? In your opinion, how do these solutions compare? And also, are there functionalities in Koha's web interface that could be used to enrich an existing bibliographic record (or set of existing bibliographic records) with data fetched, say, from a Z39.50 server? Thanks again for helping, Sébastien.
In fact, could this Biblio.pm module also be used for the initial import, rather than using MARC directly? In your opinion, how do these solutions compare?
Yes, but it could be tricky. The bulkmarcimport.pl script invokes Biblio.pm functions, as you can imagine, but it does so correctly, i.e. it calls several functions in order to add both biblio records AND their related item records. If you want to do it directly, you will have to hack on the bulkmarcimport.pl script and understand how it works.
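For reference, a typical invocation of the script might look like the following. The path and flags are assumptions that vary between Koha versions; run the script without arguments to see the usage message for your installation.

```shell
# Hypothetical invocation of Koha's bulk import script; the path is
# an assumption and the flags should be checked against your version.
cd /usr/share/koha/bin/migration_tools
./bulkmarcimport.pl -file /tmp/catalogue.mrc -n 10   # trial run on a few records
./bulkmarcimport.pl -file /tmp/catalogue.mrc         # full import
```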
And also, are there functionalities in Koha's web interface that could be used to enrich an existing bibliographic record (or set of existing bibliographic records) with data fetched, say, from a Z39.50 server?
(1) In Koha's data entry page, you can perform a Z39.50 search, but you can't merge the result of a Z39.50 search with an existing biblio record: you can only overwrite it or create a new record. (2) Another solution would be to use a standard Z39.50 client to create an ISO2709 file, then import this file into Koha with Tools > Stage MARC Records. In Manage Staged MARC Records, you can specify a matching rule, but here again you can't choose to merge new and old records; you can only replace, add, or ignore. So there are two places in Koha where some improvements would help you do what you need to do.
--
Frédéric
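For solution (2), one common standalone Z39.50 client is yaz-client from the YAZ toolkit, which can dump retrieved records to an ISO2709 file. The session below is a hypothetical sketch: the target server, the query, and the filename are placeholders.

```shell
# Hypothetical yaz-client session producing an ISO2709 file.
yaz-client
Z> open z3950.loc.gov:7090/voyager   # example target, not from the thread
Z> set_marcdump records.mrc          # append every retrieved record to this file
Z> find @attr 1=4 "some title"       # search by title (Bib-1 use attribute 4)
Z> show 1+10                         # fetch records; they are dumped to records.mrc
Z> quit
```

The resulting records.mrc can then be fed to Tools > Stage MARC Records as described above.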
Hi Frederic, thanks again for your prompt and helpful reply.
Yes, but it could be tricky. The bulkmarcimport.pl script invokes Biblio.pm functions, as you can imagine, but it does so correctly, i.e. it calls several functions in order to add both biblio records AND their related item records. If you want to do it directly, you will have to hack on the bulkmarcimport.pl script and understand how it works.
Then it is of course more straightforward to just produce UNIMARC records and feed them to bulkmarcimport... Thanks for clarifying this.
And also, are there functionalities in Koha's web interface that could be used to enrich an existing bibliographic record (or set of existing bibliographic records) with data fetched, say, from a Z39.50 server?
(1) In Koha's data entry page, you can perform a Z39.50 search, but you can't merge the result of a Z39.50 search with an existing biblio record: you can only overwrite it or create a new record. (2) Another solution would be to use a standard Z39.50 client to create an ISO2709 file, then import this file into Koha with Tools > Stage MARC Records. In Manage Staged MARC Records, you can specify a matching rule, but here again you can't choose to merge new and old records; you can only replace, add, or ignore. So there are two places in Koha where some improvements would help you do what you need to do.
Well, I'm not sure whether this could be automated. As I understand things, what I have to do first is generate MARC records for all the titles we have in our catalogue and import them into Koha. Then, as a second step, get the subject headings for those titles from e.g. the national library and add them to the existing records. If anybody thinks this doesn't make sense, please do not hesitate to let me know; I'm completely new to the field, so I may very well be completely wrong. Thanks, Sébastien.
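In MARC::Record terms, the second step described above might reduce to appending a subject field to each record. A hypothetical sketch follows: the choice of UNIMARC tag 606 and the heading value are illustrative only, and in practice the heading would come from the national library lookup rather than being hard-coded.

```perl
#!/usr/bin/perl
# Hypothetical sketch: append a UNIMARC subject field (606) to an
# existing record. Field values are placeholders.
use strict;
use warnings;
use MARC::Record;
use MARC::Field;

# Stand-in for a record read back from Koha or from an ISO2709 file.
my $record = MARC::Record->new();
$record->append_fields(MARC::Field->new('200', '1', ' ', a => 'Example title'));

# Subject heading obtained from the national library (placeholder).
$record->append_fields(MARC::Field->new('606', ' ', ' ', a => 'Library science'));

# The modified record can then be re-serialised with as_usmarc() or
# written back through the Biblio.pm API.
```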
participants (2)
- Frédéric Demians
- Sébastien Hinderer