[Koha] Springer ebook records + Zebra Indexing and Searching
Susan Mustafa
susan.mustafa at gmail.com
Wed Apr 21 18:10:42 NZST 2010
Dear Stacy,
I have fixed the issue. My problem was that the rebuild_zebra process was
running from the root crontab. I recommend that all Koha jobs be run from
the koha user's crontab.
Once I ran chown koha:koha /var/lib/koha/zebradb (where the Zebra database
lives), I was able to see the data from within Koha.
So now I will move the rebuild job from the root crontab into the koha
user's crontab.
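
For reference, here is roughly what I ended up doing (the rebuild_zebra.pl
path and the 10-minute schedule below are only my assumptions from a
standard install, so adjust them for your setup):

    # give the koha user ownership of the Zebra database files
    sudo chown -R koha:koha /var/lib/koha/zebradb

    # edit the koha user's crontab instead of root's
    sudo crontab -u koha -e

    # example entry: index queued changes every 10 minutes
    */10 * * * * /usr/share/koha/bin/migration_tools/rebuild_zebra.pl -b -a -z >/dev/null
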
As for the customizations, I will look into them now :)
Thanks,
On Wed, Apr 21, 2010 at 8:37 AM, Susan Mustafa <susan.mustafa at gmail.com> wrote:
> Dear Stacy,
>
> I have just imported the 14,000-record MARC file into
> Koha-Test.
>
> I did a rebuild_zebra -r -a -b <-- I believe this clears all of the previous
> indexes and re-indexes everything from scratch.
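>
> In case it helps to see exactly what I ran (the script path below is only a
> guess at a standard install location; adding -v gives verbose output, which
> can be redirected to a log and searched for warnings about skipped records):
>
>     /path/to/koha/misc/migration_tools/rebuild_zebra.pl -b -a -r -v > rebuild.log 2>&1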
>
> Now, I know that there is a record called [Extreme Sports]. This record is
> from Springer, and it shows up in the biblio table in the MySQL database
> after I finished importing the MARC file into Koha. However, after
> rebuilding Zebra, searching for this record returns 0 results.
>
> Some Springer records are searchable, but others Koha can't find. Does
> this mean that Zebra didn't index them [I see no errors when I run the
> rebuild], or does it mean that something else is wrong?
>
>
> I did spot checks on some records, but obviously there is no way for me to
> check by hand that all 14,000 records were indexed and are searchable. Or is
> there? You know this better than I do.
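>
> One rough way I thought of comparing counts (the database name, MySQL
> credentials and the Zebra port/database below are only guesses based on a
> default install):
>
>     # how many bib records are in Koha's SQL database?
>     mysql -u kohaadmin -p koha -e 'SELECT COUNT(*) FROM biblio;'
>
>     # how many records does the Zebra biblio index actually hold?
>     # (_ALLRECORDS with the "always matches" relation retrieves every indexed record)
>     echo 'find @attr 1=_ALLRECORDS @attr 2=103 ""' | yaz-client localhost:9998/biblios
>
> If the two numbers differ, some records were skipped during indexing.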
>
>
> Kindly bear with me on this. I might need your help, since you did the
> Springer import before.
>
> Best Regards,
>
>
> On Tue, Mar 9, 2010 at 12:35 AM, Stacy Pober <stacy.pober at manhattan.edu> wrote:
>
>> I loaded about 17,000 Springer ebooks into our catalog. I did not
>> directly upload them into Koha, though. First, I edited them in the
>> free MarcEdit program to add some fields.
>>
>> We use EZproxy to allow our users to access IP-restricted resources
>> from off-campus. If you use something similar, you would use the
>> Edit Subfield Data function to alter the 856$u data. You would
>> replace "http://" with something like
>> "http://ezproxy.yourlibrary.edu:2048/login?url=http://"
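>>
>> If you'd rather do that outside MarcEdit, a rough command-line equivalent
>> (the proxy hostname and filenames are just examples, and this assumes the
>> YAZ tools are installed) is to round-trip the file through MARCXML:
>>
>>     # convert to MARCXML, prefix every $u subfield that begins with
>>     # http:// (in these records that's the 856 links), convert back
>>     yaz-marcdump -o marcxml springer.mrc > springer.xml
>>     sed -i 's|<subfield code="u">http://|<subfield code="u">http://ezproxy.yourlibrary.edu:2048/login?url=http://|g' springer.xml
>>     yaz-marcdump -i marcxml -o marc springer.xml > springer-proxied.mrc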
>>
>> Additionally, the Springer ebook MARC records do not come with any
>> custom link text. We added custom link text by adding an 856$y
>> subfield with the text "Available for our library via Springer eBooks.
>> Click here for access". The custom link text is often provided in an
>> 856$w subfield rather than $y, but the two seem to display exactly the
>> same in our catalog, so if you get other vendor-supplied records with
>> 856$w fields, you don't need to change them unless you want to
>> customize the link text.
>>
>> We also add a 952 field to specify the branchcode, ITYPE, shelving
>> location and CCODE:
>> \\$aMAN$bMAN$cEBOOKS$oELECTRONIC BOOKS$yEBOOKS$8EBOOKS
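>>
>> If you're not using MarcEdit, the same MARCXML round trip can add a
>> constant field like this 952. A rough sketch only, with example filenames:
>>
>>     # convert to MARCXML, append the 952 item field to every record,
>>     # then convert back to binary MARC
>>     yaz-marcdump -o marcxml springer.mrc > springer.xml
>>     sed -i 's|</record>|<datafield tag="952" ind1=" " ind2=" "><subfield code="a">MAN</subfield><subfield code="b">MAN</subfield><subfield code="c">EBOOKS</subfield><subfield code="o">ELECTRONIC BOOKS</subfield><subfield code="y">EBOOKS</subfield><subfield code="8">EBOOKS</subfield></datafield></record>|g' springer.xml
>>     yaz-marcdump -i marcxml -o marc springer.xml > springer-items.mrc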
>>
>> The branch code needed to be added because without it, each ebook
>> appeared in Koha with messages about availability that we did not want
>> to see. Adding the ITYPE allowed us to add a "View this eBook" link
>> to each summary record, which is a nice convenience for end-users.
>>
>> We also add a 500$a field listing the ebook service name (Springer
>> eBooks), because that's the standard way we've been listing all our
>> other ebook packages (not all of the vendors put the package name
>> anywhere in the record), and a 099$a field with the local call number
>> ELECTRONIC BOOKS, because that's the way our other ebooks have always
>> been listed.
>>
>> You may not need or want this level of customization, but it's nice to
>> know it's available.
>>
>> When I first tried to upload the Springer records, I could not
>> upload them in one step. Our catalog is hosted by LibLime and at the
>> time I processed these records (a couple of months ago), the server
>> response time was very slow and it would time out for large uploads.
>> This was the only batch of MARC records I had that was quite so large,
>> and I had to split the 17,000 records into batches of about 1500
>> records. I don't know if there is any effective limit on file size
>> for uploads in the official version of Koha.
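>>
>> If you need to split a large file like that, one option (assuming the YAZ
>> tools are installed and your yaz-marcdump has the -s/-C split options) is:
>>
>>     # write files named with the given prefix plus a sequence number,
>>     # 1500 records apiece, discarding the normal printed output
>>     yaz-marcdump -s springer_chunk_ -C 1500 springer.mrc > /dev/null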
>>
>> The processing of the initial file of the large Springer ebook record
>> set was tedious but routine. However, Springer issues updated record
>> sets for books they add to your subscription each month, and I am
>> currently trying to work through some problems in the December update
>> file. The monthly updates come in separate files for each publication
>> year. We get five years of books, and rather than having to process
>> them separately, I used MarcJoin in MarcEdit to concatenate them into
>> one large file. This did not work, because it turned out there were
>> some invalid characters in two of the records that caused errors in
>> the joined file. I'm still working through that. The individual
>> files validate correctly using the validator in MarcEdit, so I need
>> to find a 'pickier' MARC validator to find the invalid characters.
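>>
>> One pickier check I may try (this is only an idea, not something I have
>> finished testing on these files, and the filename is just an example):
>> convert the file to MARCXML and let an XML parser flag the bad characters,
>> since control characters that are illegal in XML get reported with a line
>> number:
>>
>>     yaz-marcdump -o marcxml december_update.mrc > december_update.xml
>>     xmllint --noout december_update.xml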
>>
>> One more thing about ebook MARC records generally: I find that the
>> quality of publisher-supplied MARC records is very variable. One
>> problem we ran into with the MARC record sets for netLibrary and a
>> couple of other vendors is that some of the records included multiple
>> 856$u fields with URLs leading to other subscription-based online
>> services to which we have no access. I had to find those bad URLs
>> and strip them out of our records, and that was a little tedious.
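>>
>> For anyone facing the same cleanup, one way to find those stray URLs before
>> editing (assuming the YAZ tools are installed; the filename is just an
>> example) is to dump the file in line format and pull out the 856 links:
>>
>>     # list every URL in an 856 field, grouped by how often it appears,
>>     # so the hosts you don't subscribe to stand out
>>     yaz-marcdump netlibrary.mrc | grep '^856' | grep -o 'http://[^ ]*' | sort | uniq -c | sort -rn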
>> --
>> Stacy Pober
>> Information Alchemist
>> Riverdale, NY 10471
>> stacy.pober at manhattan.edu
>>
>
>