On Wed, Feb 22, 2006 at 08:56:53PM -0600, Steven F. Baljkas wrote:
> That said, Christine, you may need to revisit the call numbers used, because Koha is still a touch fussy: it was originally designed for a specific type of call number (Dewey), and I'd bet that Christine will be making use of unique LC numbers.

Koha is able to support unique LC numbers so long as your MARC links are set up correctly.
> Third: Joshua, I am sorry I keep having to repeat this, but the simple fact is that Koha doesn't properly use MARC records, and so it is prudent -- and in fact, at least in my legal jurisdiction and probably in yours, too, Christine, since it would fall under the legal category of **due diligence** -- to protect the data that you have already harvested AT COST against loss (OCLC or not, your cataloguing department presumably isn't run by volunteers, directed by volunteers, on donated computers with free power from the electric company). In the past, I have worked at libraries where we did double back-ups of both our raw retrieved records and our system-specific edited records (call it paranoid if you want, but it can help prevent disasters).

First off, I'm all for being paranoid. By all means, make backups. However, I must re-emphasize one fact: there is a major difference between 'properly using' MARC records and 'properly storing' MARC records. It's true that Koha does not _use_ MARC records to the fullest extent possible ... see below ... however, it's misleading to imply that Koha can't _store_ MARC properly.
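The store/use distinction can be illustrated in a few lines. This is a hypothetical sketch, not Koha code, and the record bytes are a truncated stand-in: 'storing' MARC properly means the raw ISO 2709 bytes survive a round trip untouched, while 'using' MARC means interpreting their structure (here, just reading the Leader's record-length field in character positions 00-04).

```python
# Hypothetical sketch of the store/use distinction. Not Koha code; the
# 'raw' value is a truncated stand-in, not a complete valid MARC record.
raw = b"00038nam a2200000 a 4500\x1e\x1d"

# Storing: keep the blob verbatim. What goes in comes back out,
# with no interpretation of the record's semantics required.
stored = bytes(raw)
assert stored == raw

# Using: interpret the structure -- Leader positions 00-04 hold the
# record length as five ASCII digits.
record_length = int(raw[0:5])
print(record_length)  # -> 38
```

A system can do the first perfectly while doing very little of the second; the two claims are independent, which is the point being made above.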
> Fourth: yes, Virginia, there is a Santa Claus, but he doesn't give stuff away for free.
> This is my smarmy way of saying that, yes, there are ILS that actually do use MARC completely, Joshua. Think Voyager, for one. There are others. Of course, they are not free, and they do not respond to users' requests/demands/pleas for improvements/corrections/customisations at nearly the rate Koha attempts.
This is simply not true: no ILS fully uses MARC; they all use a subset of it (which is usually how standards are adopted -- think SQL, POSIX, W3C, and even Z39.50). If you've ever attempted to program something to a standard, you'll know why most standards are never fully realized :) Let me cite just one example. Standard MARC requires the following of the 245 tag: if $f, $g, $h, or $k follow $b, they refer to the subtitle; if they follow $a, they refer to the title. No ILS that I've ever seen handles this in its indexes. As a friend once remarked: if the sun shines in Ecuador, then 245z equals title in Swahili. Many bits of semantic information are lost the moment you import your records into an ILS, because the ILS doesn't have any reasonable way to present such a vast semantic domain to humans in a way that ends up improving our use of the system. Of course, there is also the issue of what you mean by MARC. Are you referring to MARC syntax, MARC data elements, AACR? Another topic, I suppose, though feel free to clarify...
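That positional rule is easy to state but awkward to index, which is why it gets dropped. A minimal sketch of what honoring it would require (a hypothetical illustration, not code from any ILS; the example subfield values are made up):

```python
# Hypothetical sketch: attributing 245 subfields $f $g $h $k by position.
# Per the MARC 21 rule cited above, those subfields belong to the subtitle
# if they follow $b, and to the title proper if they follow $a.
def classify_245(subfields):
    """subfields: ordered (code, value) pairs from a 245 field.
    Returns values attributed to 'title' vs. 'subtitle'."""
    out = {"title": [], "subtitle": []}
    context = "title"  # until a $b appears, position-dependent codes attach to the title
    for code, value in subfields:
        if code == "a":
            context = "title"
            out["title"].append(value)
        elif code == "b":
            context = "subtitle"
            out["subtitle"].append(value)
        elif code in ("f", "g", "h", "k"):
            out[context].append(value)  # attribution depends on what came before
    return out

example = [("a", "Hamlet"), ("h", "[sound recording] :"),
           ("b", "a tragedy"), ("f", "1600-1601")]
print(classify_245(example))
```

Note that an indexer would have to carry this ordering context through to its index structures; flattening the field into a bag of subfields, as most systems do, is exactly where the semantics are lost.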
> But, PLEASE, PLEASE, PLEASE, Joshua et alia, don't get into the habit of excusing Koha's current deficiencies in that regard by assuming that nothing else gets it completely right either!
> Joshua, I know that attitude isn't typical of what you have expressed in the past; it certainly isn't on the progressive path that people need to be on if problems are ever to be fixed and the software improved for everyone.

Of course, my goal as release manager for 3.0 is to improve Koha's use of MARC. That's the whole reason we're switching to Zebra. I am aware of our current limitations with regard to handling, for instance, Standard MARC Holdings, and I fully intend to fix those limitations so that Koha will be a strong candidate for even the most MARC-centric libraries.
> (And for the record, yes, I realise it is easy for us who were trained to catalogue to spot and harp on Koha's problems with cataloguing. Similarly, yes, I realise that I and most cataloguing-oriented folk have not contributed and likely will never contribute anything concrete towards resolving the problems we can detect, other than pointing them out to the talented programmers and developers who (may) have the skill set to solve the problems. And as a final concession/confession, yes, it is true that most ILS have quirks with their cataloguing components and how they handle MARC, and that many don't use MARC completely/completely correctly: the fact that Koha has quirks isn't the problem; it is a problem that it doesn't completely handle MARC by the MINIMUM STANDARD rules yet -- because you have been so helpful to me, Joshua, I could clarify what that entails, but I can guarantee you would not find it pretty.)
It's worth drawing yet another distinction, between Koha's ability to _catalog_ Standard MARC and Koha's ability to _store_ Standard MARC. Koha can store Standard MARC fine. And given the proper framework, it can catalog Standard MARC as well. If you don't believe me, take a look at the proof-of-concept here: http://koha.liblime.com/cgi-bin/koha/acqui.simple/addbiblio.pl That demo handles a Leader and Directory, 003, 005, 006, 007, 008, and to my knowledge can catalog a BOOK material designation as per the MARC standard. If you find otherwise, please let me know. Also, I should point out that I don't know of any other MARC editor that provides such a great interface for selecting possible values for the fixed fields ... The problem is, none of our clients want to catalog using 'Standard MARC', so no one has bothered to write a complete set of frameworks to accommodate that. Also, no professional cataloger that I know of has taken the time to customize his/her own framework to support the subset of the standard that he/she needs.
> Fifth: I have yet to see any example of Koha-exported MARC, so I cannot say definitively, but I would suggest that any system that does the violence of separating the record into multiple levels may not be capable of reintegrating it into valid MARC. I would have to see actual samples to know. I have yet to receive any takers on that (and I have requested Koha-ised MARC exports in the past).
I'd be happy to supply some MARC records exported from Koha ... you could of course export some yourself from the LibLime demo: http://koha.liblime.com/cgi-bin/koha/export/marc.pl
> To begin with the obvious, though, Joshua, exactly how do you propose that Koha re-encodes the Directory portion (the series of numerals coding tag occurrence and field length, immediately following the Leader), when it currently ignores all the control fields? Any changes a Koha library made to a record would be lost; any attempt to export that record would result in invalid MARC, as the Directory, even if kept somewhere, would match the original MARC, not the edited version.

Again, you're missing the point. Koha doesn't need to 'use' the MARC record to properly store it and export it, just as a filesystem directory doesn't need to 'use' MARC to store a MARC file. The field lengths in the Directory portion are managed by the MARC::Record module, which rebuilds the Directory when the record is serialized, so an edited record gets a fresh, matching Directory on export.
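The key point is that the Directory is derived data, recomputed from the variable fields at serialization time. A minimal sketch of that computation (hypothetical and simplified; a serializer such as MARC::Record does this internally, and the field contents below are made up):

```python
# Hypothetical sketch of Directory construction on MARC export.
# Each Directory entry is 12 characters: 3-char tag, 4-digit field
# length in octets (including the 0x1E field terminator), and a
# 5-digit starting offset into the data portion of the record.
def build_directory(fields):
    """fields: list of (tag, data) pairs; data is the field body,
    including its trailing field terminator (0x1E).
    Returns (directory_string, total_data_length)."""
    directory = ""
    offset = 0
    for tag, data in fields:
        length = len(data.encode("utf-8"))
        directory += f"{tag}{length:04d}{offset:05d}"
        offset += length
    return directory, offset

# \x1f is the subfield delimiter, \x1e the field terminator.
fields = [("245", "10\x1faHamlet\x1e"),
          ("100", "1 \x1faShakespeare, William\x1e")]
directory, total = build_directory(fields)
print(directory)  # -> 245001100000100002500011
```

Because every entry is rebuilt from the current field data, nothing about the originally imported Directory needs to survive editing; lengths and offsets simply come out right for whatever the fields now contain.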
> Sixth: yes, the majority of problems Koha-ites complain of/question on the list-serv do indeed seem to be, as Joshua suggests, occurrences of improper or incomplete configuration by the new would-be Koha library.
> That admitted, Joshua, you, like most of the programming-savvy, tend to forget that most of us actually working in library science/library tech are used to turn-key systems that require a great deal less set-up. (Take Voyager circulation config for one small example: it can autofill the circulation fields, saving a lot of time Koha wastes requiring cut and paste.)

Absolutely, which is why some of us have formed support companies to help out with that process. (BTW: autofill circulation fields -- I'm curious, could you expand on that?)
> That is not a complaint, in and of itself, just an observation of differences in experience that are apt to cause friction unless taken into account.

I think it's a healthy conversation. I'm certainly not going to claim that I've got as much direct knowledge about the ins and outs of the MARC standard as you; I know enough to build some tools that can manage MARC correctly, though, and I can read the free online specifications. Certainly, if you are willing to lend help in defining how Koha should 'use' MARC in ways that it doesn't currently, I'm all ears.
> (I still think it would be ideal, one day in the distant future, to get Koha to a level where it would behave like a more typical turn-key ILS: that said, I do like the fact that Koha reminds us about all that goes into making the system work and what configuration really requires.)

Ditto.
I truly hope we can get a good thread going here ... I think it would be good for the project if we addressed these issues now, as we're designing the next major version of Koha.

Sincerely,

--
Joshua Ferraro               VENDOR SERVICES FOR OPEN-SOURCE SOFTWARE
President,                   Technology migration, training, maintenance, support
LibLime                      Featuring Koha Open-Source ILS
jmf@liblime.com |Full Demos at http://liblime.com/koha |1(888)KohaILS