So I've cobbled together a server for my primary school library project. It's an AMD Athlon 700 MHz chip on an Asus motherboard, 20 GB Seagate HDD, a 10/100 card, 128 MB SD-RAM (which I'll double before hooking it up to the network) all wrapped up in a cheap Koala ATX case. Some donated, some bought new.

I'm almost done with installing Linux (doing the kernel compile as I write this). So the next stage is upon me: getting the library data out of Catalist and into Koha.

My first question is, what's the best way to export the Catalist database? Bear in mind I'm not a librarian (I grok a fair bit of database-speak though). I'm assuming this involves MARC in some way, but I'm going to need some details on how to proceed.

-- Cheers, Rich.
On Thu, Sep 25, 2003 at 01:45:16PM +1200, Rich Churcher said:
So I've cobbled together a server for my primary school library project. It's an AMD Athlon 700 MHz chip on an Asus motherboard, 20 GB Seagate HDD, a 10/100 card, 128 MB SD-RAM (which I'll double before hooking it up to the network) all wrapped up in a cheap Koala ATX case. Some donated, some bought new.
That should run nicely.
I'm almost done with installing Linux (doing the kernel compile as I write this). So the next stage is upon me: getting the library data out of Catalist and into Koha.
My first question is, what's the best way to export the Catalist database? Bear in mind I'm not a librarian (I grok a fair bit of database-speak though). I'm assuming this involves MARC in some way, but I'm going to need some details on how to proceed.
I'll let Olwen answer here; she did the export of data from Catalist for HLT, so I'm sure she can give you a good answer.

Chris

-- Chris Cormack Programmer 027 4500 789 Katipo Communications Ltd chris@katipo.co.nz www.katipo.co.nz
I've emailed Rich about this. I have the routines used for Horowhenua. I saw my A/Rev backup just a few days ago. But I need comments from the gurus about how much the database layout has changed with Koha 2.0.

Chris Cormack wrote:
On Thu, Sep 25, 2003 at 01:45:16PM +1200, Rich Churcher said:
So I've cobbled together a server for my primary school library project. It's an AMD Athlon 700 MHz chip on an Asus motherboard, 20 GB Seagate HDD, a 10/100 card, 128 MB SD-RAM (which I'll double before hooking it up to the network) all wrapped up in a cheap Koala ATX case. Some donated, some bought new.
That should run nicely.
I'm almost done with installing Linux (doing the kernel compile as I write this). So the next stage is upon me: getting the library data out of Catalist and into Koha.
My first question is, what's the best way to export the Catalist database? Bear in mind I'm not a librarian (I grok a fair bit of database-speak though). I'm assuming this involves MARC in some way, but I'm going to need some details on how to proceed.
I'll let Olwen answer here; she did the export of data from Catalist for HLT, so I'm sure she can give you a good answer.
Chris
-- Olwen Williams See my B&B site http://www.bandbclub.com and my new site http://www.handyman.co.nz - A virtual shed for real kiwi blokes.
On Thu, Sep 25, 2003 at 10:16:42PM +1200, Olwen Williams said:
I've emailed Rich about this. I have the routines used for Horowhenua. I saw my A/Rev backup just a few days ago.
But I need comments from the gurus about how much the database layout has changed with Koha 2.0.
Hi Olwen

Well, a big chunk of the database is the same as when you and I first built it :) Issues, accountlines, aqorders, borrowers etc. are all the same. In fact items, biblioitems and biblios are still the same too. What has happened is there are now a pile of MARC tables as well.

I'm not sure if there is a nice way to convert the data in the koha biblio table to the MARC tables, but that's essentially what we need. (It's what we need for the script to upgrade from 1.2.3 to 2.0 as well.) Paul will hopefully have an answer for this; if such a script exists, you could then just populate the koha tables and run the script to populate the MARC tables. This is just for the bibliographic data; all the other data is the same format.

Hope this helps

Chris

-- Chris Cormack Programmer 027 4500 789 Katipo Communications Ltd chris@katipo.co.nz www.katipo.co.nz
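For illustration, a minimal sketch of the biblio-to-MARC conversion Chris describes: take a row from the (unchanged) koha biblio table and emit the equivalent MARC fields. The column names, sample values, and tag choices below are assumptions for the sake of the example, not the actual Koha 2.0 schema or upgrade script.

```python
# A biblio row as it might come out of the old Koha tables
# (column names and values are illustrative assumptions).
sample_biblio = {
    "biblionumber": 42,
    "author": "Mahy, Margaret",
    "title": "The Lion in the Meadow",
    "copyrightdate": "1969",
}

# Minimal mapping from biblio columns to MARC tag/subfield pairs.
# Real mappings are larger and configurable; these tags are just
# the conventional author/title/date fields.
BIBLIO_TO_MARC = {
    "author": ("100", "a"),
    "title": ("245", "a"),
    "copyrightdate": ("260", "c"),
}

def biblio_to_marc_fields(row):
    """Return a sorted list of (tag, subfield, value) triples for one row."""
    fields = []
    for column, (tag, subfield) in BIBLIO_TO_MARC.items():
        value = row.get(column)
        if value:  # skip empty or missing columns
            fields.append((tag, subfield, str(value)))
    return sorted(fields)  # MARC records keep fields in tag order

for tag, sub, value in biblio_to_marc_fields(sample_biblio):
    print(f"{tag} ${sub} {value}")
```

A real converter would then write these triples into the new MARC tables (or build records with a MARC library), but the core job is exactly this column-to-tag mapping.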
The box is now closed and purring away quietly under my desk. It ended up with a 700 MHz Athlon and 380 MB SD-RAM, which should be ample to deal with the small library site plus some minor networking jobs.

Versions:
- Gentoo Linux 1.4 (2.4.20-gentoo-r7 kernel)
- MySQL 4.0.13-r4
- Apache 2.0.47
- Koha 2.0.0pre4

I'll do the Koha install now and see what happens. -- Cheers, Rich.
On 2003-09-26 05:14:20 +0100 Rich Churcher <rmch@xtra.co.nz> wrote:
MySQL 4.0.13-r4 Apache 2.0.47
Do we work with MySQL 4? I've not tried, myself. Remember to tune Apache 2 as the docs on http://httpd.apache.org/ describe.
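For reference, the Apache 2 tuning MJ mentions is mostly a matter of sizing the prefork MPM directives in httpd.conf to the machine. The numbers below are illustrative placeholders, not recommendations; see the httpd.apache.org docs for how to size them.

```apache
# Illustrative Apache 2.0 prefork tuning (httpd.conf).
# Values are placeholders to be sized to the box's RAM and load.
<IfModule prefork.c>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients           50
    MaxRequestsPerChild 1000
</IfModule>
```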
I'll do the Koha install now and see what happens.
Apologies for the installer brokenness. The worst is the z3950 bug. Searching bugs.koha.org may help. -- MJR/slef My Opinion Only and possibly not of any group I know. http://mjr.towers.org.uk/ gopher://g.towers.org.uk/ slef@jabber.at Creative copyleft computing services via http://www.ttllp.co.uk/
On Fri, Sep 26, 2003 at 10:35:08AM +0100, MJ Ray said:
On 2003-09-26 05:14:20 +0100 Rich Churcher <rmch@xtra.co.nz> wrote:
MySQL 4.0.13-r4 Apache 2.0.47
Do we work with MySQL 4? I've not tried, myself. Remember to tune Apache 2 as the docs on http://httpd.apache.org/ describe.
MySQL 4 works a treat :) HLT is running MySQL 4; the master/slave replication is quite a bit nicer than in 3. I'm still using MyISAM tables though, I haven't done any testing with InnoDB yet.

Chris

-- Chris Cormack Programmer 027 4500 789 Katipo Communications Ltd chris@katipo.co.nz www.katipo.co.nz
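For anyone curious what the master/slave setup Chris mentions involves, MySQL 4 replication is configured in my.cnf on each box. The server IDs, hostnames, and credentials below are placeholders, not HLT's actual setup.

```ini
; Illustrative my.cnf fragments for MySQL 4 master/slave replication.
; All names and values are placeholders.

; --- on the master ---
[mysqld]
server-id = 1
log-bin             ; enable the binary log the slave replays

; --- on the slave ---
[mysqld]
server-id       = 2
master-host     = master.example.org
master-user     = repl
master-password = secret
```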
participants (4)
- Chris Cormack
- MJ Ray
- Olwen Williams
- Rich Churcher