Very small library, low-end hardware
Hi all. I'm peripherally involved with a local primary school (I host their website, manage email, help out with tech support, etc.). I'm told that they pay something of the order of $400 per year for Catalist to manage the collection in their one-room library. This strikes me as excessive. I'm investigating the possibility of installing Linux on one of their rather old Pentium boxen and running Koha. I'm quite comfortable with the Linux admin side of things, and with SQL, but know absolutely nothing of library cataloguing systems. I have some questions which I hope developers or experienced users might answer for me:

1. What is the absolute, barest minimum hardware you would consider installing Koha on, given that the database is not likely to be very large?
2. Will most older barcode readers work with Linux + Koha?
3. How difficult is the conversion going to be from Catalist to Koha? Is there any kind of automated conversion (possibly via MARC), or would they be stuck re-scanning everything in their library?
4. Are there any other primary schools in New Zealand running Koha who might be able to comment on the experience?

What I've seen so far is very promising, but I'd like a little more info before recommending the system to a school which is not exactly overflowing with disposable income ;o) Hard to argue with the price though. -- Cheers, Rich.
Rich Churcher wrote:
1. What is the absolute, barest minimum hardware you would consider installing Koha on, given that the database is not likely to be very large?
I've installed Koha on a Celeron 600 with 192MB RAM & 20GB HD (server & client). The librarian finds it too slow; I think it's a problem with Perl compilation. I tried on a 350MHz PC (server only). It's unusable: 5 seconds for every page, minimum.
2. Will most older barcode readers work with Linux + Koha?
Yes, provided your barcode reader behaves like a keyboard (i.e. a keyboard-wedge reader).
3. How difficult is the conversion going to be from Catalist to Koha? Is there any kind of automated conversion (possibly via MARC), or would they be stuck re-scanning everything in their library?
Via MARC and bulkmarcimport you should be able to migrate your data.
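As a hedged illustration of the MARC route Paul mentions: the exported Catalist fields would need to end up as ISO 2709 (MARC) records before Koha's bulk importer can read them. The sketch below builds one such record by hand in Python; the tags and values are purely illustrative, not a real Catalist export, and real migrations would normally use an existing MARC library rather than hand-rolling the format.

```python
# Minimal sketch of an ISO 2709 (MARC) record built from scratch.
# Field values here are made up for illustration.
FT = b"\x1e"  # field terminator
RT = b"\x1d"  # record terminator

def make_marc(fields):
    """fields: list of (tag, data) pairs; for data fields, `data`
    already includes the two indicators and \\x1f subfield delimiters."""
    directory = b""
    body = b""
    pos = 0
    for tag, data in fields:
        raw = data.encode("utf-8") + FT
        # 12-byte directory entry: 3-char tag, 4-digit length, 5-digit offset
        directory += ("%s%04d%05d" % (tag, len(raw), pos)).encode()
        pos += len(raw)
        body += raw
    base = 24 + len(directory) + 1          # leader + directory + terminator
    length = base + len(body) + 1           # + body + record terminator
    # 24-byte leader: record length, status/type, base address, map
    leader = ("%05dnam  22%05d   4500" % (length, base)).encode()
    return leader + directory + FT + body + RT

rec = make_marc([
    ("245", "10\x1faThe Very Hungry Caterpillar"),
    ("100", "1 \x1faCarle, Eric"),
])
```

A file of such records concatenated together is the kind of input a bulk MARC import script expects; the exact invocation depends on the Koha version installed.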
4. Are there any other primary schools in New Zealand running Koha who might be able to comment on the experience?
I dunno. -- Paul POULAIN Independent free-software consultant, French-language coordinator for Koha (a free ILS: http://www.koha-fr.org)
On 2003-09-03 10:46:06 +0100 paul POULAIN <paul.poulain@free.fr> wrote:
I've installed Koha on a Celeron 600 with 192MB RAM & 20GB HD (server & client). The librarian finds it too slow; I think it's a problem with Perl compilation.
In the longer term, there are some tricks we can do to speed this up (such as CGI::SpeedyCGI), but they will require very clean perl code and a bug hunt. I think we've enough on for now with 2.0. That said, the machine above should be fast enough. My development system is only a K6/2 500. Not so long ago I was running really heavy perl on a 486... Perhaps some performance tuning would help? I think there are guides on http://httpd.apache.org/docs/ and http://www.mysql.com/ in addition to the normal distribution guides. A small, lean, server-orientated distribution would be a better choice than Mandrake, for example, or maybe you should try a distribution compiled from source with tuning options. You can compensate for hardware a bit by spending more time on configuration, but I still wouldn't try current koha on less than a Pentium MMX. Hope that helps, -- MJR/slef My Opinion Only and possibly not of any group I know. http://mjr.towers.org.uk/ gopher://g.towers.org.uk/ slef@jabber.at Creative copyleft computing services via http://www.ttllp.co.uk/
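To give a rough flavour of the kind of tuning MJ is gesturing at, an Apache 1.3 httpd.conf for a low-memory box might cap the process pool and skip per-request DNS work. These directives are standard Apache 1.3, but the numbers are guesses to experiment with, not tested recommendations:

```apache
# Hypothetical httpd.conf fragment for a low-RAM Koha box (Apache 1.3).
# Values are illustrative starting points only.
HostnameLookups Off        # skip reverse DNS on every request
KeepAlive On
KeepAliveTimeout 5         # free up workers quickly
StartServers 2
MinSpareServers 1
MaxSpareServers 3
MaxClients 10              # hard cap so a small box never swaps
MaxRequestsPerChild 200    # recycle children to curb memory creep
```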
MJ Ray wrote:
On 2003-09-03 10:46:06 +0100 paul POULAIN <paul.poulain@free.fr> wrote:
I've installed Koha on a Celeron 600 with 192MB RAM & 20GB HD (server & client). The librarian finds it too slow; I think it's a problem with Perl compilation.
In the longer term, there are some tricks we can do to speed this up (such as CGI::SpeedyCGI), but they will require very clean perl code and a bug hunt. I think we've enough on for now with 2.0.
That said, the machine above should be fast enough. My development system is only a K6/2 500. Not so long ago I was running really heavy perl on a 486... Perhaps some performance tuning would help? I think there are guides on http://httpd.apache.org/docs/ and http://www.mysql.com/ in addition to the normal distribution guides. A small, lean, server-orientated distribution would be a better choice than Mandrake, for example, or maybe you should try a distribution compiled from source with tuning options. You can compensate for hardware a bit by spending more time on configuration, but I still wouldn't try current koha on less than a Pentium MMX.
Mandrake is OK for a server. You just need a little tuning in /etc/my.cnf. BUT, maybe the problem comes from MDK 9.1, which is the system I installed. It ships with Apache 2.0, NOT 1.3. I've already had some problems with 2.0; maybe it's slow on "old" PCs? Any ideas, anyone? -- Paul POULAIN Independent free-software consultant, French-language coordinator for Koha (a free ILS: http://www.koha-fr.org)
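The my.cnf tuning Paul alludes to might look something like the fragment below for a MySQL 3.23/4.0-era server. These are generic options of that vintage, not Paul's actual settings, and the values are guesses to be adjusted against the available RAM; skip-networking assumes Koha and MySQL share the same host and talk over the local socket.

```ini
# Hypothetical /etc/my.cnf fragment for a small Koha server.
[mysqld]
skip-networking          # local socket only; drop if clients connect over TCP
key_buffer = 8M          # MyISAM index cache; the main knob on a small box
table_cache = 64
sort_buffer = 512K
thread_cache = 4         # reuse threads instead of spawning per connection
```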
On 2003-09-03 16:52:33 +0100 paul POULAIN <paul.poulain@free.fr> wrote:
Mandrake is OK for a server. You just need a little tuning in /etc/my.cnf.
...and to stop it from putting on any GUI etc. [...]
I've already had some problems with 2.0; maybe it's slow on "old" PCs? Any ideas, anyone?
No idea. Others have mentioned some problems, but koha 2 is running fine on the 500MHz system with Apache 2 (and Debian GNU/Linux) here. -- MJR/slef My Opinion Only and possibly not of any group I know. http://mjr.towers.org.uk/ gopher://g.towers.org.uk/ slef@jabber.at Creative copyleft computing services via http://www.ttllp.co.uk/
paul POULAIN <paul.poulain@free.fr> writes: > Rich Churcher wrote: >> 1. What is the absolute, barest minimum hardware you would >> consider installing Koha on, given that the database is not >> likely to be very large? >> > I've installed Koha on a Celeron 600 with 192MB RAM & 20GB HD > (server & client). The librarian finds it too slow; I think it's a > problem with Perl compilation. I tried on a 350MHz PC (server > only). It's unusable: 5 seconds for every page, minimum. My testing box at home is a 266MHz AMD K6-2 with 160MB RAM. While I wouldn't exactly call it speedy, it serves up most pages in less than a second with a database of 14,000+ entries to search through. In console mode using a text-only browser, it's fast enough to use for real work. But I agree that Perl seems to be the slowest link in the query chain. I've played around some with writing gawk scripts to query the MySQL database, and they seem to return results much faster than Perl does. I've no idea why, though. -- Larry Stamm, Chair McBride and District Public Library McBride, BC V0J 2E0 Canada http://www.mcbridebc.org/library
On Wed, Sep 03, 2003 at 08:54:39PM -0700, Larry Stamm said:
paul POULAIN <paul.poulain@free.fr> writes:
> Rich Churcher wrote: >> 1. What is the absolute, barest minimum hardware you would >> consider installing Koha on, given that the database is not >> likely to be very large? >> > I've installed Koha on a Celeron 600 with 192MB RAM & 20GB HD > (server & client). The librarian finds it too slow; I think it's > a problem with Perl compilation. I tried on a 350MHz PC (server > only). It's unusable: 5 seconds for every page, minimum.
My testing box at home is a 266MHz AMD K6-2 with 160MB RAM. While I wouldn't exactly call it speedy, it serves up most pages in less than a second with a database of 14,000+ entries to search through.
In console mode using a text-only browser, it's fast enough to use for real work.
But I agree that Perl seems to be the slowest link in the query chain. I've played around some with writing gawk scripts to query the MySQL database, and they seem to return results much faster than Perl does. I've no idea why, though.
Yep, I think to get some real speed increases, we'd need to look into mod_perl and Apache::DBI to get some connection pooling. I suspect that starting up perl, and making database connections are 2 of the bottlenecks. But once we get 2.0 out, we can do some work with the profiler tools and spot any bottlenecks and fix them. Chris -- Chris Cormack Programmer 027 4500 789 Katipo Communications Ltd chris@katipo.co.nz www.katipo.co.nz
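A minimal sketch of the Apache::DBI setup Chris mentions, for mod_perl 1.x: the key detail is that Apache::DBI must be loaded before DBI, so it can intercept DBI->connect and cache one connection per Apache child. The /cgi-bin/koha path is illustrative, not Koha's actual layout:

```apache
# Hypothetical mod_perl 1.x httpd.conf fragment.
# Apache::DBI must load before DBI so its connection cache takes effect.
PerlModule Apache::DBI
PerlModule DBI
<Location /cgi-bin/koha>
    SetHandler perl-script
    PerlHandler Apache::Registry   # run CGI-style scripts persistently
    Options +ExecCGI
</Location>
```

With this in place, unmodified scripts keep calling DBI->connect as before; repeat calls with the same connection parameters are served from the per-process cache.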
On 2003-09-04 05:08:58 +0100 Chris Cormack <chris@katipo.co.nz> wrote:
Yep, I think to get some real speed increases, we'd need to look into mod_perl and Apache::DBI to get some connection pooling.
I'd rather weaken our Apache dependency, not strengthen it. CGI::SpeedyCGI should help with that. But yes, I agree that we need profiling after 2.0. -- MJR/slef My Opinion Only and possibly not of any group I know. http://mjr.towers.org.uk/ gopher://g.towers.org.uk/ slef@jabber.at Creative copyleft computing services via http://www.ttllp.co.uk/
On Thu, 4 Sep 2003, Chris Cormack wrote:
Yep, I think to get some real speed increases, we'd need to look into mod_perl and Apache::DBI to get some connection pooling.
I plan on doing this after 2.0 comes out if no one else beats me to it. CGI is great for development, but crapola for serious deployment. :) mod_perl rocks, and they've got the 'beta' mod_perl for Apache 2.0 thoroughly bug-fixed at this point. 2.0 offers whole new avenues for tweaking things to improve performance. "Maybe a multi-threaded Apache would make Intel hyperthreading useful under Linux, let's try...."
I suspect that starting up perl, and making database connections are 2 of the bottlenecks.
Oh yes. Even MySQL takes a significant time to set up a database connection compared to the rest of the Perl script. Sadly that's nothing compared to the time required to connect to a heavyweight database like Oracle or DB2. I actually had a really scary script come across my desk a year ago, written by a very smart scientist (sigh). The script had a database connection to a server three states away inside an inner loop that ran 50 times for every script invocation. Apparently they needed a consultant to explain to them that the script as written was going to take 45 secs per CGI invocation, that it couldn't run in mod_perl because the Perl was so very ugly, and, oh by the way, pull this out of the inner loop and things will get much quicker. Despite that, in looking at profiling data throughout that process to see where the script was really sucking wind, the SQL connection portion of things and module loading were the real crunchers. Loading DBI and doing a database connection on a P90 isn't too painful. On older hardware the slow I/O tends to make Perl seem slower because it pulls in so many different files and processes them when it starts up. That still doesn't mean the old hardware is useless for Perl/Koha, but that a different strategy might be required - like a Perl daemon accepting telnets.
But once we get 2.0 out, we can do some work with the profiler tools and spot any bottlenecks and fix them.
Profiling is pretty easy to do and can be extremely helpful. It may or may not surprise you with its answers, but it's so nice to see and have real numbers. -- </chris> The death of democracy is not likely to be an assassination from ambush. It will be a slow extinction from apathy, indifference, and undernourishment. -Robert Maynard Hutchins, educator (1899-1977)
On Fri, 2003-09-05 at 01:04, Christopher Hicks wrote:
I suspect that starting up perl, and making database connections are 2 of the bottlenecks.
Oh yes. Even MySQL takes a significant time to setup a database connection compared to the rest of the perl script.
Remembering I'm coming in cold on this, but is there any benefit to be had from using persistent db connections? Just curious. Regarding the Apache dependency issue, it seems that one might be able to achieve a loose coupling to the server (i.e. have mod_perl as an option, rather than a requirement). Though I suppose that splits the development effort somewhat. Once again, an entirely uninformed comment! I've been writing to a few people describing the sort of hardware we need for the school project; no nibbles yet. Phil, any and all help gratefully received; maybe once I scrape a server together we can swap notes on the setup. Olwen, I'd actually read that obituary before; I played some chess for a while there and it came up on a web search. Your father sounds like an interesting man, an 'enthusiast'. My nieces and nephews went to Corstorphine school, and my brother looks after their small Windows LAN, so there's a bit of a family connection for me too. -- Cheers, Rich.
Librarians without a technical bent are probably already ignoring all of my e-mails anyway, but if you're still around, feel free to skip the rest of this one and stock your library with Perl and O'Reilly books! :) On 6 Sep 2003, Rich Churcher wrote:
On Fri, 2003-09-05 at 01:04, Christopher Hicks wrote:
I suspect that starting up perl, and making database connections are 2 of the bottlenecks.
Oh yes. Even MySQL takes a significant time to setup a database connection compared to the rest of the perl script.
Remembering I'm coming in cold on this, but is there any benefit to be had from using persistent db connections? Just curious.
There are several clear benefits in doing so. Primarily, it allows the database server and the web server to spend their time doing "real" work instead of continuously re-establishing connections. In practice this is so helpful that there's a standard official mod_perl module to do it - Apache::DBI. Numerous expensive big-name products and a few other open source projects provide similar functionality for other operating environments.
Regarding the Apache dependency issue, it seems that one might be able to achieve a loose coupling to the server (ie. have mod_perl as an option, rather than a requirement).
mod_perl should only ever be an option. Having CGI to test and develop with is too handy to lose IMHO.
Though I suppose that splits the development effort somewhat. Once again, an entirely uninformed comment!
It really doesn't split effort much at all. The wrapper scripts and glue code we use to have sites work in either mode are less than 70 lines of total code. Beyond that everything is truly the same. Writing code that works in both does require some extra testing, but once you get in the habit of writing under strict and otherwise being a good mod_perl programmer it doesn't really take much thought to deal with any mod_perl vs. CGI issues. The mod_perl guide maintained by Stas Bekman and a roving band of mod_perlers at http://perl.apache.org/ does a very good job of elucidating the issues that are important not to lose sight of. -- </chris> No, no, you're not thinking, you're just being logical. -Niels Bohr, physicist (1885-1962)
On Thu, 2003-09-04 at 15:54, Larry Stamm wrote:
My testing box at home is a 266MHz AMD K6-2 with 160MB RAM. While I wouldn't exactly call it speedy, it serves up most pages in less than a second with a database of 14,000+ entries to search through.
The school does have a 700 MHz PC, but I doubt they'd want to install Linux on it, and I don't trust Windows + Apache. Their library computers are apparently 100 MHz, which is probably pushing it. I'm sure there must be somewhere around here that's selling cheap second-hand or ex-office equipment that might donate some of it to a good cause. My preferred distribution is Gentoo, which can achieve quite a minimal console install when asked, though I've yet to try it on a low-end system. I know someone with a spare 266 MHz box, I might do a trial install and see how it goes. There's always Knoppix :o) The conversion from Catalist sounds promising. I would appreciate any and all information people are able to give me regarding this, scripts via private mail if possible. Thanks so much for all your input... this should be an interesting experiment. Looks like I might have to take the time to learn Perl (where's my camel book?) -- Cheers, Rich.
I have located my A/Rev extract from Catalist. I'll have to remember how to get into Catalist and use it. The Horowhenua conversion was the last time I used A/Rev, although my day-to-day work is in MvBase, which is linguistically similar. Where is the school? I extolled the virtues of Koha to a contact who volunteers with a school library in rural Marlborough. If you are within a day's drive of Blenheim I'll come and help with the conversion. Otherwise I can take the data and run the extracts. Rich Churcher wrote:
On Thu, 2003-09-04 at 15:54, Larry Stamm wrote:
My testing box at home is a 266MHz AMD K6-2 with 160MB RAM. While I wouldn't exactly call it speedy, it serves up most pages in less than a second with a database of 14,000+ entries to search through.
The school does have a 700 MHz PC, but I doubt they'd want to install Linux on it, and I don't trust Windows + Apache. Their library computers are apparently 100 MHz, which is probably pushing it. I'm sure there must be somewhere around here that's selling cheap second-hand or ex-office equipment that might donate some of it to a good cause.
My preferred distribution is Gentoo, which can achieve quite a minimal console install when asked, though I've yet to try it on a low-end system. I know someone with a spare 266 MHz box, I might do a trial install and see how it goes. There's always Knoppix :o)
The conversion from Catalist sounds promising. I would appreciate any and all information people are able to give me regarding this, scripts via private mail if possible. Thanks so much for all your input... this should be an interesting experiment. Looks like I might have to take the time to learn Perl (where's my camel book?)
-- Cheers, Rich.
_______________________________________________ Koha mailing list Koha@lists.katipo.co.nz http://lists.katipo.co.nz/mailman/listinfo/koha
-- Olwen Williams See my B&B site http://www.bandbclub.com and my new site http://www.handyman.co.nz - A virtual shed for real kiwi blokes.
On Thu, 2003-09-04 at 21:14, Olwen Williams wrote:
Where is the school? I extolled the virtues of Koha to a contact who volunteers with a school library in rural Marlborough. If you are within a day's drive of Blenheim I'll come and help with the conversion. Otherwise I can take the data and run the extracts.
Alas, the school is in Dunedin (Corstorphine primary). They know I might have a solution for them, but that's about as far as it's gone at this stage. I'd like to set a box up, get a dump of their data and make sure everything's working before I show it to them. If it went well, it sure would be an interesting project to offer the same solution to other schools. At the very least, I can write it up as a case study for small-scale Koha if all goes well. I hate the idea of these people being gouged for a proprietary solution, especially one that's clunky and out of date. Right now I'm just casting about for appropriate hardware for the server. Will definitely take you up on assistance with data conversion. -- Cheers, Rich.
I definitely owe Corstorphine. My father (Gerry Williams) taught there for most of his career. Many of the parents or grandparents were probably taught by him. Rich Churcher wrote:
On Thu, 2003-09-04 at 21:14, Olwen Williams wrote:
Where is the school? I extolled the virtues of Koha to a contact who volunteers with a school library in rural Marlborough. If you are within a day's drive of Blenheim I'll come and help with the conversion. Otherwise I can take the data and run the extracts.
Alas, the school is in Dunedin (Corstorphine primary). They know I might have a solution for them, but that's about as far as it's gone at this stage. I'd like to set a box up, get a dump of their data and make sure everything's working before I show it to them.
If it went well, it sure would be an interesting project to offer the same solution to other schools. At the very least, I can write it up as a case study for small-scale Koha if all goes well. I hate the idea of these people being gouged for a proprietary solution, especially one that's clunky and out of date.
Right now I'm just casting about for appropriate hardware for the server. Will definitely take you up on assistance with data conversion.
-- Cheers, Rich.
-- Olwen Williams See my B&B site http://www.bandbclub.com and my new site http://www.handyman.co.nz - A virtual shed for real kiwi blokes.
On 5 Sep 2003, Rich Churcher wrote:
Alas, the school is in Dunedin (Corstorphine primary). They know I might have a solution for them, but that's about as far as it's gone at this stage. I'd like to set a box up, get a dump of their data and make sure everything's working before I show it to them.
I am in Dunedin and will be sweating over a 1.2.3 installation for an even smaller library in ChCh when it comes to upgrading to 2.0.0. Maybe we can get together. Phil. -- Philip Charles; 39a Paterson Street, Abbotsford, Dunedin, New Zealand +64 3 488 2818 Fax +64 3 488 2875 Mobile 025 267 9420 philipc@copyleft.co.nz - preferred. philipc@debian.org I sell GNU/Linux & GNU/Hurd CDs. See http://www.copyleft.co.nz
Hi There
3. How difficult is the conversion going to be from Catalist to Koha? Is there any kind of automated conversion (possibly via MARC), or would they be stuck re-scanning everything in their library?
The original Koha install was done with data from Catalist. The hardest part is getting the data out. Getting it into Koha is pretty easy; I'm sure I still have the scripts around to do that, which you can take a look at. Chris -- Chris Cormack Katipo Communications Programmer www.katipo.co.nz 027 4500 789
participants (8)
- Chris Cormack
- Christopher Hicks
- Larry Stamm
- MJ Ray
- Olwen Williams
- paul POULAIN
- Philip Charles
- Rich Churcher