Jared: thanks again for your hints - it works now. We have a running Zebra and all record sets have been imported successfully. There is still the mystery of the ignored -s parameter, but everything else worked fine. The index finally needs about 60 GB on disk.

I saw that the exported_records file (which in our case was about 8 GB after the export finished) is held in /tmp, and we had reserved plenty of space there, so that was not a problem at all. But thank you for the note, Robin!

On 24.07.2013 14:07, Robin Sheat wrote:
On 23/07/13 13:40, Jared Camins-Esakov wrote:
If you do a full index with shadow enabled, you'll run out of space on the 100GB disk, but my experience is that in ordinary day-to-day work, it's unlikely you'll have a problem. It's also worth noting that /tmp (or whatever is defined as your temporary directory) is used to hold the extracted records, so with a large collection, that may become an issue.
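The free-space concern above can be checked before kicking off a full reindex. The following is a minimal sketch, assuming a POSIX shell and a `df` that supports `-P`; the `TMPDIR` variable and the 8 GB threshold (matching the export file size mentioned in this thread) are illustrative assumptions, not Koha defaults:

```shell
#!/bin/sh
# Warn if the temporary directory lacks room for the extracted records.
# TMPDIR is an assumption: check which temp directory your indexing
# setup actually uses before relying on this.
TMPDIR="${TMPDIR:-/tmp}"
required_kb=$((8 * 1024 * 1024))   # ~8 GB, the export size seen here

# df -Pk gives portable, 1K-block output; field 4 is available space.
avail_kb=$(df -Pk "$TMPDIR" | awk 'NR==2 {print $4}')

if [ "$avail_kb" -lt "$required_kb" ]; then
    echo "WARNING: only ${avail_kb} KB free in $TMPDIR" >&2
else
    echo "OK: ${avail_kb} KB free in $TMPDIR"
fi
```

Running this before a full rebuild makes the "out of space on the 100GB disk" scenario visible up front instead of mid-export.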
--
Oliver Goldschmidt
TU Hamburg-Harburg / Universitätsbibliothek / Digitale Dienste
Denickestr. 22
21071 Hamburg-Harburg
Tel. +49 (0)40 / 428 78 - 32 91
eMail o.goldschmidt@tuhh.de
--
GPG/PGP key: http://www.tub.tu-harburg.de/keys/Oliver_Marahrens_pub.asc