EPrints Technical Mailing List Archive

Message: #03402



[EP-tech] Re: Injecting gigabyte-scale files into EPrints archive - impossible?


On 18.08.2014 at 16:17, Paolo Tealdi wrote:
On 04/08/2014 at 10:13, Yuri wrote:
The only option seems to be to enlarge the tmp filesystem :-)


or to try to convince the system to point TMPDIR at a bigger filesystem.
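Whether the redirect takes effect is easy to verify, since the usual temp-file libraries (Perl's File::Temp, which EPrints uses, as well as Python's tempfile) consult the TMPDIR environment variable. A minimal Python sketch of the mechanism; `demo_big_tmp` is just a hypothetical stand-in for a mount on a larger filesystem:

```python
import os
import pathlib
import tempfile

# Stand-in for a directory on a larger filesystem (hypothetical path).
big_tmp = pathlib.Path("demo_big_tmp").resolve()
big_tmp.mkdir(exist_ok=True)

# Point TMPDIR at it; temp-file libraries consult this variable.
os.environ["TMPDIR"] = str(big_tmp)
tempfile.tempdir = None  # drop the cached choice so TMPDIR is re-read

# New temporary files now land on the big filesystem.
fd, path = tempfile.mkstemp()
os.close(fd)
print(pathlib.Path(path).parent)  # the redirected temp directory
os.unlink(path)
```

For a mod_perl/Apache setup, the variable would have to be set in the web server's environment (e.g. via `SetEnv` or the init script) so the EPrints processes inherit it.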


Thank you both, Yuri and Paolo. :-)

I am afraid, though, that this is not quite a sustainable, scalable approach. If one used EPrints as a research data store, for instance with the Recollect plugin, even bigger files would have to be served, say database dumps hundreds of gigabytes in size. EPrints could not handle such giant files this way, as it would need an absurd *several hundred* gigabytes of RAM and swap, quite apart from the other bottlenecks in the program flow. If you don't mind my assuming that from its easygoing use of join("", <STDIN>), which reads an entire upload into memory at once.
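The contrast can be sketched as follows: a slurp like join("", <STDIN>) needs memory proportional to the file size, whereas a chunked copy needs only a fixed buffer. A minimal Python sketch of the pattern (EPrints itself is Perl, where the same chunked `read`/`print` loop applies); the sizes here are toy values:

```python
import io

CHUNK = 1 << 20  # 1 MiB buffer: peak memory stays at ~CHUNK bytes


def stream_copy(src, dst, chunk=CHUNK):
    """Copy src to dst in fixed-size chunks instead of slurping."""
    while True:
        buf = src.read(chunk)
        if not buf:
            break
        dst.write(buf)


# The slurp pattern (Perl: join("", <STDIN>)) would instead be:
#   data = src.read()   # whole file in RAM: gigabytes for gigabyte uploads

payload = b"x" * (3 * 1024 + 17)       # toy stand-in for a large upload
src, dst = io.BytesIO(payload), io.BytesIO()
stream_copy(src, dst, chunk=1024)      # copies in 1 KiB pieces
print(dst.getvalue() == payload)       # -> True
```

With the chunked loop, a hundred-gigabyte dump passes through in constant memory; only the temp-directory space, not RAM, has to accommodate the file.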

Concerning large files, we thought we could serve files up to the size of a DVD image at most. For the time being I will stick with the copy-and-hack-the-database method described earlier (but, as I said, I'll try to use the API next time).

Look, as a public university library we do not have overwhelming hardware resources for this purpose, so we must use what is available wisely and economically.


Kind regards
Florian




--
UB Heidelberg (Altstadt)
Plöck 107-109, 69117 HD
Abt. Informationstechnik
http://www.ub.uni-heidelberg.de/