Imagine you want to install GNU/Linux but your bandwidth won’t let you…
tl;dr: I wrote a rather small Bash script which splits huge files into several smaller ones and downloads them. To ensure integrity, every small file is checked against its hashsum and its file size.
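To give a rough idea of the splitting and checksumming step, here is a minimal sketch using standard GNU tools (split, stat, md5sum). The file name, piece size, and checksum file layout are made up for illustration; the actual splitDL script may do this differently.

```bash
#!/usr/bin/env bash
# Sketch only, not the actual splitDL code: split a large ISO into 10 MB
# pieces and record the size and MD5 sum of every piece for later checks.
set -euo pipefail

ISO="debian-netinst.iso"            # hypothetical file name

# Create numbered pieces: debian-netinst.iso.part.00, .part.01, ...
split --bytes=10M --numeric-suffixes --suffix-length=2 "$ISO" "$ISO.part."

# One line per piece: <name> <size in bytes> <md5 sum>
for piece in "$ISO".part.*; do
    printf '%s %s %s\n' "$piece" "$(stat -c %s "$piece")" \
        "$(md5sum "$piece" | cut -d' ' -f1)"
done > "$ISO.checksums"
```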
That’s the problem I have been facing over the past days. At the school I’m working at (Moshi Institute of Technology, MIT) I set up a GNU/Linux server to provide services like file sharing, website design (on local servers to avoid the slow internet) and central backups. The ongoing plan is to set up 5-10 (and later more) new computers with a GNU/Linux OS, in contrast to the ancient and non-free Windows XP installations – project “Linux Classroom” is officially born.
But to install an operating system on a computer you need an installation medium. At the school a lot of (dubious) Windows XP installation CD-ROMs are floating around, but no current GNU/Linux media. In the first world you would just download an .iso file and ~10 minutes later you could start installing it on your computer.
But not here in Tanzania. With average download rates of 10 kb/s it takes a hell of a long time to download even one image file (not to mention the cost of internet usage, ~1-3$ per 1 GB). And that’s not all: periodic power cuts abruptly cancel ongoing downloads. Of course you can restart a download, but the large file may already be damaged and you lose even more time.
My solution – splitDL
To work around this drawback I wrote a rather small Bash program called splitDL. With this helper script, one can split a huge file into smaller pieces. If the power cuts out during the download and damages a file, one only has to re-download that single small piece instead of the complete huge file. To detect whether a small piece is unharmed, the script creates hashsums of the original huge file and of each of the small pieces. The script also supports resuming interrupted downloads, thanks to wget, which ships with virtually every GNU/Linux system.
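For illustration, a rough sketch of what the download side could look like follows below; the mirror URL, file names, and checksum file format are assumptions carried over from the sketch above, not splitDL’s actual interface. The key ingredient is wget -c, which continues a partially downloaded piece after a power cut instead of starting over.

```bash
#!/usr/bin/env bash
# Sketch of the download side (assumed mirror URL, names, and checksum file
# layout, not splitDL's real interface): fetch every piece, let wget -c
# resume after a power cut, and re-fetch pieces whose MD5 sum is wrong.
set -euo pipefail

BASE_URL="http://example.org/isos"  # hypothetical mirror
ISO="debian-netinst.iso"

wget -c "$BASE_URL/$ISO.checksums"

while read -r piece size hash; do
    until [ -f "$piece" ] && \
          [ "$(md5sum "$piece" | cut -d' ' -f1)" = "$hash" ]; do
        wget -c "$BASE_URL/$piece" || true
        # A piece that reached full size but fails the checksum has to be
        # deleted, otherwise wget -c would treat it as finished forever.
        if [ -f "$piece" ] && [ "$(stat -c %s "$piece")" -ge "$size" ] && \
           [ "$(md5sum "$piece" | cut -d' ' -f1)" != "$hash" ]; then
            rm -f "$piece"
        fi
    done
done < "$ISO.checksums"

# Every piece verified: reassemble the original image.
cat "$ISO".part.* > "$ISO"
```

The point of the per-piece check is that a power cut or corrupt transfer only costs you one small piece, not the whole multi-gigabyte image.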