• Build FSFE websites locally

    Note: This guide is also available in FSFE’s wiki now, and it will be the only version maintained. So please head over to the wiki if you’re planning to follow this guide.

    Those who create, edit, and translate FSFE websites already know that the source files are XHTML files which are built with an XSLT processor, including a lot of custom stuff. One of the huge advantages of that is that we don’t have to rely on dynamic website processors and databases; on the other hand, there are a few drawbacks as well: websites need a few minutes to be generated by the central build system, and it’s quite easy to mess up the XML syntax. So if an editor wants to create or edit a page, she has to wait a few minutes for the build system to finish every time she wants to test how the website looks. In this guide I will show how to build single websites on your own computer in a fraction of the FSFE system’s build time, so you only have to commit your changes once the file looks the way you want. All you need is a bit of hard disk space and around one hour of time to set everything up.

    The whole idea is based on what FSFE’s webmaster Paul Hänsch has coded and written: on his blog he explains the new build script, and also how to build files locally. However, this guide aims to be a bit easier to follow and more verbose.

    Before we get started, let me briefly explain the concept of what we’ll be doing. Basically, we’ll have three directories: trunk, status, and fsfe.org. Most likely you already have trunk: it’s a clone of FSFE’s main SVN repository and the source of all operations. All the files in there have to be compiled to generate the final HTML files we can browse. These finished files will end up in fsfe.org. status, the third directory, contains error messages and temporary files.
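    To make this concrete, here is a minimal sketch of the setup and of what steps (3) and (4) of the next paragraph boil down to, assuming everything lives under ~/fsfe. The repository URL, the stylesheet name and the page paths are my assumptions for illustration; the real commands come from the scripts this guide introduces.

        # Create the three directories; trunk is a checkout of FSFE's main
        # SVN repository (the URL is an assumption, see the wiki for the real one).
        mkdir -p ~/fsfe/status ~/fsfe/fsfe.org
        svn checkout https://svn.fsfe.org/fsfe-web/trunk ~/fsfe/trunk

        # Step (4) in a nutshell: rebuild a single page with an XSLT processor...
        xsltproc ~/fsfe/trunk/fsfe.xsl ~/fsfe/trunk/about/about.en.xhtml \
            > ~/fsfe/fsfe.org/about/about.en.html

        # ...and step (3): browse the result with any static webserver.
        (cd ~/fsfe/fsfe.org && python3 -m http.server 8000)   # http://localhost:8000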

    After we have (1) created these directories, partly by downloading a repository with some useful scripts and configuration files, we’ll (2) build the whole FSFE website on our own computer. In the next step, we’ll (3) set up a local webserver so that you can actually browse these files. Finally, we’ll (4) set up a small script which you can use to quickly build single XHTML files. Last but not least, I’ll give some real-world examples. [ » Read More…]

  • splitDL – Downloading huge files from slow and unstable internet connections

    Imagine you want to install GNU/Linux but your bandwidth won’t let you…

    tl;dr: I wrote a rather small Bash script which splits huge files into several smaller ones and downloads them. To ensure integrity, every small file is checked against its hash sum and file size.

    That’s the problem I was facing in the past days. In the school I’m working at (Moshi Institute of Technology, MIT) I set up a GNU/Linux server to provide services like file sharing, website design (on local servers, to avoid the slow internet) and central backups. The ongoing plan is to set up 5-10 (and later more) new computers with a GNU/Linux OS to replace the ancient and non-free Windows XP installations – the project “Linux Classroom” is officially born.

    But to install an operating system on a computer, you need an installation medium. In the school, a lot of (dubious) Windows XP installation CD-ROMs are floating around, but no current GNU/Linux media. In the first world you would just download an .iso file and could start installing it on your computer ~10 minutes later.

    But not here in Tanzania. With average download rates of 10 kB/s, it takes ages to download even a single image file: at that speed, a 1 GB image needs more than a day of uninterrupted downloading (not to mention the cost of internet usage, ~1-3$ per GB). And that’s not all: periodic power cuts abort ongoing downloads abruptly. Of course you can restart a download, but the large file may already be damaged and you lose even more time.

    My solution – splitDL

    To work around this problem, I coded a rather small Bash program called splitDL. With this helper script, one is able to split a huge file into smaller pieces. If the power cuts out during the download and damages the file, you only have to re-download that single small file instead of the complete huge one. To detect whether a small file is unharmed, the script creates hash sums of both the original huge file and the several small files. The script also supports resuming downloads, thanks to wget, which is installed by default on most systems.
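    The core cycle can be sketched in a few lines of plain Bash. This is only an illustration of the principle, not the actual splitDL code: the file names, the chunk size, the hash algorithm and the download URL are all made up for the example.

        # Preparing the download (e.g. on a machine with a fast connection):
        # split the image into 10 MB pieces and record a checksum per piece.
        split --bytes=10M --numeric-suffixes debian.iso debian.iso.part
        sha512sum debian.iso.part* > debian.iso.sha512

        # On the slow/unstable connection: fetch every piece resumably,
        # verify all of them, and reassemble the original file.
        for part in $(awk '{print $2}' debian.iso.sha512); do
            wget --continue "https://example.org/isos/$part"   # placeholder URL
        done
        sha512sum --check debian.iso.sha512 && cat debian.iso.part* > debian.iso

    If one piece is damaged by a power cut, only that piece fails the check and has to be fetched again. [ » Read More…]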

  • Sharing is caring – my Git instance

    Some days ago I noticed once again that I know far too little about Git.
    “Time to change that!”, I thought, so I set up my own Git instance and also installed gitweb for better usability.
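    In its simplest form, “setting up a Git instance” is nothing more than a bare repository reachable over SSH. The sketch below shows only that minimal case; host and paths are made up, and my actual gitweb setup involved a bit more.

        # On the server: create an empty bare repository (hypothetical host/path).
        ssh user@example.org 'git init --bare /srv/git/scripts.git'

        # Locally: connect an existing repository to it and publish the work.
        git remote add origin user@example.org:/srv/git/scripts.git
        git push origin master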

    Upside 1: I can keep track of the many (mainly Bash) scripts I wrote in the past and of all the changes I will make in the future.
    Upside 2: You can hopefully benefit from using and reading my code. All code is licensed under the GNU GPL v3, so please feel free to use, study, share and improve my work!

    Some noteworthy projects I’m (a bit) proud of:

    Any questions, ideas or improvements? Please contact me!

    Update 26.02.2016:
    I retired the quite basic gitweb instance and moved to Gogs. Here’s why and how. Links to the projects may have changed because of that (and I’m too lazy to update them here).

  • Mounting a SFTP storage in GNU/Linux

    This (longer than expected) post explains how to transfer files securely between your device and an external storage. The first part may be useful for you if you have only little knowledge of terms like (S)FTP(S) and want to learn something about these widely used technologies. The second part will help you mount an external storage so you can manage all files as if they were on your local device, and the third, fourth and fifth parts will concentrate on easing the mounting process with the help of hostnames, private/public keys and a shell script.
    This guide is very detailed and also (and especially) suited for beginners. Maybe some advanced users can learn something too, or give hints for improvements.

    Update: With improved Bash skills and more time, I was eventually able to improve the script considerably. Have a look at my Git instance to download the latest version.

    But let’s be honest: all in all, this post will show you once again why Free Software, GNU/Linux and Open Standards are great and easy to use, and why Windows users are to be pitied.

    I. Short excursus

    (Nearly) everybody knows FTP, a protocol which enables you to transfer files between your device and a remote space. Maybe you want to present your documents or images to visitors of your homepage and simply want to upload these files to your webspace. In most cases this is done with a separate program like FileZilla.

    So far, so good, but there are several problems. Here are two of them:

    1. FTP is insecure. Period.
    2. Using an external program (and not your personal file manager) is really annoying if you edit the files often. A realistic example: you have a complicated script running on your website which you’d like to edit in a graphical editor. Using an external client forces you to download the file, open it in your editor, save it, and upload it again. Some FTP clients like FileZilla offer features to ease this pain in the a**, but trust me: after the twentieth re-upload you’ll want to toss your computer away…

    Now we know that FTP is insecure and inconvenient. So what alternatives do we have?
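    To give away a small part of the answer: one of them is SFTP, FTP’s encrypted relative, and with sshfs you can even mount the remote space straight into your file system, which also solves problem 2. A minimal sketch, with host and paths as placeholders:

        mkdir -p ~/webspace                            # local mount point
        sshfs user@example.org:/var/www ~/webspace     # mount the remote directory via SFTP
        # ...edit the files with your normal editor/file manager...
        fusermount -u ~/webspace                       # unmount when you are done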

    [ » Read More…]