Installing Pi-Hole

Once I’d got my Raspberry Pi back up and running, one of the things I was desperate to try out was Pi-Hole, a DNS sinkhole capable of running even on my extremely old Pi. DNS sinkholes make it possible to block ads, trackers and malware at source, speeding up browsing and improving privacy. The particular advantage of a Pi-Hole is that it operates at the network level, blocking ads automatically on every device that connects to the WiFi network. Well, it’s got to be worth a go, right?

Continue reading Installing Pi-Hole

Raspberry Pi revisited

As documented a long, long time ago, I’ve got an old Raspberry Pi (B 1.2) kicking around. I wanted to try to teach it some new tricks, but once again my first challenge was getting it set up. This time, I don’t even have a TV I can plug its composite output into and I couldn’t be bothered to get HDMI working. I’m going to have to do this completely blind. Continue reading Raspberry Pi revisited

Constitution of Uganda: primary sources

I’m doing quite a lot of research at the moment into the history of Uganda (1890 to present). Unfortunately it is not always easy to find primary sources and documents online. This is my running list, published here in case it helps anyone else:

SSD part 3: Upgrading to Windows 10

The first SSD I installed went into my laptop, back in April 2014. But since then I’ve shuffled drives around, and I was back to having an old HDD. This was just about tolerable, but with the 240GB OCZ Trion 100 (a ‘meh’ quality SSD) available for under £40, and Windows 10 upgrade prompts nagging, I thought I’d rectify the situation. Unfortunately, it didn’t exactly go well.

Continue reading SSD part 3: Upgrading to Windows 10

Installing CyanogenMod 12.1 onto a Nexus 4

And so it came to pass that the Nexus 4, which was already a year old when I bought it two years ago, started getting slow. I was initially hopeful that Android M (Marshmallow), Google’s latest version of the operating system, would help: it apparently has a lower RAM footprint than L (Lollipop). But unlike in the case of my Nexus 5 (which has already received the update), Google does not intend to ship M to Nexus 4s. Thus I had three options: slug it out; hack a version of Android M onto it; or install a maintained fork that would (eventually) pick up the benefits of M. I chose the last, going with the most popular fork: CyanogenMod, version 12.1 (a stable version 13 – i.e. one based on M – is expected around the New Year).

Installing CyanogenMod is not a walk in the park, even following the detailed instructions available on the CyanogenMod website and working out of Ubuntu. The following commands were useful (gapps.zip is an OpenGApps package and TWRP is, well, TWRP).
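From memory, the key steps went something like this (twrp.img and cm-12.1-mako.zip are placeholders for whatever builds you actually download):

    # Unlock the bootloader (this wipes the device!) and flash TWRP
    fastboot oem unlock
    fastboot flash recovery twrp.img

    # Reboot into TWRP, enable Advanced > ADB Sideload, then:
    adb sideload cm-12.1-mako.zip
    # ...and again, after re-entering sideload mode, for the Google apps:
    adb sideload gapps.zip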

Unfortunately, the “stock” OpenGApps package doesn’t seem to fit in the system partition of a Nexus 4 (hence the use of Advanced Options is advisable). To ascertain how many Google Apps needed excluding, and to effect that change, the following commands were useful:
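(A sketch only: the gapps-config.txt name and Exclude syntax are as per the OpenGApps documentation, and which apps you drop is a matter of taste.)

    # How much room is there on /system?
    adb shell df /system

    # Tell the OpenGApps installer what to leave out, via a config file
    # pushed alongside the zip
    printf 'Exclude\nChrome\nHangouts\n' > gapps-config.txt
    adb push gapps-config.txt /sdcard/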

Factorise

As I mentioned in my previous blogpost, I am now the proud owner of a Pebble Time. This week I took the plunge and decided to make my own watchface in honour of xkcd #247 (“Factoring the time”). Pebble watchfaces have historically been written in the programming language C, although (given the relative unpopularity of C) the team there have also built a JavaScript API, PebbleJS. I mean, I’ve never written any C before, but there’s no time like the present, eh?

C turns out to be a particularly labour-intensive language to work in, especially without the benefit of large utility libraries. For example, its array handling is poor, lacking push/pop functions, an accurate count function or indeed join/implode. The other thing I struggled with was performance optimisation. The major challenge when developing a watchface is battery conservation. Although Pebble provides some guidance covering this aspect, it remains vague (and difficult even to deduce through trial and error). For example, my initial design factorises the time afresh every minute using a brute-force recursive technique (given the maximum target is 2359, this is not too inefficient). But equally, I could store the pre-computed factorisations in a file. Would this be better or worse for battery conservation? Who knows.
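For the curious, the essence of that brute-force approach is just recursive trial division. A simplified, standalone sketch (not the actual watchface code, which is tangled up with Pebble’s UI plumbing):

    #include <stdio.h>

    /* Recursively factorise n by trial division, starting at divisor d,
       writing prime factors into factors[] and returning how many were
       found. With a maximum target of 2359, this never has far to search. */
    static int factorise(int n, int d, int *factors, int count) {
      if (n <= 1) {
        return count;
      }
      if (d * d > n) {
        factors[count++] = n;  /* whatever is left is prime */
        return count;
      }
      if (n % d == 0) {
        factors[count++] = d;
        return factorise(n / d, d, factors, count);
      }
      return factorise(n, d + 1, factors, count);
    }

    int main(void) {
      int factors[12];  /* 2^11 = 2048 has the most prime factors below 2359 */
      int count = factorise(2359, 2, factors, 0);  /* i.e. 23:59 */
      for (int i = 0; i < count; i++) {
        printf("%s%d", i ? " x " : "", factors[i]);  /* prints: 7 x 337 */
      }
      printf("\n");
      return 0;
    }

Whether reading 1,440 pre-computed results from storage would beat redoing that little sum each minute is exactly the battery question above.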

The full code for the Factorise watchface is available from GitHub under the MIT licence.

Pebble Time: First Thoughts

About a fortnight ago I added a Pebble Time to my collection of gadgetry. It’s one of the more basic second-generation smartwatches, with a 64-colour e-paper screen rather than a touchscreen, plus a microphone. Most of its functionality is derived from a Bluetooth connection to a mobile phone. To buy one in the shops will set you back around £200. Continue reading Pebble Time: First Thoughts

SSD transfer (again)

This week’s task was to move my old SSD from my laptop to my gaming desktop, which I’ve had since September and am otherwise very happy with — certainly, the AMD FX-6300 six-core CPU + Radeon R9 270X can handle most things thrown at them, with the unusual exception of The Sims 3 (no idea why the Sims is an exception; maybe something to do with the age of the game). Continue reading SSD transfer (again)

Wikimedia Hackathon 2015

I am once again delighted to be able to attend the Wikimedia Hackathon, an event that rotates around Europe. This year’s is in the picturesque city of Lyon in south-central France — its winding boulevards and riverside façades looking rather beautiful (and very French) in the summer sun. Conveniently, Eurostar have just begun direct trains from London St Pancras, and by booking in advance I got competitively priced tickets (£110pp outbound, £65pp inbound). Okay, so the trains took a while (4h45 outbound, 5h45 inbound), but I booked early enough to get a table, and on the return journey at least I was pretty productive.

TranslateSvg

Developers sitting at the i18n table

My main work was on TranslateSvg, a project I started several years ago as part of Google Summer of Code. Admittedly it is annoying not to have the extension live (although Brian tells me that the feature we did eventually manage to land is actually being used, so that’s something). On the other hand, I can understand why Wikimedia now demands high-quality code (see below), and in particular good unit tests. I simply haven’t been able to put in the time required to deliver those (except in very short bursts), and that’s fundamentally my fault.

Anyhow, to focus on the positive, I used Lyon — and in particular the train back — to commit a load of patches. These get test coverage up to about 50% on a line-by-line basis and, more importantly, led me to uncover a bunch of bugs I hadn’t found before. I also re-ran an analysis I first conducted almost three years ago and found that TranslateSvg was performing worse now than then! As ever, uncovering the bug was 90% of the challenge, and the project is now back to where it was in August 2012 on that particular metric.

A more professional MediaWiki

I guess my other contribution during the Lyon Hackathon was a question to Lila Tretikov, Executive Director of the Wikimedia Foundation. Someone else had asked why the relative balance between professional and volunteer developers had (it seemed) shifted away from the latter towards the former. Other people had quite rightly pointed out that the WMF had hired many of the former volunteers, and, in particular, many of the most “omnipresent” ones.

The point I wanted to make, however, is that MediaWiki as a platform has come a long way. It is a lot more professional, and that means standards are higher. Almost by definition, that makes it harder for part-timers (many of whose skillsets are understandably incomplete or out-dated) to contribute on a level footing. For example, a move from CSS to LESS reduces overhead for “experts” but makes it harder for those who just know CSS (i.e. most developers) and lack the time to retrain. It was also pointed out that moving to a system of pre-commit review (as MediaWiki did in March 2012) encourages high standards: you cannot join the elite club of MediaWiki contributors without having your commit peer-reviewed first, whereas before you just had to fix things later (and even then you had status quo bias working with you rather than against you).

Lila’s response was to point to the ongoing work moving MediaWiki from being a monolithic pile of code to something much more modular and service-oriented, so that newcomers can find their way in. I think this goes both ways: yes, it means newcomers can find a happy corner that they can work in, but it also allows our increasingly professionalised developer base to fulfil their burning desire to ditch PHP in favour of their own preferred languages, with unintended consequences for the volunteer community.