Raspberry Pi print & scan server

I was pleasantly surprised at how easy it was (with a bit of Googling) to set up the Raspberry Pi 2 I’d had gathering dust on my desk as a print server. My increasingly venerable USB-only Canon MP480 works just fine under Debian, but having to boot my desktop to use it was becoming tedious – much of what I get done these days is laptop-based or even done on my phone.

Having set it all up, I can scan and print from any Linux box on the network (theoretically from Windows too, only I have none of those left), and, with the addition of the CUPS Print app, I can print PDFs and the like directly from my Android phone.

Update – putting some IPv6-only DNS on the Pi and pointing a Windows 10 VM at it via IPP, printing Just Works. Even more impressively, Windows drivers for your printer do not need to exist, as long as they exist for CUPS on the Pi. I just chose the generic HP PostScript printer suggested in the article and it works perfectly, which is handy as Canon have no intention of providing Win10 drivers.
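For the curious, the setup boils down to something like the following sketch – this assumes Raspbian/Debian with the stock package names, and the user name and subnet are examples, not gospel:

```shell
# Minimal CUPS + SANE sharing sketch (Raspbian/Debian; adjust to taste)
sudo apt-get install cups sane-utils    # printing and scanning daemons
sudo usermod -a -G lpadmin pi           # let the 'pi' user administer CUPS
sudo cupsctl --remote-admin --remote-any --share-printers
sudo systemctl restart cups
# For network scanning, enable saned and allow your LAN in its config:
#   /etc/default/saned      -> RUN=yes
#   /etc/sane.d/saned.conf  -> add your subnet, e.g. 192.168.0.0/24
```

After that, the printer is added through the CUPS web interface on port 631 as usual.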

Digging up the road

Virgin Media’s work in my street reminds me of the poor state of broadband in the UK

A few weeks ago, I awoke to find my street being dug up by several pneumatic drills simultaneously. We’d seen the markings sprayed on the pavements some time previously, of course, but I had no idea what they were. Couldn’t be cable TV – Virgin Media, as far as I knew, never laid cable of their own – they just bought up all the other companies who went bust under the cost of doing so.

It turns out I was wrong, though, as our road now boasts a shiny new cable TV cabinet, and connection points for every house and block of flats. There was an outraged article in the local paper about how this work was done with zero notice to residents; we were indeed given no notice, but given that I was gone for work on all but a couple of days when the work was taking place, I didn’t mind too much. I can understand those who actually spend their days here being upset, though.

Given that this street was built less than ten years ago, it seems a bit screwed up that Virgin are digging it up now – surely they should have tried to co-operate with the developer at the time and saved making a mess of our pavements after the fact?

Whatever the ins and outs of that, it feels rather sad that the cables being pulled are presumably copper coax rather than fibre. Come to that, it feels backwards that BT are still laying copper and not fibre to new-builds. For all their clever research on delivering data at high bandwidth over copper, surely having fibre to the individual houses is the only long-term solution.

One of my neighbours is quoted in the article saying he’d be surprised if Virgin got custom from his neighbours. For sure, I’ve never been a fan in the past, but the offer of a 200 megabits per second connection for £44 per month is hugely tempting – particularly when all the providers operating over BT’s infrastructure can’t match that speed, and most of them are participating in a race to the bottom where competition on price leaves no money to spend on providing a decent service.

If our pavements start sinking, of course, I might decide otherwise.


So I’m back from my holiday, not with much of a tan, but feeling suitably relaxed. A couple of friends and I went to Cyprus for five days and four nights.

Getting there

Cyprus is a long way from the UK – it’s about the furthest European destination you can fly to, and it took over four hours on the plane each way. Still, the combination of Norwegian Air on the way out and EasyJet on the way back did the job.

Places to stay

The apartments we stayed at were pretty good for the cheap and cheerful end of the scale – I’ve added my review to that site, so I won’t repeat it here.

Things to do

I bagged The Rough Guide To Cyprus and the Lonely Planet guide to Cyprus from Oxford central library, and they served us well for museums and other sights to see, as well as for general background.


We were surprised at how cheap the food was – if you avoid the obvious tourist traps and the likes of KFC and McDonald’s, you can get some nice local food for not very much a head – certainly cheaper than eating out in most of the UK. The guide books came through for us here again. The portion sizes and free extras like bread were all nice touches.


We were surprised when stepping off the bus from the airport at how poor many neighbourhoods looked – even in the developed, beachfront, touristy part of Larnaca, you didn’t have to go far to find crumbling buildings and poorer businesses crammed in next to the more modern and western ones. Still, the people were all very friendly, and English was almost universally spoken.

E-mail for the last three days – my part in its downfall

You know Tuesday afternoon is going well when your mother calls and asks why she hasn’t received any e-mail for the last three days. Come to think of it, I hadn’t had much e-mail lately either.

Five minutes later on the mailserver (IP addresses and domain names changed to protect the guilty)…

$ exim4 -bhc

RCPT TO bar@example.com
>>> check hosts = ${sg {${lookup sqlite{ /path/to/my/sqlite.database SELECT host FROM blacklist WHERE domain = '$domain'; }}}{\n}{: }}
>>> no IP address found for host foo.example.com (during SMTP connection from (somewhere) [])
>>> foo.example.com in dns_again_means_nonexist? no (option unset)
>>> host in "foo.example.com"? list match deferred for foo.example.com
>>> deny: condition test deferred in ACL "acl_check_rcpt"
451 Temporary local problem - please try later
LOG: H=(somewhere) [] F=<foo@example.com> temporarily rejected RCPT <bar@example.com>

$ host foo.example.com
Host foo.example.com not found: 2(SERVFAIL)

Yes: if you have a host-based blacklist which contains names as well as IP addresses, Exim will defer messages if it gets a “temporary” DNS failure when looking up a name on the list. So not only did the owner of foo.example.com screw me over by sending me spam, but their broken DNS also deferred all of my incoming mail. Excellent. I’ll let you know if the internet comes up with a solution to this other than the per-domain one.
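One blunt fix, hinted at by the dns_again_means_nonexist line in the debug output above, is to turn that option on in the Exim main configuration. A sketch – with the caveat that, set like this, it affects every hostname Exim looks up, not just blacklist entries:

```
# Exim main configuration (sketch): treat a temporary DNS error (SERVFAIL)
# as "no such host" when matching host lists, instead of deferring the
# message. '*' applies it to all domains; a narrower domain list is safer.
dns_again_means_nonexist = *
```

This trades one failure mode for another: a host whose DNS is temporarily broken will simply fail to match the list, rather than holding up unrelated mail.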

Yubico Yubikey 4: PGP, U2F and other things

I recently bought myself a Yubikey 4, as previously mentioned.

Here’s what I’ve managed to do with it so far …

General observations

Since this is very new hardware, it comes as no surprise that the versions of the Yubikey tools in Debian stable aren’t new enough to talk to it. I used the Windows versions off the Yubico website to have a poke around the settings, but if the device is already in OTP+U2F+CCID mode, then that should cover all bases.

The device arrived just over one business day after I’d ordered it, sent from a UK distributor – having had to order and pay in dollars via PayPal, it wasn’t clear that delivery would be so snappy. Well done Yubico.

Physically, it fits nicely onto my keyring and is barely any more noticeable than the two small USB drives and other bits and pieces hanging on there. Though its arrival did inspire me to purge my keyring of quite a lot of keys I’d been carrying around for years and almost never using. Weight is a concern when you’re dangling your keys on the end of a USB-connected device (mostly in case they pull it out of the port rather than the risk of physical damage in this case), so digging out an extension cable to let the keys rest on my desk was necessary to connect to my desktop. Laptops are naturally more convenient here.

U2F (Universal Second Factor)

I didn’t buy it for this particularly – until there’s Firefox support, U2F is a bit too edge-case for me. However, with the addition of the relevant udev rules (easily Google-able), it does work with Google Chrome on Debian and is a more convenient alternative to the Google-Authenticator based two-factor auth I already had enabled on my Github and Google accounts. The nice thing is that both services will fall back to accepting codes from the app on your phone if a U2F token isn’t present, so it’s low-risk to turn this on. The Yubikey is more convenient than having to dig out my phone and transcribe a code, though (not least because a security-conscious nerd like me now has over a screenful of entries in the app) – and is also impervious to the phone running out of battery or otherwise failing.
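For reference, the udev rule in question is along these lines – the file name is arbitrary, and 1050 is Yubico’s USB vendor ID:

```
# /etc/udev/rules.d/70-u2f.rules (name is arbitrary)
# Grant console users access to Yubico devices so Chrome can talk U2F.
SUBSYSTEM=="hidraw", ATTRS{idVendor}=="1050", TAG+="uaccess"
```

Reload with `sudo udevadm control --reload` and re-plug the key.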


OTP (one-time passwords)

Other than confirming it works on Yubico’s demo site, I’ve not found a use for this just yet. It’s worth noting that, if I understand correctly, it relies on Yubico having the corresponding public key to the private key embedded in the device, so each use requires authenticating against Yubico’s servers. It is possible to insert your own key onto the device and run your own auth service, but that seemed like a lot of effort for one person (and it was hard to convince myself it’d be more secure in the long run). Meanwhile, I see I could use the Yubikey as a second factor for key-based SSH authentication, or even as a second factor for sudo. But in both cases it would require Yubico’s servers being up to authenticate the second factor, and I’m not sure what my backup plan for that would be. At the very least, I’d want to own a second Yubikey as a fallback before locking down any service to the point where I couldn’t get in without one.


PGP

This was the main use-case I bought the Yubikey for. Previously I’d had a single RSA PGP key which I used for both encrypting and signing. The difficulty with this set-up is that you end up wanting the private key on half a dozen machines which you regularly use for e-mail, but the more places you have it, the more likely it is to end up on a laptop that gets stolen or similar.

For my new key, I’ve switched to a more paranoid setup which involves keeping the master key completely offline, and using sub-keys on the Yubikey to sign/encrypt/authenticate – these operations are carried out by a dedicated chip on the Yubikey and the keys can’t be retrieved from it. You’re still required to enter a PIN to do these operations, in addition to having physical possession of the Yubikey and plugging it in. However, the nice trick about the sub-keys is that in the event of the Yubikey being lost/stolen, you can simply revoke them and replace them with new ones – without affecting your master key’s standing in the web of trust.

The Yubikey 4 supports 4096 bit RSA PGP keys – unlike its predecessors, which were capped at 2048 bits. To make all this work, you need to use the 2.x series of GnuPG – 1.x has a 3072 bit limit on card-based keys and even that turned out to be more theoretical than achievable. I couldn’t initially persuade GnuPG 2.x (invoked as gpg2 on Debian/Ubuntu) to recognise the Yubikey, but a combination of installing the right packages (gnupg-agent, libpth20, pinentry-curses, libccid, pcscd, scdaemon, libksba8) and backporting the libccid configuration from a newer version finally did the trick, with gpg2 --card-status displaying the right thing. Note that if running Gnome, some extra steps may be required to stop it interfering with gnupg-agent (but you should get a clear warning/error from gpg2 --card-status if that’s the case).

I generated my 4096 bit subkeys on an offline machine (a laptop booted from a Debian live CD) and backed them up before using “keytocard” in gpg to write them to the Yubikey.
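The procedure, in rough outline – the key ID is a placeholder, and the indented lines are commands given at the interactive gpg> prompt rather than the shell:

```shell
# Offline subkey generation and transfer to the Yubikey (sketch, gpg2)
gpg2 --expert --gen-key          # generate the 4096-bit master key offline
gpg2 --edit-key YOUR_KEY_ID      # then, at the gpg> prompt:
#   addkey                       #   create sign/encrypt/auth subkeys
#   key 1                        #   select a subkey...
#   keytocard                    #   ...and move it to the Yubikey
#   save
# Back up ~/.gnupg *before* keytocard: the private subkey is moved to the
# card, not copied, and cannot be read back off it afterwards.
```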

I got into a bit of a tangle by specifying a 5 digit PIN – gpg (1.x) let me set it but then refused to accept it, saying it was too short! Fortunately I’d set a longer admin PIN and the admin PIN can be used to reset the main PIN. I’m not sure if that was just a gpg 1.x bug, but I’ve gone for longer PINs to be safe.

Having waded through all that, the pay-off is quite impressive: Enigmail, my preferred solution for PGP and e-mail, works flawlessly with this set-up. It correctly picks and uses the sub-keys marked for encryption, and it prompts for the Yubikey PIN exactly as you’d expect when signing, encrypting and decrypting. It’s worth having a read of the documentation, and giving some consideration to the “touch to sign” facility. This goes some way towards making up for the Yubikey’s weakness compared to some more traditional PGP smart cards: it doesn’t have a trusted keypad for you to enter your PIN, so there’s always the risk of the PIN being intercepted by a keylogger. But touch to sign means malware can’t ask the key to sign/encrypt/decrypt without you touching the button on the key for each operation. In any case, this set-up is a quantum leap over my old one and means I can make more use of PGP on more machines, with less risk of my key being compromised.

It’s worth noting that the Yubikey is fully supported on Windows and is recognised by GnuPG there too. This set-up might finally persuade me that a Windows machine can be trusted with doing encrypted e-mail.

Making the Yubikey 4 talk to GPG on Debian Jessie

I’ve just bought myself a Yubikey 4 to experiment with.

The U2F and OTP features are of some interest, but the main thing I bought it for was PGP via GnuPG. I was disappointed to discover that it works (at least as far as gpg --card-status showing the device) on current Ubuntu (15.10) and even on Windows (!), but not on Debian stable (Jessie). Still, this is quite a new device…

In every case on Debian/Ubuntu, you need to apt-get install pcscd

For the non-internet-connected machine on which I generate my master PGP key and do key signing, I can just use the Ubuntu live CD, but since my day-to-day laptop and desktop are Debian, this does need to work on Debian stable for it to be of much use to me. A bit of digging revealed this handy matrix of devices, and the knowledge that support for the Yubikey 4 was added to libccid in 1.4.20. Meanwhile, jessie contains 1.4.18-1. Happily, a bit more digging revealed that retrieving and using the testing package’s version of /etc/libccid_Info.plist was enough to make it all start working:

[david@jade:~]$ gpg --card-status                            (28/11 20:20)
gpg: detected reader `Yubico Yubikey 4 OTP+U2F+CCID 00 00'
Application ID ...: XXX
Version ..........: 2.1
Manufacturer .....: unknown
Serial number ....: XXX
Name of cardholder: David North
Language prefs ...: en
Sex ..............: male
URL of public key : [not set]
Login data .......: david
Private DO 1 .....: [not set]
Private DO 2 .....: [not set]
Signature PIN ....: not forced
Key attributes ...: 2048R 2048R 2048R
Max. PIN lengths .: 127 127 127
PIN retry counter : 3 0 3
Signature counter : 0
Signature key ....: [none]
Encryption key....: [none]
Authentication key: [none]
General key info..: [none]

I’ve raised a bug asking if the new config can be backported wholesale to stable, but meanwhile, you can copy the file around to make it work.
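The workaround amounts to dropping the newer file in place and restarting pcscd – a sketch, assuming you’ve already extracted /etc/libccid_Info.plist from the testing (libccid ≥ 1.4.20) package into the current directory:

```shell
# Back up the stable version, install the newer device table, restart pcscd
sudo cp /etc/libccid_Info.plist /etc/libccid_Info.plist.jessie
sudo cp libccid_Info.plist /etc/libccid_Info.plist
sudo systemctl restart pcscd
```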

For U2F to work in Google Chrome, I needed to fiddle with udev as per the suggestions you can find if you Google about a bit. The OTP support works straight away, of course, as it’s done by making the device appear as a USB keyboard.

When my beeper goes off

As regular readers will know, I run a co-located server with a few friends. I’m also responsible for things like the WiFi network at church. Inevitably, these things run just fine for weeks, months or even years at a time – and then fail. The first I hear of it is usually from a would-be user, sometimes days later.

The world of open-source has a solution to this, and it’s called Nagios (actually, I think the cool kids have jumped ship to the more recent Icinga, but I haven’t played with that). It’s a monitoring system – the idea is that you install it somewhere, tell it what computers / devices / websites to monitor, then it can e-mail you or do something else to alert you when things fail or go offline.

The really nice thing about it is that it’s all pluggable – you can write your own checks as scripts, and also provide your own notification commands, for example to send you a text.
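A custom check is just a script that prints one line of status and exits 0, 1, 2 or 3 for OK, WARNING, CRITICAL or UNKNOWN respectively. A hypothetical disk-usage check (the name and thresholds are invented for illustration) might look like:

```shell
#!/bin/sh
# check_disk_pct: hypothetical custom Nagios plugin (sketch).
# The plugin contract: print a one-line status, then exit with
# 0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN.
check_disk_pct() {
    warn=${1:-80}; crit=${2:-90}
    # Percentage used on the root filesystem, without the '%' sign
    used=$(df -P / | awk 'NR==2 { sub(/%/, "", $5); print $5 }')
    if [ "$used" -ge "$crit" ]; then
        echo "DISK CRITICAL - ${used}% used on /"; return 2
    elif [ "$used" -ge "$warn" ]; then
        echo "DISK WARNING - ${used}% used on /"; return 1
    else
        echo "DISK OK - ${used}% used on /"; return 0
    fi
}
check_disk_pct "$@"
```

Nagios then only needs a command definition pointing at the script, with the thresholds as arguments.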

The only problem is, where to put it? Hosting it on the co-lo box I’m trying to monitor is a bit self-defeating. Fortunately this is a rather good use for that Raspberry Pi I had lying around – I found a convenient internet link to plug it into, removed all the GUI packages, and it’s running Nagios just fine. So far, without really even trying, I’ve got it monitoring 76 services across 29 hosts. Some of the checks are a bit random – e.g. a once-per-day check on whether any of my domain names have expired – but now it’s possible to buy ten-year renewals it’s nice to have an eye on this sort of thing, as who knows if one’s registrar will still have the right contact details to send reminders in a decade?*

So far it’s working a treat, texting me about problems so I can fix them before users notice, although if I add many more checks I suspect I’ll have found an excuse to invest in a Raspberry Pi 2 to keep up with the load.

If you’re doing it right, you’ll configure Nagios to know which routers and computers stand between it and the rest of the internet (and hence between it and what it’s trying to monitor). This means it won’t generate a slew of alerts if its internet connection dies, just one. It also means you get a useful map of what you’re monitoring.
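In Nagios configuration terms, this is the parents directive – a sketch with invented host names and documentation-range addresses:

```
# If 'home-router' goes down, Nagios marks hosts behind it UNREACHABLE
# rather than DOWN, and sends a single alert for the router itself.
define host {
    use        generic-host
    host_name  home-router
    address    192.0.2.1
}
define host {
    use        generic-host
    host_name  colo-server
    address    203.0.113.10
    parents    home-router
}
```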

* I’m pretty sure Mythic will do, because they’re great, but it never hurts to have a backup in place.

Windows and SSDs

So, having had such good results with the SSD I put in my ageing desktop, I decided to get one for my ageing laptop too. I did the copying a bit differently – since the SSD was 50GB smaller than the spinning rust it was replacing, a straight dd wasn’t going to work. In any case, it’s a bit boring waiting hours for blank space to be copied across. There was only 30GB of actual data.

I connected both drives to another machine – I couldn’t get hold of a SATA caddy to have them both on the laptop – and again, booted to SystemRescueCD. This time, though, I used GParted to copy-paste the boot partition, made an empty NTFS “main” partition and used rsync to copy the contents of the old C: drive to the new. Much faster to just copy the data. Some quirk of ntfs-3g or rsync seems to have lost the “hidden” attribute from some files, but otherwise it all worked.
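The copy step amounted to something like this – the device names and mount points here are illustrative, not the ones I actually used:

```shell
# After cloning the boot partition in GParted and creating the new,
# empty NTFS partition on the SSD:
mkdir -p /mnt/old /mnt/new
ntfs-3g /dev/sdb2 /mnt/old      # old C: drive (spinning rust)
ntfs-3g /dev/sdc2 /mnt/new      # new NTFS partition on the SSD
rsync -aH /mnt/old/ /mnt/new/   # copy the data, preserving hard links
# Note the trailing slashes: copy the *contents* of old into new,
# not the directory itself.
```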

You’ll have realised from my mention of NTFS that this laptop runs Windows. The purists may call me a traitor to Linux, but having at least one Windows box around is useful for all sorts of things, and this is mine.

The final annoyance was that Windows refused to boot from the new drive:

A required device is inaccessible

It did however prompt me to boot from the Windows CD and hit repair, which worked. Given how rapidly it worked, I can only assume Microsoft write the UUID of the drive to the boot sector or some such, as a tedious anti-piracy measure.

The SSD magic definitely seems to have made this laptop usable again – previously it was slow to the point of being unusable and you could hear the disk crunching inside it. Hopefully the SSD should give battery life a slight boost and be more robust against being dropped or jolted too.

Since I’m now all SSD’d up, I’ve got my desktop running DBAN to erase the old mechanical hard disks I’ve replaced. Did I mention how handy having a network boot server is?

Down the rabbit hole: OAuth, service accounts and Google Apps

I have quite a few small programs which use Google’s APIs for one thing or another:

  • Updating pages on Google Sites automatically
  • Reading a Google Docs spreadsheet and sending SMS reminders for a rota inside it
  • Reading a Google Calendar

Until recently, reading a public Google Calendar didn’t require authentication – one could simply consume the XML from the calendar’s URL and work with it using the GData API. Google knocked that on the head at some stage. Shortly afterwards, I woke up to a slew of error e-mails marking the apocalypse.

The apocalypse is how I refer to the day when Google finally removed the ability for scripts like mine to authenticate by presenting a username and password. For sure, hard-coding these into a script is not good practice, but it worked well for many years, and was a lot simpler and better documented than the alternative…


Much has been written elsewhere about OAuth, but the main problem I had was that Google’s examples all seemed to centre around the idea of bouncing a user in their web browser to a prompt to authorize your use of their data. This is all very well for interactive web applications (and, indeed, much better than asking users to trust you with their password), but where does it leave my non-interactive scripts?

I eventually dug up the documentation on service accounts. The magic few lines of Python are these:

import json
from oauth2client.client import SignedJwtAssertionCredentials

json_key = json.load(open('pkey.json'))
scope = ['https://spreadsheets.google.com/feeds']
credentials = SignedJwtAssertionCredentials(json_key['client_email'],
                                            json_key['private_key'], scope)

The JSON for the key can be obtained via the Google APIs developer console. Most Google APIs, and things built on top of them, can then take the credentials object, and authenticate as a service account. Apparently you can also do really clever things like impersonating arbitrary users in your Google Apps domain – for example, to send calendar invites as them – but that’s for another day.