Office 365

I want to give Microsoft Office 365 a chance. But it’s just a terrible and incomplete experience. Herein my tale of woe.


I received a link to a folder from a co-worker. I need to keep up-to-date on the files within it. Great news, there’s a “Follow” button that looks like it will do the trick:

office 365 follow

Except, when I click on it, I get an error:

office 365 follow error

Ok, worst case, I guess I can just sync the folder to my local filesystem and monitor it that way. There is a big shiny sync button that leads to this:

office 365 sync

Well, it makes sense that I’d need to download a desktop app to perform the sync, like with Dropbox. What happens when I click “Get the OneDrive for Business app that’s right for me”?

onedrive fail

“Only on a PC”? That’s … disappointing.


Microsoft really should know that table stakes for cloud-hosted services are clients that work seamlessly across multiple devices. After this experience, they’ll have a really hard job convincing me to take another look in a few years’ time.


Mastering the mobile app challenge at Adobe Summit

I’m presenting a two-hour “Building mobile apps with PhoneGap Enterprise” lab at Adobe Summit in Salt Lake City on Tuesday, with my awesome colleague John Fait. Here’s a sneak preview of the blurb which will be appearing over on the Adobe Digital Marketing blog tomorrow. I’m posting it here as it may be interesting to the wider Apache Cordova community to see what Adobe are doing with a commercial version of the project…


Mobile apps are the next great challenge for marketing experts. Bruce Lefebvre sets the scene perfectly in So, You Want to Build an App. In his mobile app development and content management with AEM session at Adobe Summit he’ll show you how Adobe Marketing Cloud solutions are providing amazing capabilities for delivering mobile apps. It’s a must-see session to learn about AEM and PhoneGap.

But what if you want to gain hands-on practical experience of AEM, PhoneGap, and mobile app development? If you want to roll up your sleeves and build a mobile app yourself, then we’ve got an awesome lab planned for you. In “Building mobile apps with PhoneGap Enterprise”, you’ll have the opportunity to create, build, and update a mobile application with Adobe Experience Manager. You’ll see how easy it is to deliver applications across multiple platforms. You’ll also learn how you can easily monitor app engagement through integration with Adobe Analytics and Adobe Mobile Services.

If you want to know how you can deliver more effective apps, leverage your investment in AEM, and bridge the gap between marketers and developers, then you need to attend this lab at Adobe Summit. Join us for this extended deep dive into the integration between AEM and PhoneGap. No previous experience is necessary – you don’t need to know how to code, and you don’t need to know any of the Adobe solutions, as we’ll explain it all as we go along. Some of you will also be able to leave the lab with the mobile app you wrote, so that you can show your friends and colleagues how you’re ready to continuously drive mobile app engagement and ROI, reduce your app time to market, and deliver a unified experience across channels and brands.

Are you ready to master the mobile app challenge?


All hyperbole aside, I think this is going to be a pretty interesting technology space to watch:

  • Being able to build a mobile app from within a CMS is incredibly powerful for certain classes of mobile app. Imagine people having the ability to build mobile apps with an easy drag-and-drop UI that requires no coding. Imagine being able to add workflows (editorial review, approvals etc) to your mobile app development.
  • No matter how we might wish we didn’t have these app content silos, you can’t argue against the utility of content-based mobile apps until the mobile web matures sufficiently so that everyone can build offline mobile websites with ease. Added together with over-the-air content updates, it’s really nice to be able to have access to important content even when the network is not available.
  • Analytics and mobile metrics are providing really useful ways to understand how people are using websites and apps. Having all the SDKs embedded in your app automatically with no extra coding required means that everyone can take advantage of these metrics. Hopefully this will lead to a corresponding leap in app quality and usability.
  • By using Apache Cordova, we’re ensuring that these mobile app silos are at least built using open standards and open technologies (HTML, CSS, JS, with temporary native shims). So when mobile web runtimes are mature enough, it will be trivial to switch front-end app to front-end website without retooling the entire back-end content management workflow.

Exciting times.


Login problems on Mac OS X Snow Leopard

These are notes from a tech support call with my parents last night, saved here for the next time stuff breaks.

If you’re running Mac OS X Snow Leopard (and possibly other versions), you may find you can’t log in. Symptoms are:

  • You click on your username and enter your password
  • The login screen is replaced by a blue screen for a short time
  • You are returned to the login screen.

After searching the interwebs I found Fixing a Mac OSX Leopard Login Loop Caused by Launch Services. It seems the problem is caused by corrupted cache files (which could be caused by the computer shutting down abruptly, or may just be “one of those things” that happens from time to time). This gave me enough information to come up with these “easy” steps to resolve it:

  1. Log in to the Mac as a different user*
  2. Press cmd-space to open Spotlight, type “Terminal”, and click on the Terminal application.
  3. Work out the broken user’s username by typing: ls /Users and look for the appropriate broken account name e.g. franksmith or janedoe.
  4. Find out the user ID of the user from the previous step by typing: id -u janedoe which will print a number something like 501
  5. Delete the user’s broken Launch Services cache file. In the following command, be sure to substitute the correct username (in place of janedoe) and the correct user ID after the 023 (in place of the 501): su -l janedoe -c 'rm /Library/Caches/com.apple.LaunchServices-023501.csstore' (be very careful with this, you don’t want to delete the wrong things).
    • If you’re confident with backticks you could of course skip step 4 and instead of step 5 do: su -l janedoe -c 'rm /Library/Caches/com.apple.LaunchServices-023`id -u janedoe`.csstore'
  6. Test by logging in to the troublesome user account.
Note that if you had any apps configured to launch at login, you may need to re-add these.

* This makes me think it’s good practice when setting up a Mac to always set up an extra user account, just in case stuff breaks.


Super markets

I’ve been using our local Lidl recently, because their policy of regularly baking throughout the day means I can pick up fresh croissants and pains au chocolat whenever I go, whereas the local Tesco, Sainsburys, and Waitrose have usually run out by mid-morning. Are the so-called discount supermarkets really cheaper than the mainstream supermarkets? Here’s the result of one unscientific survey.

This morning I checked my till receipt against Tesco online.

There are some items that cost the same regardless of which supermarket (fabric softener, fresh orange juice). There are some items that don’t have direct equivalents across stores, so price comparisons aren’t possible. And there are some items where the price is not significantly different (fresh milk, toilet paper).

On today’s basket of comparable items, Lidl was £10.62 cheaper (costing £18.46 instead of £29.08).

There are some real eye-openers. Eggs are 1.5x more expensive at Tesco. Fresh vegetables were often almost twice the price at Tesco. And what about my fresh croissants and pains au chocolat? £0.29 and £0.39 at Lidl, vs £0.80 each at Tesco. Over twice the price — on today’s shop, buying just these alone saved me £4.70. And they were fresh from the oven, still warm when I got them home.

Netflix versus Blu-Ray

Which is better … streaming content or buying shiny plastic discs?

It’s surely an unfair comparison because the constraints of a service like Netflix (delivering uninterrupted video over a range of network qualities) are very different to delivering content on physical media. But I was looking to justify my purchase of Battlestar Galactica (BSG) on Blu-Ray so here’s some screen captures as a very unscientific comparison. The screen captures are only approximately the same frame on each media, but should be close enough for a rough comparison.

The Netflix screen captures were done streaming on a fibre internet connection using a Mac with Silverlight installed, but with no other adjustments. The Battlestar Galactica Blu-Ray claims to be “High Definition Widescreen 1.78:1” (but doesn’t define high definition further … is it supposed to be 720p or 1080p?).

Here’s the two sources at 100% magnification (click for full-size):

Battlestar Galactica Netflix vs Blu-Ray 100%

And at 200% magnification:

Battlestar Galactica Netflix vs Blu-Ray 200%

And at 300% magnification:

Battlestar Galactica Netflix vs Blu-Ray 300%

In this instance, Netflix does a pretty good job, although it looks a little blurry. BSG is probably a bad choice for showing off Blu-Ray in comparisons: it’s heavily processed to make it look grainy like film, and almost every single scene is very dark. I doubt it was shot on the latest HD cameras, either (the miniseries first aired in 2003, the full series began in 2004). But since more recent series like Game of Thrones aren’t available on Netflix, this is the best I’ve got for comparisons at the moment.

Of course, even if the Blu-Ray and Netflix versions were identical, a big benefit of those shiny plastic discs is they can be used off-line and don’t require a continuing subscription.


Home Automation: buttons on the desktop

I want a button on my Mac’s desktop to turn on or turn off the lights I have controlled by the Raspberry Pi. Here’s what I’ve got so far.

First, I wrote a script on the Pi to turn the lights on:

/usr/bin/tdtool --on 1
/usr/bin/tdtool --on 2
/usr/bin/tdtool --on 3

For everything here, I also wrote the equivalent to turn the lights off.

Next, I tested running the script via SSH from my Mac:

ssh -f user@raspberrypi.local /home/user/ &>/dev/null

You can use the Applescript Editor to make pseudo-apps, so I wrote an Applescript to execute the SSH command:

tell application "Terminal"
    do script "ssh -f user@raspberrypi.local /home/user/ &>/dev/null"
end tell

delay 15

tell application "Terminal"
    -- close Terminal again once the command has had time to run
    quit
end tell

I saved this as file format “Application” in my Applications folder, then dragged it to my desktop (an alias is automatically created, leaving the original in Applications). I can now double-click the app and the script runs, or launch it from Spotlight or the wonderful Alfred:
It works but it’s ugly, as I have a Terminal window pop up for ~15 seconds. There must be a better way.

After some searching, I came across Use automator to create X11 shortcuts in OSX. It’s similar to the Applescript trick, but uses Automator instead, so there’s no need for Terminal. I put the SSH command into the “Run shell script” workflow action, and saved it as an application:

It works! Now I can turn the lights on or off without the Terminal window popping up.

There are a couple of issues:

  • The tdtool command takes a long time to execute on the Pi. It takes about three seconds for all three plugs to switch, whereas the smart plug remote has an “all on / all off” button which is instant. I need to find out why the command is so slow, and/or a way to control all three in one go.
  • I don’t really want “Lights on” and “Lights off” apps; I want a single app that toggles the state. This could be done by making the server-side script smarter, but I’d really like the app icon to reflect the status of the lights too.
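The toggle idea could be handled server-side with a small state file on the Pi. Here’s a rough sketch; the state file location and the device IDs 1-3 are my assumptions, and tdtool is the telldus command used above:

```shell
#!/bin/sh
# Sketch of a server-side toggle. The state file location and the device
# IDs 1-3 are assumptions; tdtool is the telldus command used above.
STATE_FILE="${STATE_FILE:-$HOME/.lights_state}"
TDTOOL="${TDTOOL:-/usr/bin/tdtool}"

toggle_lights() {
    if [ -f "$STATE_FILE" ] && [ "$(cat "$STATE_FILE")" = "on" ]; then
        for id in 1 2 3; do "$TDTOOL" --off "$id" >/dev/null; done
        echo off > "$STATE_FILE"
    else
        for id in 1 2 3; do "$TDTOOL" --on "$id" >/dev/null; done
        echo on > "$STATE_FILE"
    fi
    cat "$STATE_FILE"   # report the new state
}
```

The Mac-side app would then call this one script over SSH instead of separate on/off scripts. It still doesn’t solve the app-icon problem, though.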
Room for improvement, but this is good enough for now.

Next up: automating sunsets.

Home Automation: Turn it on again

In Pi three ways I wrote:

what happens when you combine an RF transmitter, smart sockets with RF receivers, and a Raspberry Pi?

I’m still finding out what can be done, but this is what I’ve discovered so far.


This is what I bought:

bye bye standby kit

First, I set up each of the smart plugs, and paired them with the remote control.

This is already a big improvement: my home office has bookshelves filled with LED lights, but with inaccessible switches. Being able to turn them all on and off from the remote is awesome, but I’d really like them to turn on automatically, for example at sunset. So I need some compute power in the loop. Time for the Pi.

The Pi is running Raspbian. I followed the installation instructions for telldus on Raspberry Pi. See also R-Pi Tellstick core for non-Debian instructions.

Next I tried to figure out the correct on/off parameters in tellstick.conf for the smart plugs. The Tellstick documentation is a bit sparse. Tellstick on Raspberry Pi to control Maplin (UK) lights talks about physical dials on the back of the remote control; sadly the Bye Bye Standby remote doesn’t have this.

Each plug is addressed using a protocol and a number of parameters. In the case of the Bye Bye Standby, it apparently falls under the arctech protocol, which has four different models, and each model uses the parameters “house” and sometimes “unit”.

Taking a brute-force approach, I generated a configuration for every possible parameter for the arctech protocol and codeswitch model:

count=0
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
        cat <<EOF
device {
   id = $count
   name = "Test"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "$house"
      unit = "$unit"
   }
}
EOF
        count=$((count + 1))
    done
done
I then turned each of them on and off in turn, and waited until the tellstick spoke to the plugs:

count=0
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
        echo "id = $count, house = $house, unit = $unit"
        tdtool --on $count
        tdtool --off $count
        count=$((count + 1))
    done
done

This eventually gave me house E and unit 16 (and the number of the corresponding automatically generated configuration, 80):

tdtool --on 80
Turning on device 80, Test – Success

bye bye standby plugs

But this only turned on or off all three plugs at the same time. I wanted control over each plug individually.

I stumbled upon How to pair Home Easy plug to Raspberry Pi with Tellstick, and that gave me enough information to reverse the process. Instead of getting the tellstick to work out what code the plugs talk, in theory I need to get the tellstick to listen to the plug for the code.

So this configuration should work, in combination with the tdlearn command:

device {
   id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "selflearning-switch"
   parameters {
      house = "1"
      unit = "1"
   }
}

However this tiny footnote on the telldus website says: 

Bye Bye Standby Self learning should be configured as Code switch.

So it seems it should be:

device {
   id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "1"
      unit = "1"
   }
}

… which is exactly what I had before. Remembering of course to do service telldusd restart each time we change the config, I tried learning again:

tdtool --learn 1
Learning device: 1 Study Right - The method you tried to use is not supported by the device

Well, bother. Looking at the Tellstick FAQ:

Is it possible to receive signals with TellStick?

TellStick only contains a transmitter module and it’s therefore only possible to transmit signals, not receive signals. TellStick Duo can receive signals from the compatible devices.

So it seemed like I was stuck with all-on, all-off unless I bought a TellStick Duo. Alternatively, I could expand my script to generate every possible combination in the tellstick.conf, and see if I can work out the magic option to control each plug individually. But since house codes can range from 1 to 67108863, this could take some time.

Rereading Bye Bye Standby 2011 compatible? finally gave me the answer. You put the plug into learning mode, and get the Tellstick to teach the right code to the plug by sending an “off” or an “on” signal:

tdtool --off 3

So setting house to a common letter and setting units to sensible increments, I can now control each of the plugs separately.
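With each plug taught its own code, tellstick.conf ends up with one device block per plug, along these lines (the names, house letter, and unit numbers are illustrative examples, not my actual values):

```
device {
   id = 1
   name = "Study Left"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "1"
   }
}
device {
   id = 2
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "2"
   }
}
```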

Next up: some automation.


Mobile modem

I was trying to get a handle on how much mobile data download speeds have improved over the years, so I did some digging through my archives. (The only thing I like more than a mail archive that spans decades is a website that spans decades. Great work, ALUG!) Here’s some totally arbitrary numbers to illustrate a point.

Sony-Ericsson t68i

In response to A few Questions, this is what I wrote in May 2002:

[Nokia] 7110 works fine with IR (7110 has a full modem). The 7110 supports 14.4bps
connections, but I bet your telco doesn’t :-)

That should have been 14.4kbps (14,400bps). In 2002 the phones were ahead of the network’s ability to deliver. In 2014, not much has changed.

In GPRS on Debian, this is what I wrote in November 2002:

I finally took the plunge and went for GPRS [..]  (up to 5x the speed of a dialup connection over a GSM mobile connection)

Remember when we did dialup over mobile connections? GSM Arena on the Sony-Ericsson T68i states 24-36 kbps. I’m assuming I got the lower end of that.

Nokia 3650

In 2003 I was using a Nokia 3650 GPRS connection. GSM Arena on the Nokia 3650 states 24-36 kbps. Let’s be generous and assume reality was right in the middle, at 30 kbps.

In 2004 I got a Nokia 6600, which according to GSM Arena could also do 24-36 kbps. It was a great phone, so let’s assume the upper bound for the 6600.

In 2008 I upgraded to 3G with the Nokia N80, and wrote:

3G data connections are dramatically better than GPRS

… but sadly I didn’t quantify how much better. According to GSM Arena, it was 384 kbps.

That’s a pretty good and pretty dramatic speed increase:

Mobile download speeds 2003-2008

Nokia N900

But then in 2009 I was using the Nokia N900 (and iPhone, HTC Hero, Google Nexus One, …). GSM Arena on the Nokia N900 states a theoretical 10 Mbps … quite the upgrade, except O2 were limited to 3.6 Mbps.

In 2012 I was using the Samsung Galaxy SII. GSM Arena on the Samsung Galaxy SII promises 21 Mbps.

And now the Sony Xperia Z Ultra supports LTE at 42 Mbps and 150 Mbps. Sadly, the networks don’t yet fully support those speeds, but if they did, the chart would be truly dramatic. 2003-2008 starts to look like a rounding error:
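Putting rough numbers on that claim, using the speeds quoted above (and assuming 30 kbps for GPRS in 2003 and the 150 Mbps LTE theoretical maximum):

```shell
# Approximate download speed multipliers; speeds in kbps from the figures above.
awk 'BEGIN {
    printf "GPRS (2003) to 3G (2008):   %.1fx\n", 384 / 30
    printf "3G (2008) to HSDPA (2009):  %.1fx\n", 3600 / 384
    printf "3G (2008) to LTE (if the networks delivered): %.1fx\n", 150000 / 384
}'
```

So the 2003-2008 jump is about 13x, while a full-rate LTE connection would be nearly 400x the 2008 baseline.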

Mobile download speeds 2003-2013

I don’t need to use a modem or infrared, either. Things have really improved over the last twelve years!

(This post is probably best read in conjunction with Tom’s analysis of Mobile phone sizes.)


Multi media

My movie collection is a bit of a mishmash, a bunch of different file formats all sat on a Drobo. In the early days I would create AVI, MKV or MP4 rips of my DVDs depending on how and where I wanted to watch them. Sometimes the rips would be split across multiple files. More recently I just copied the DVD wholesale, for conversion later. As a result, ensuring a consistent set of files to copy onto my phone or tablet is a bit of a pain.

With the arrival of the RaspBMC media server, I decided to clean everything up. Some constraints I set:

  • I want to avoid loss of quality from source material (so no re-encoding if possible, only copying).
  • I should be able to do everything from the command line so it can be automated (manipulating video files can be a slow process even without encoding).
  • I want to combine multiple DVDs where possible for easier viewing.
  • My end goal is to have MKV files for most things.

Here’s what I’ve got working so far. Bug fixes and improvements welcome.


AVI files

You can glue AVI files together (concatenate them) and then run mencoder over the joined up file to fix up the indexes:

brew install mplayer
cat part1.avi part2.avi > tmp.avi && \
/usr/local/bin/mencoder -forceidx -oac copy -ovc copy tmp.avi -o whole.avi

This forces mencoder to rebuild the index of the AVI, which allows players to seek through the file. It encodes with the “copy” audio and video codec, i.e. no encoding, just streamed copying.


MKV files

MKV is Matroska, an open-standard, open-source video container format. The process is similar to AVI files, but the mkvmerge tool does everything for you:

brew install mkvtoolnix
/usr/local/bin/mkvmerge -o whole.mkv part1.mkv +part2.mkv

This takes the two parts and joins them together. Again, no re-encoding, just copying.


DVD rips

I started using RipIt to back up my DVDs; it can automatically encode DVDs, but once I got my Drobo I opted to keep the originals, so I always have the option to re-encode on a case-by-case basis for the target device without losing the best quality original.

I don’t need to touch most of the DVD copies, but a number of my DVDs are split across several discs, for example Starship Troopers and The Lord of the Rings.

One option would be to encode each DVD at the highest possible quality and then merge the AVI or MKV using the mechanisms above, but I want to avoid encoding if possible.

It turns out that the VOB files on a DVD are just MPEG files (see What’s on a DVD? for more details), so there’s no need to convert to AVI or MP4. We can glue them together as we did with the AVIs, then package them as MKV. The basic method is:

cat *.VOB > movie.vob

The problem is that we need to be selective about the VOB files that are included; there’s no point including DVD menu and setup screen animations, for example. A dirty hack might be to select only the VOB files bigger than a certain threshold size, and just hope that the movie is divided into logical chunks. Something like this, run in a movie directory:

find -s . -name '*.VOB' -size +50M

There’s a catch: the first VOB (vts_XX_0.vob) always contains a menu, so we need to skip those, and we don’t want the menu/copyright message (video_ts.vob):

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M 

We can then use ffmpeg to copy the output of find (a list of our VOB files) into an MKV file. So far we’re assuming we only want the first audio stream (usually English), and I haven’t investigated how best to handle subtitles yet. The command is:

ffmpeg -i - -vcodec copy -acodec copy foo.mkv

There are a couple of issues with this: the concatenated VOB stream doesn’t carry clean presentation timestamps, so ffmpeg needs -fflags +genpts to regenerate them; and we should explicitly set the output container and copy any subtitle streams too (-f matroska and -c:s copy).

So our final command is:

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M -exec cat {} \; \
| ffmpeg -fflags +genpts -i - -f matroska -vcodec copy -acodec copy -c:s copy foo.mkv

The output should be an MKV file roughly the same size as the constituent .dvdmedia directories. You can test it using mkvinfo foo.mkv, which should output information on the MKV file. For some reason, using file foo.mkv does not recognise it as an MKV file, only as data.


Putting it all together

Now we know how to handle several individual file formats, we can script the whole process.
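As a sketch of that scripting step, a small dispatcher could pick the joining method by file extension. The function name and the whole.* output naming are my own, and it assumes the mplayer and mkvtoolnix installs from above:

```shell
#!/bin/sh
# Sketch: join the split parts of one title into a single file, choosing
# the method by extension. Naming conventions here are illustrative.
join_parts() {
    first="$1"
    case "$first" in
        *.avi)
            shift
            cat "$first" "$@" > tmp.avi &&
                mencoder -forceidx -oac copy -ovc copy tmp.avi -o whole.avi &&
                rm tmp.avi
            ;;
        *.mkv)
            shift
            args="$first"
            for f in "$@"; do args="$args +$f"; done
            # deliberately unquoted so each +part becomes its own argument;
            # paths must not contain spaces
            mkvmerge -o whole.mkv $args
            ;;
        *)
            echo "unsupported format: $first" >&2
            return 1
            ;;
    esac
}
```

Called as join_parts part1.avi part2.avi, or join_parts part1.mkv part2.mkv part3.mkv.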

The next step is to trawl through a disk full of movies and to normalise them into one format. At this point, we’re well into XKCD territory (The General Problem, and Is It Worth The Time?), so that’s left as an exercise for the reader ;-)




Pi three ways

My name’s Andrew and I have a Raspberry Pi addiction.

I’ve played with a lot of IT kit over the years, and set up a number of servers for a variety of reasons. File servers, media servers, web servers; from do-it-yourself bare bones solutions held together with scripts and bits of string, to complete vendor solutions that work flawlessly straight out of the box.

None have come close to the perfect storm of adoption, usability, and economy that the Pi represents. It’s terrific fun and I wish I’d jumped on the bandwagon sooner.

Mobile phone-class CPU boards are nothing new – see for example the BeagleBoard – but they’ve always taken a fair amount of effort to be productive. The high profile and huge community around the Pi makes it much more accessible. It even has an 80s-style magazine, MagPi, that reminds me of fun times typing in programs from computer magazines as a child.

Now I’m addicted to the Pi, the two biggest problems I’m faced with are:

  • What is the correct pluralisation and collective noun for more than one Pi? “A bake of Pis” looks horrible. “A bush of Raspberries Pi” is arguably wrong. “A transcendence of pien” sounds pompous. Suggestions welcome. Until then, I’ll go to lengths to avoid referring to the Pi in the plural.
  • How many is too many? I have two, I’m about to order a third, but I could make an argument for one in every room in the house. And maybe one in the shed, one in the car. One for every television. One in each drawer?

Here are three ways I’m using the Raspberry Pi collection I’m assembling.

The print server

Pi in a Cup

Pi in a cup. For ages I’ve been using a big old clunky PC running Linux as a print server. It was never really called on to do much else, and so it represented power-hungry overkill for the occasional bit of printing.

With minimal tweaks, I got Raspbian installed. It took longer to pick the correct printer from the list in CUPS than it did to get everything else set up. And now I have a low-power, ultra-quiet server and a whole drawer of space reclaimed. And yes, that is an HP TouchPad USB charger powering the Pi…

The media server

I am a huge fan of the original Apple TV: it’s the first device that really made home media make sense on the TV. Back in 2007 I wrote “Apple TV looks like a gorgeous little box”, and it wasn’t long before I bought one. But I soon found it limiting. I wanted to play the high resolution backups of all of my DVDs, but the Apple TV wasn’t powerful enough to do full HD playback. So I “fixed” it with an upgrade in August 2010.

As I wrote later in 2012:

“The AppleTV has been upgraded with a Broadcom Crystal HD chip in the internal PC Card slot – which normally has a wifi card in it. I figured it was worth sacrificing wifi, since with the size of HD video you really want to be using a fast wired network anyway. In order to make use of the HD chip, I have XBMC installed on the AppleTV.”

Sadly, this has always been a bit of a hacked-together solution, even with the polish that the FireCore folk provided with aTV Flash. Occasionally something random would get bit-rot, and I’d have to reinstall. It took ages (at the time) to find the right build of XBMC for CrystalHD hardware support. The AppleTV itself also permanently runs warm, which suggests a fair waste of electricity somewhere.

Just a week ago, I got this cryptic error whenever I tried to use XBMC:

XBMC error

I didn’t have time to debug the problem or rebuild the AppleTV. Raspbmc to the rescue! Easy to install, and with the MPEG2 and VC-1 license keys to unlock the codecs, the Pi has more than enough guts to take over XBMC duties. And in a PiBow case it looks fancy, too:


The only downside is that I can’t easily download and watch movies from the iTunes store with the Raspberry Pi, so I’ll need to keep the AppleTV on standby. Movie studios and content distributors take note: I’ll happily pay for downloadable movies on the Pi. Please make this happen.

The home automation server

This requires a longer post, but what happens when you combine an RF transmitter, smart sockets with RF receivers, and a Raspberry Pi?

pi@raspberrypi ~ $ tdtool --on 80
Turning on device 80, all study lights – Success

I’m still working out how best to make use of this. Ideas include automatically turning on the lights when it gets dark (querying sunrise/sunset information via HTTP?), or tweeting “hey Pi, turn on the lights please”. Maybe using a webcam to detect motion so lights don’t turn on unnecessarily. What else could I do?
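The sunset idea could be sketched with a free sunrise/sunset JSON web service and a cron job on the Pi. The service URL, coordinates, and device IDs below are all my assumptions:

```shell
#!/bin/bash
# Sketch: turn the lights on once the sun has set. The web service,
# coordinates, and device IDs are assumptions; tdtool is the telldus command.

# Extract the UTC sunset time from the service's JSON response.
parse_sunset() {
    sed -n 's/.*"sunset":"\([^"]*\)".*/\1/p'
}

lights_on() {
    for id in 1 2 3; do tdtool --on "$id"; done
}

# Run this from cron every few minutes; ISO-8601 UTC timestamps
# compare correctly as plain strings.
check_and_switch() {
    sunset=$(curl -s 'https://api.sunrise-sunset.org/json?lat=51.5&lng=-0.1&formatted=0' | parse_sunset)
    now=$(date -u +%Y-%m-%dT%H:%M:%S+00:00)
    if [ "$now" \> "$sunset" ]; then
        lights_on
    fi
}
```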

More on that another day …
