Netflix versus Blu-Ray

Which is better … streaming content or buying shiny plastic discs?

It’s surely an unfair comparison because the constraints of a service like Netflix (delivering uninterrupted video over a range of network qualities) are very different to delivering content on physical media. But I was looking to justify my purchase of Battlestar Galactica (BSG) on Blu-Ray, so here are some screen captures as a very unscientific comparison. The screen captures are only approximately the same frame on each media, but should be close enough for a rough comparison.

The Netflix screen captures were done streaming on a fibre internet connection using a Mac with Silverlight installed, but with no other adjustments. The Battlestar Galactica Blu-Ray claims to be “High Definition Widescreen 1.78:1” (but doesn’t define high definition further … is it supposed to be 720p or 1080p?).

Here are the two sources at 100% magnification (click for full-size):

Battlestar Galactica Netflix vs Blu-Ray 100%

And at 200% magnification:

Battlestar Galactica Netflix vs Blu-Ray 200%

And at 300% magnification:

Battlestar Galactica Netflix vs Blu-Ray 300%

In this instance, Netflix does a pretty good job, although it looks a little blurry. BSG is probably a bad choice for showing off Blu-Ray in comparisons: it’s heavily processed to make it look grainy like film, and almost every single scene is very dark. I doubt it was shot on the latest HD cameras, either (the miniseries first aired in 2003, the full series began in 2004). But since more recent series like Game of Thrones aren’t available on Netflix, this is the best I’ve got for comparisons at the moment.

Of course, even if the Blu-Ray and Netflix versions were identical, a big benefit of those shiny plastic discs is they can be used off-line and don’t require a continuing subscription.

Posted in Personal, Planet, Research, Technology

Home Automation: buttons on the desktop

I want a button on my Mac’s desktop to turn on or turn off the lights I have controlled by the Raspberry Pi. Here’s what I’ve got so far.

First, I wrote a script on the Pi to turn the lights on:

#!/bin/bash
/usr/bin/tdtool --on 1
/usr/bin/tdtool --on 2
/usr/bin/tdtool --on 3

For everything here, I also wrote the equivalent to turn the lights off.
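That “off” counterpart is the same three tdtool calls with --off. Here it is as a slightly generalised sketch; the TDTOOL override is my addition (handy for a dry run with TDTOOL=echo), not part of the original script:

```shell
#!/bin/bash
# lights_off.sh -- counterpart to lights_on.sh (same device ids 1-3)
TDTOOL="${TDTOOL:-/usr/bin/tdtool}"   # overridable, e.g. TDTOOL=echo for a dry run

switch_lights() {   # usage: switch_lights --on|--off
    local id
    for id in 1 2 3; do
        "$TDTOOL" "$1" "$id"
    done
}

# only talk to the hardware if tdtool is actually installed
if command -v "$TDTOOL" >/dev/null 2>&1; then
    switch_lights --off
fi
```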

Next, I tested running the script via SSH from my Mac:

ssh -f user@raspberrypi.local /home/user/lights_on.sh &>/dev/null

You can use the AppleScript Editor to make pseudo-apps, so I wrote an AppleScript to execute the SSH command:

tell application "Terminal"

    do script "ssh -f user@raspberrypi.local /home/user/lights_on.sh &>/dev/null"

    activate

end tell

delay 15

tell application "Terminal"

    quit

end tell

I saved this as file format “Application” in my Applications folder, then dragged it to my desktop (an alias is automatically created, leaving the original in Applications). I can now double-click the app and the script runs, or launch it from Spotlight or the wonderful Alfred:
 
alfred_lights
 
It works but it’s ugly as I have a Terminal window pop up for ~ 15 seconds. There must be a better way.
 
After some searching, I came across Use automator to create X11 shortcuts in OSX. It’s similar to the AppleScript trick, but uses Automator instead, so there’s no need for Terminal. I put the SSH command into the “Run shell script” workflow action, and saved it as an application:
 
automator_lights
 
It works! Now I can turn the lights on or off without the Terminal window popping up.
 
There are a couple of issues:
  • The tdtool command takes a long time to execute on the Pi. It takes about three seconds for all three plugs to switch, whereas the smart plug remote has an “all on / all off” button which is instant. I need to find out why the command is so slow, and/or a way to control all three in one go.
  • I don’t really want “Lights on” and “Lights off” apps, I want a single app that toggles the state. This could be done by making the server-side script smarter, but I’d really like the app icon to reflect the status of the lights too.
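That smarter server-side script might look like the sketch below. It’s hypothetical: it remembers the last state in a file rather than querying the plugs (the Tellstick can’t report state), and the tdtool path is overridable so the script can be dry-run:

```shell
#!/bin/bash
# lights_toggle.sh -- toggle all three plugs based on remembered state
# (a sketch: state lives in a file, since the plugs can't be queried)
TDTOOL="${TDTOOL:-/usr/bin/tdtool}"
STATE_FILE="${STATE_FILE:-$HOME/.lights_state}"

toggle_lights() {
    local action id
    if [ "$(cat "$STATE_FILE" 2>/dev/null)" = "on" ]; then
        action="--off"
    else
        action="--on"
    fi
    for id in 1 2 3; do
        "$TDTOOL" "$action" "$id"
    done
    if [ "$action" = "--on" ]; then
        echo on > "$STATE_FILE"
    else
        echo off > "$STATE_FILE"
    fi
}

# only run for real if tdtool is actually installed
if command -v "$TDTOOL" >/dev/null 2>&1; then
    toggle_lights
fi
```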
Room for improvement, but this is good enough for now.
 
Next up: automating sunsets.
 
Posted in Computing, Home Automation, Personal, Planet, Research

Home Automation: Turn it on again

In Pi three ways I wrote:

what happens when you combine an RF transmitter, smart sockets with RF receivers, and a Raspberry Pi?

I’m still finding out what can be done, but this is what I’ve discovered so far.

tellstick_classic

This is what I bought:

bye bye standby kit

First, I set up each of the smart plugs, and paired them with the remote control.

This is already a big improvement: my home office has bookshelves filled with LED lights, but with inaccessible switches. Being able to turn them all on and off from the remote is awesome, but I’d really like them to turn on automatically, for example at sunset. So I need some compute power in the loop. Time for the Pi.

The Pi is running Raspbian. I followed the installation instructions for telldus on Raspberry Pi. See also R-Pi Tellstick core for non-Debian instructions.

Next I tried to figure out the correct on/off parameters in tellstick.conf for the smart plugs. The Tellstick documentation is a bit sparse. Tellstick on Raspberry Pi to control Maplin (UK) lights talks about physical dials on the back of the remote control; sadly the Bye Bye Standby remote doesn’t have this.

Each plug is addressed using a protocol and a number of parameters. In the case of the Bye Bye Standby, it apparently falls under the arctech protocol, which has four different models, and each model uses the parameters “house” and sometimes “unit”.

Taking a brute-force approach, I generated a configuration for every possible parameter for the arctech protocol and codeswitch model:

count=1
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
        cat <<EOF
device {
   id = $count
   name = "Test"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "$house"
      unit = "$unit"
   }
}
EOF
        ((count++))
    done
done

I then turned each of them on and off in turn, and waited until the tellstick spoke to the plugs:

count=1
for house in A B C D E F G H I J K L M N O P ; do
    for unit in {1..16} ; do
        echo "id = $count, house = $house, unit = $unit"
        tdtool --on $count
        tdtool --off $count
        ((count++))
    done
done

This eventually gave me house E and unit 16 (and the number of the corresponding automatically generated configuration, 80):

tdtool --on 80
Turning on device 80, Test - Success

bye bye standby plugs

But this only turned on or off all three plugs at the same time. I wanted control over each plug individually.

I stumbled upon How to pair Home Easy plug to Raspberry Pi with Tellstick, and that gave me enough information to reverse the process. Instead of getting the tellstick to work out what code the plugs talk, in theory I need to get the tellstick to listen to the plug for the code.

So this configuration should work, in combination with the tdlearn command:

device {
    id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "selflearning-switch"
   parameters {
      house = "1"
      unit = "1"
   }
}

However, this tiny footnote on the telldus website says:

4Bye Bye Standby Self learning should be configured as Code switch.

So it seems it should be:

device {
    id = 1
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "1"
      unit = "1"
   }
}

… which is exactly what I had before. Remembering of course to do service telldusd restart each time we change the config, I tried learning again:

tdtool --learn 1
Learning device: 1 Study Right - The method you tried to use is not supported by the device

Well, bother. Looking at the Tellstick FAQ:

Is it possible to receive signals with TellStick?

TellStick only contains a transmitter module and it’s therefore only possible to transmit signals, not receive signals. TellStick Duo can receive signals from the compatible devices.

So it seemed like I was stuck with all-on, all-off unless I bought a TellStick Duo. Alternatively, I could expand my script to generate every possible combination in the tellstick.conf, and see if I could work out the magic option to control each plug individually. But since house codes can range from 1 to 67108863, this could take some time.
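That odd-looking upper bound isn’t arbitrary: 67108863 is 2^26 - 1, which suggests the self-learning house code is a 26-bit value. Easy to sanity-check in the shell:

```shell
# sanity check: 67108863 == 2^26 - 1, i.e. a 26-bit house code
echo $(( (1 << 26) - 1 ))   # prints 67108863
```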

Rereading Bye Bye Standby 2011 compatible? finally gave me the answer. You put the plug into learning mode, and get the Tellstick to teach the right code to the plug by sending an “off” or an “on” signal:

tdtool --off 3

So setting house to a common letter and setting units to sensible increments, I can now control each of the plugs separately.
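The working configuration then ends up as plain codeswitch devices sharing one house letter, with one unit per plug. A sketch (the ids, names, and house letter here are illustrative):

```
device {
   id = 1
   name = "Study Left"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "1"
   }
}
device {
   id = 2
   name = "Study Right"
   protocol = "arctech"
   model = "codeswitch"
   parameters {
      house = "E"
      unit = "2"
   }
}
```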

Next up: some automation.

Posted in Computing, Home Automation, Planet, Research

Mobile modem

I was trying to get a handle on how much mobile data download speeds have improved over the years, so I did some digging through my archives. (The only thing I like more than a mail archive that spans decades is a website that spans decades. Great work, ALUG!) Here are some totally arbitrary numbers to illustrate a point.

Sony-Ericsson t68i

In response to A few Questions, this is what I wrote in May 2002:

[Nokia] 7110 works fine with IR (7110 has a full modem). The 7110 supports 14.4bps
connections, but I bet your telco doesn’t :-)

That should have been 14.4kbps (14,400bps). In 2002 the phones were ahead of the network’s ability to deliver. In 2014, not much has changed.

In GPRS on Debian, this is what I wrote in November 2002:

I finally took the plunge and went for GPRS [..]  (up to 5x the speed of a dialup connection over a GSM mobile connection)

Remember when we did dialup over mobile connections? GSM Arena on the Sony-Ericsson T68i states 24-36 kbps. I’m assuming I got the lower end of that.

Nokia 3650

In 2003 I was using a Nokia 3650 GPRS connection. GSM Arena on the Nokia 3650 states 24-36 kbps. Let’s be generous and assume reality was right in the middle, at 30 kbps.

In 2004 I got a Nokia 6600, which according to GSM Arena could also do 24-36 kbps. It was a great phone, so let’s assume the upper bound for the 6600.

In 2008 I upgraded to 3G with the Nokia N80, and wrote:

3G data connections are dramatically better than GPRS

… but sadly I didn’t quantify how much better. According to GSM Arena, it was 384 kbps.

That’s a pretty good and pretty dramatic speed increase:

Mobile download speeds 2003-2008

Nokia N900

But then in 2009 I was using the Nokia N900 (and iPhone, HTC Hero, Google Nexus One, …). GSM Arena on the Nokia N900 states a theoretical 10 Mbps … quite the upgrade, except O2 were limited to 3.6 Mbps.

In 2012 I was using the Samsung Galaxy SII. GSM Arena on the Samsung Galaxy SII promises 21 Mbps.

And now the Sony Xperia Z Ultra supports LTE at 42 Mbps and 150 Mbps. Sadly, the networks don’t yet fully support those speeds, but if they did, the chart would be truly dramatic. 2003-2008 starts to look like a rounding error:

Mobile download speeds 2003-2013
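A quick back-of-the-envelope using the round numbers above (30 kbps GPRS, 384 kbps 3G, 150 Mbps LTE) shows the scale:

```shell
# back-of-the-envelope speed-ups from the approximate figures above
echo "$(( 384 / 30 ))x"       # GPRS (30 kbps) -> 3G (384 kbps): prints "12x" (really ~12.8x)
echo "$(( 150000 / 30 ))x"    # GPRS -> LTE at 150 Mbps (150,000 kbps): prints "5000x"
```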

I don’t need to use a modem or infrared, either. Things have really improved over the last twelve years!

(This post is probably best read in conjunction with Tom’s analysis of Mobile phone sizes.)

Posted in Computing, Mobile Tech, Planet, Research

Multi media

My movie collection is a bit of a mishmash, a bunch of different file formats all sat on a Drobo. In the early days I would create AVI, MKV or MP4 rips of my DVDs depending on how and where I wanted to watch them. Sometimes the rips would be split across multiple files. More recently I just copied the DVD wholesale, for conversion later. As a result, ensuring a consistent set of files to copy onto my phone or tablet is a bit of a pain.

With the arrival of the RaspBMC media server, I decided to clean everything up. Some constraints I set:

  • I want to avoid loss of quality from source material (so no re-encoding if possible, only copying).
  • I should be able to do everything from the command line so it can be automated (manipulating video files can be a slow process even without encoding).
  • I want to combine multiple DVDs where possible for easier viewing.
  • My end goal is to have MKV files for most things.
Here’s what I’ve got working so far. Bug fixes and improvements welcome.

~

AVI files

You can glue AVI files together (concatenate them) and then run mencoder over the joined up file to fix up the indexes:

brew install mplayer
cat part1.avi part2.avi > tmp.avi && \
/usr/local/bin/mencoder -forceidx -oac copy -ovc copy tmp.avi -o whole.avi

This forces mencoder to rebuild the index of the avi, which allows players to seek through the file. It encodes with the “copy” audio and video codec, i.e. no encoding, just streamed copying.

~

MKV files

MKV is Matroska, an open-source, open-standard video container format. The process is similar to AVI files, but the mkvmerge tool does everything for you:

brew install mkvtoolnix
/usr/local/bin/mkvmerge -o whole.mkv part1.mkv +part2.mkv

This takes the two parts and joins them together. Again, no re-encoding, just copying.

~

DVD rips

I started using RipIt to back up my DVDs; it can automatically encode DVDs, but once I got my Drobo I opted to keep the originals, so I always have the option to re-encode on a case-by-case basis for the target device without losing the best quality original.

I don’t need to touch most of the DVD copies, but a number of my DVDs are split across several disks, for example Starship Troopers and The Lord of the Rings.

One option would be to encode each DVD at the highest possible quality and then merge the AVI or MKV using the mechanisms above, but I want to avoid encoding if possible.

It turns out that the VOB files on a DVD are just MPEG files (see What’s on a DVD? for more details), so there’s no need to convert to AVI or MP4. We can glue them together as we did with the AVIs, then package them as MKV. The basic method is:

cat *.VOB > movie.vob

The problem is that we need to be selective about the VOB files that are included; there’s no point including DVD menu and setup screen animations, for example. A dirty hack might be to select only the VOB files bigger than a certain threshold size, and just hope that the movie is divided into logical chunks. Something like this, run in a movie directory:

find -s . -name '*.VOB' -size +50M

There’s a catch: the first VOB (vts_XX_0.vob) always contains a menu, so we need to skip those, and we don’t want the menu/copyright message (video_ts.vob):

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M 

We can then use ffmpeg to copy the output of find (a list of our VOB files) into an MKV file. So far we’re assuming we only want the first audio stream (usually English), and I haven’t investigated how best to handle subtitles yet. The command is:

ffmpeg -i - -vcodec copy -acodec copy foo.mkv

There are a couple of issues with this:

  • ffmpeg reads from standard input (the -i -), so the selected VOB files have to be concatenated and piped in; find’s -exec cat {} \; handles that.
  • Timestamps in the concatenated VOBs are missing or jump at file boundaries, so -fflags +genpts is needed to regenerate presentation timestamps. It also helps to state the output container explicitly (-f matroska) and to copy any subtitle streams (-c:s copy).

So our final command is:

find -s . \( -iname '*.VOB' ! -iname 'VTS_*_0.VOB' ! -iname 'VIDEO_TS.VOB' \) -size +50M -exec cat {} \; \
| ffmpeg -fflags +genpts -i - -f matroska -vcodec copy -acodec copy -c:s copy foo.mkv

The output should be an mkv file roughly the same size as the constituent .dvdmedia directories. You can test it using mkvinfo foo.mkv, which should output information on the mkv file. For some reason, using "file foo.mkv" does not recognise it as an mkv file, only as data.

~

Putting it all together

Now we know how to handle several individual file formats, we can script the whole process.

The next step is to trawl through a disk full of movies and to normalise them into one format. At this point, we’re well into XKCD territory (The General Problem, and Is It Worth The Time?), so that’s left as an exercise for the reader ;-)
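As a tiny head start on that exercise, a dispatcher that picks one of the recipes above by file extension might look like this (the handler labels are just placeholders for the commands described earlier):

```shell
#!/bin/bash
# sketch: map a file to one of the joining recipes described above
handler_for() {
    case "${1##*.}" in
        avi|AVI) echo "cat + mencoder -forceidx" ;;
        mkv|MKV) echo "mkvmerge" ;;
        vob|VOB) echo "cat + ffmpeg" ;;
        *)       echo "skip" ;;
    esac
}

handler_for "part1.avi"   # prints "cat + mencoder -forceidx"
```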

~


Posted in Computing, Personal, Planet

Pi three ways

My name’s Andrew and I have a Raspberry Pi addiction.

I’ve played with a lot of IT kit over the years, and set up a number of servers for a variety of reasons. File servers, media servers, web servers; from do-it-yourself bare bones solutions held together with scripts and bits of string, to complete vendor solutions that work flawlessly straight out of the box.

None have come close to the perfect storm of adoption, usability, and economy that the Pi represents. It’s terrific fun and I wish I’d jumped on the bandwagon sooner.

Mobile phone-class CPU boards are nothing new – see for example the BeagleBoard – but they’ve always taken a fair amount of effort to be productive. The high profile and huge community around the Pi makes it much more accessible. It even has an 80s-style magazine, MagPi, that reminds me of fun times typing in programs from computer magazines as a child.

Now I’m addicted to the Pi, the two biggest problems I’m faced with are:

  • What is the correct pluralisation and collective noun for more than one Pi? “A bake of Pis” looks horrible. “A bush of Raspberries Pi” is arguably wrong. “A transcendence of pien” sounds pompous. Suggestions welcome. Until then, I’ll go to lengths to avoid referring to the Pi in the plural.
  • How many is too many? I have two, I’m about to order a third, but I could make an argument for one in every room in the house. And maybe one in the shed, one in the car. One for every television. One in each drawer?

Here are three ways I’m using the Raspberry Pi collection I’m assembling.

The print server

Pi in a Cup

Pi in a cup. For ages I’ve been using a big old clunky PC running Linux as a print server. It was never really called on to do much else, and so it represented power-hungry overkill for the occasional bit of printing.

With minimal tweaks, I got Raspbian installed. It took longer to pick the correct printer from the list in CUPS than it did to get everything else set up. And now I have a low-power, ultra-quiet server and a whole drawer of space reclaimed. And yes, that is an HP TouchPad USB charger powering the Pi…

The media server

I am a huge fan of the original Apple TV: it’s the first device that really made home media make sense on the TV. Back in 2007 I wrote “Apple TV looks like a gorgeous little box”, and it wasn’t long before I bought one. But I soon found it limiting. I wanted to play the high resolution backups of all of my DVDs, but the Apple TV wasn’t powerful enough to do full HD playback. So I “fixed” it with an upgrade in August 2010.

As I wrote later in 2012:

“The AppleTV has been upgraded with a Broadcom Crystal HD chip in the internal PC Card slot – which normally has a wifi card in it. I figured it was worth sacrificing wifi, since with the size of HD video you really want to be using a fast wired network anyway. In order to make use of the HD chip, I have XBMC installed on the AppleTV.”

Sadly, this has always been a bit of a hacked-together solution, even with the polish that the FireCore folk provided with aTV Flash. Occasionally something random would get bit-rot, and I’d have to reinstall. It took ages (at the time) to find the right build of XBMC for CrystalHD hardware support. The AppleTV itself also permanently runs warm, which suggests a fair waste of electricity somewhere.

Just a week ago, I got this cryptic error whenever I tried to use XBMC:

XBMC error

I didn’t have time to debug the problem or rebuild the AppleTV. Raspbmc to the rescue! Easy to install, and with the MPEG2 and VC-1 license keys to unlock the codecs, the Pi has more than enough guts to take over XBMC duties. And in a PiBow case it looks fancy, too:

PiBMC

The only downside is that I can’t easily download and watch movies from the iTunes store with the Raspberry Pi, so I’ll need to keep the AppleTV on standby. Movie studios and content distributors take note: I’ll happily pay for downloadable movies on the Pi. Please make this happen.

The home automation server

This requires a longer post, but what happens when you combine an RF transmitter, smart sockets with RF receivers, and a Raspberry Pi?

pi@raspberrypi ~ $ tdtool --on 80
Turning on device 80, all study lights - Success

I’m still working out how best to make use of this. Ideas include automatically turning on the lights when it gets dark (querying sunrise/sunset information via HTTP?), or tweeting “hey Pi, turn on the lights please”. Maybe using a webcam to detect motion so lights don’t turn on unnecessarily. What else could I do?
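For the sunset idea, once the Pi has sunrise and sunset times (fetched however — from a web service or a lookup table), the decision itself is trivial. A hypothetical helper, working in whole hours for brevity:

```shell
#!/bin/bash
# hypothetical: decide whether the lights should be on at a given hour,
# given sunset and sunrise hours (e.g. fetched daily from a web API)
lights_needed() {   # usage: lights_needed <current_hour> <sunset_hour> <sunrise_hour>
    if [ "$1" -ge "$2" ] || [ "$1" -lt "$3" ]; then
        echo on
    else
        echo off
    fi
}

lights_needed 22 18 7   # prints "on"
```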

More on that another day …

Posted in Computing, Personal, Research, Technology

Sony Smart Watch 2

I’ve been playing around with a Sony SmartWatch 2. Herein some thoughts on wearable devices and smart watches.

Smartwatch2

There’s been several iterations of the smart watch idea. The Verge smartwatch roundup covers the state of play; The Independent has an interesting article on why a Google smartwatch makes sense, and the Samsung Galaxy Gear advert demonstrates nicely the desire for these “James Bond” gadget watches over the years.

The Sony Smart Watch 2 is really nice hardware. It’s a decent size, but doesn’t feel too heavy or cumbersome on the wrist.

Smartwatch displays

The “standby” watch face screen is easy to read. It also has a clever power saving feature where the display is turned off completely unless it detects you. I haven’t figured out how the detection works – presumably motion, touch, proximity or resistance. When the screen is on full it is bright and readable. The watch display is responsive, and swiping from screen to screen is fast and smooth.

The watch needs to be tethered to a phone, and this symbiotic relationship makes sense (up to a point). The watch on your wrist is more accessible than the phone in your pocket, so it’s easier and more natural to glance at the watch to see information. The challenge is figuring out what information you want to see, and getting that information displayed. For example, if you’re using a bluetooth headset and you receive an incoming call, caller ID is handy, along with the ability to accept or reject the call. SMS notifications are useful, but email notifications are too frequent and too verbose to be of much use.

Since the watch is strapped to your wrist and always on you, it allows nice features like warning you when you leave your phone behind. This works, and is useful. On the Sony watch, I achieved this using Augmented SmartWatch Pro; it vibrates the watch when the bluetooth connection to the phone is lost. It looks like similar functionality is on the Samsung Galaxy Gear.

You can’t install apps directly on the watch, you have to go via a management app on the phone. That makes sense, as the watch screen is not well-suited to browsing and buying from an app store. Sony claims more than 300 apps for the Smart Watch / Smart Watch 2. There’s a major flaw, however: you quickly realise that you want to customise the standby screen and watch faces, but those can’t currently be modified. So the alternate watch face apps are useless – you have to turn the watch display on, then navigate through the Sony watch face in order to see the alternate app. Or, you leave the app running and watch your battery drain in a matter of hours. It’s somewhat mitigated in that apps receiving notifications can light up the screen themselves. Apps like Augmented SmartWatch Pro go some way to offering useful alternate screens, even if they can’t be permanently displayed:

Smartwatch augmented

On the screen above: the weather forecast for my location, upcoming meetings, time, current temperature, phone battery status (graduated colour bar on the right) and watch battery status (left; non-existent).

Speaking of battery life: I’ve found it to be better than a phone, but still only lasting a few days (depending on usage). In a world where we get used to plugging in our phone every night that wouldn’t be so bad, except that the cover over the micro-USB charging slot is fiddly, and plugging in multiple devices quickly gets boring. This would have been a killer use case for wireless charging; leaving the watch on a powermat on the bedside table makes perfect sense. Without it, I have to check the battery status before deciding to wear the watch in the morning. VentureBeat reports that Sony is working on one hour wireless charging, which would be great but I’ll believe it when I see it.

So does the smart watch make sense? Almost. Some features are genuinely useful: answering calls from the watch using a bluetooth headset, seeing SMS messages, checking missed calls, reading the name of the Spotify track playing. None of these amount to “killer” features, and I’m not sure enough consideration has been given to what people could actually use a smart watch for – but perhaps that’s up to third-party developers to innovate and discover.

On the down side, there’s not enough fine-grained customisation or control over what content is delivered to the device. For example, I’d like only mail from certain people or with certain GMail labels to reach the device, rather than everything in my inbox. I’d like Twitter direct messages and certain hashtags to reach the device, but not all timeline tweets. I’d like to customise the watch face to display day of week, date, outside temperature, the time in other timezones, and custom notifications such as number of steps, but this isn’t currently possible. For a second-generation device, I’d hope many of these things would have been figured out by now. This is probably why Apple are waiting to release a watch – in order to answer the what, why, and how questions that Sony and Samsung have failed to cover.

Are smartwatches the future of gadget technology? Possibly. There’s genuine utility in the smartphone/smartwatch/bluetooth headset combination, even with the rough edges. The watch is killer screen real estate in a way the phone can never be, so the challenge is to figure out how to make best use of it.

Posted in Mobile Tech, Planet, Research

Fitbug

I was encouraged to purchase a Fitbug as part of the healthcare scheme I’m a member of. The more I walk around, the more benefits I get from the scheme. Of course this makes sense – a fitter, healthier, more active me is less likely to have expensive healthcare requirements.

I’d been curious about the whole wearable fitness device market for some time, after seeing friends with Nike Plus gadgets and Fitbits. So the extra incentive of a healthcare bonus was enough for me to take the plunge.

I bought the Fitbug Air, which is a simple pedometer that can sync to an iPhone or iPad via Bluetooth. It’s a simple gadget with an LCD and three buttons on the front. It comes with a lanyard and a belt clip. The Air comes with a subscription to the Fitbug website, where you can track your daily calorie intake, add other exercises, set goals, and review previous activity. There’s an iOS app that lets you review basic information and sync your device’s data to the website, and there’s also a simple website that walks you through the Fitbug setup.

Fitbug Air

As a motivator to encourage more exercise, it definitely works. If you get toward the end of the day and your steps are measured in the hundreds rather than the thousands, you know you’ve been very lazy and it’s time to go for a walk.

In theory, the device is simple enough to just leave in your pocket and ignore. In practice, there are a few problems with this approach.

When in the pocket, the buttons tend to get pushed accidentally if you lean against a desk, kneel in tight jeans, or have a shoulder bag. You then have to figure out which button was pressed. For example, it’s possible to accidentally alter your stride length, which throws out the measurements. This is exacerbated by the fact that the device comes with no useful documentation for the buttons. There are various modes (for example, viewing historical data), but you have to work them all out for yourself.

The device is supposed to automatically sync via bluetooth to the iOS app whenever there’s been activity in a (configurable) 30 minute period. In practice, the sync stops working unless you restart the iOS app occasionally. Here, for example, the Fitbug stopped uploading for a couple of days, until I checked the app and kicked it:

Fitbug app

The app is supposed to support multiple devices (only one device in the UI, but multiple devices can sync to the website), so that several people can use Fitbugs without everyone having to own an iOS device. It took a call from the (extremely helpful and friendly) Fitbug support team to get this working for me.

Speaking of multiple devices, the second device I received shipped with a spare battery and other bits and pieces. I contacted Fitbug to ask why the first device didn’t come with this, but have yet to hear back.

You also can’t update nutrition information through the app, which makes logging data on the move during the day hard work. You can use the website, but it’s not entirely mobile-friendly.

The Fitbug website has a split personality. When I started using it, there were two versions of the site: an old version and a new one “with a bit of KiK”. I’ve no idea what “KiK” is, and it’s not documented anywhere.

Fitbug websites

The “new” site is pretty, and the infographic-style presentation of information is nice, although much of the terminology isn’t explained. For example, “you’ve hit pink!” on the nutrition section presumably means you’ve eaten the right amount.

Unfortunately, despite all the work that’s been done on the shiny new site, when you receive your weekly Fitbug update email and click for the progress report, it takes you to the old-style website. This means customers have to be familiar with navigating both sites.

As if two sites weren’t enough, with the recent release of the Orb, Fitbug has replaced their new site with a new new landing page, which is one of the most buggy website implementations you’re ever likely to see. Check out the placement of the login/register buttons in the top-right of this screenshot for example:

Fitbug new new site

If you resize the browser window, making it narrower and then wider again repeatedly, those icons gradually move off the top of the page.

One nice feature of the Fitbug site is that it frequently asks you for feedback. I must have filed a dozen bugs by now, including suggestions for usability tweaks. Unfortunately, over the course of the last few months I haven’t seen any changes as a result of this feedback, so I’ve stopped submitting any.

At this point, you might be forgiven for thinking that as a new entrant to the market, Fitbug are still working out a few teething troubles. But surprisingly Fitbug has been around since 2005, two years ahead of better-known competitors such as Fitbit. So it’s somewhat disappointing that they are not much further ahead. I don’t mean to belittle the challenges of building an integrated hardware and software business, including mobile apps, an extensive website, and partnerships, but there just seem to be rather a lot of loose ends and a lack of polish. This is a booming industry, with fierce and rapidly-evolving competition.

Despite all the glitches and issues, I’ll still be using the Fitbug and, on the whole, I’m glad I’ve got it. I’m definitely walking more, and the website calorie counter provides a useful motivational tool – when you see your daily calorific intake, it provides additional motivation to take it easy on the snacks or to adjust your diet.

Now that I’ve got into fitness tracking device ownership, I’d like to try out alternative devices and apps. Unfortunately, I can’t (or at least, not without continuing to use the Fitbug as well). The sensible way to manage activity measurements would have been for healthcare providers to have an open API that any device can talk to, so that consumers can pick which pedometer they want to use. What’s happened instead is that healthcare providers only recognise one supplier, locking out choice and stifling innovation. It’s a familiar story in the technology world, and companies never seem to learn the lessons of the past. It’s not necessarily Fitbug’s fault, but I’d be a little more lenient toward them if they applied a little more polish to their offering.

Right, I’m off for a walk.

Posted in Mobile Tech, Personal, Planet, Research, Technology | Tagged , , , , , , | Comments Off

Books 8

A year and a month since the last list. I need to update this more often.

Here’s what I’ve been reading. Disclosure: the links are affiliate links, so if you buy through them I in theory get some money (although I haven’t seen a payment from Amazon in years … probably why I’m so grumpy with them).

  • A.I. Apocalypse and Avogadro Corp: The Singularity Is Closer Than It Appears, both books in the Singularity Series by William Hertling. An interesting and technically astute look at the near future, with clever ideas like UPS drones, warring data centres and contaminated smartphones. I’ve just spotted that The Last Firewall is available from the same author, and I’m looking forward to reading it.
  • A couple of books via the Humble eBook Bundle:
  • The Age of Ra by James Lovegrove. Fun book, particularly the idea that Egypt (with a pretty long history of diverse god-worship) could become Freegypt, the only place independent of the gods. I have the next in the series queued up to read.
  • The Forever War series, found via BestSFBooks’ 1976 Hugo Award winners. I loved all the Scalzi books starting from Old Man’s War, so it’s not surprising that I enjoyed Joe Haldeman’s books. Forever Peace and Forever Free were quickly demolished afterwards.
  • Starbound, also by Haldeman. Not bad.
  • Great North Road by Peter F Hamilton. Great story. A fun vision of a family of clones and a realistically miserable depiction of the North of the future. In some ways felt a little like Hyperion in tone.
  • Wolf Hall by Hilary Mantel. OMGWTF, a non-sci-fi book on the list? This book is tough to get into but utterly compelling. It’s historical fiction about the rise of Thomas Cromwell in the court of Henry VIII. Followed up by A Place of Greater Safety, a historical retelling of the French Revolution, which I found much harder to read (with a cast of hundreds, it would have been nigh-on impossible to keep track of who’s who if I didn’t know French history).
  • Working through the classics of Raymond E Feist: Magician, Silverthorn, and A Darkness at Sethanon. All good.
  • Watchers by Dean Koontz. I used to be an avid reader of Koontz and Stephen King; this was a fun reminder of the genre.
  • Reamde by Neal Stephenson. One of my favourite authors. This is not his best book, but it’s still very good. If you’re a fan of World of Warcraft you’ll be amused.
  • Northworld Trilogy by David Drake. This is great. Imagine medieval Iron Man, and you’ve got only an inkling of how weird and good this book is.
  • The Amber Rooms by Ian Hocking. I discovered Déjà Vu in 2011 and really enjoyed it (go grab it, it’s currently free on Amazon). Flashback was a good sequel. Unfortunately I was less keen on The Amber Rooms. It’s a good book, but has a different feel to it than the two preceding works, and the pacing felt a bit off. Fortunately, Red Star Falling (a novella in the same series) was a return to form, and I can’t wait for more books from Mr Hocking.
  • The Hydrogen Sonata by Iain M. Banks. I’m unbelievably sad that we won’t have any more wonderful fiction or science fiction from Mr Banks.
  • Existence by David Brin. I enjoyed this for the same reason I enjoyed Rule 34. Both are a dark take on the wonders of future technology; one alien, one human.
  • Following in the footsteps of Haldeman and Scalzi, Jay Allan’s military fiction, starting with the Crimson Worlds series. Prodigious output – the guy writes quickly, which is great news for fans. Fast-paced, easy reading.
  • I’m working through the works of Eric Brown, after previously enjoying The Kings of Eternity, Engineman and The Angels of Life and Death. Meridian Days and Penumbra were enjoyable; New York Nights was good, but the whole rampant-AI thing feels a bit familiar, and the private-investigator noir trope didn’t quite hit the right notes.
  • Evan Currie’s Odyssey One series was fun. As one reviewer accurately puts it, the books feature “the most creatively unpleasant FTL systems I’ve ever come across”. I found myself wanting to read more when I got to the end of the third book, which is a good sign.
  • The Sandman Slim series (another one via Tim Bray).
  • Just One Damned Thing After Another, by Jodi Taylor. It’s billed as the first of the Chronicles of St Mary’s series, and I really hope there are more. I got this when it was free, but it’s easily worth a few quid (it’s currently a bargain at just 77 pence). Time-travelling academic researchers. “Meet the disaster-magnets of St Mary’s Institute of Historical Research as they ricochet around History.” Brilliant.
  • 48 Hours, by J Jackson Bentley. It’s slightly clumsy, with excessive descriptions of locations and technical references that will date it very quickly, but underneath is a surprisingly good thriller / detective book. Oh, and it’s free at the moment. Bargain!
  • If you’re a fan of Stross’ Laundry series (and if you’re not, why not?), then Equoid is a satisfying novella. I actually read Equoid on tor.com, where it’s free, but I would have been happy to pay for it on the Kindle if I’d spotted it there first. As it says on the site, “Equoid” contains scenes and situations some readers will find upsetting and/or repellent. What could possibly be repellent about unicorns, you may be wondering. Just wait.
Preparing this list is a good thing. In going back through my Kindle orders and library to prepare it, I discovered nineteen books that I’d bought but not yet read, and several series with new books available. Of course, if the Amazon Kindle experience were better, I wouldn’t need to act as my own librarian just to find out which books I’d missed.

Previously: books, books 2, books 3, books 4, books 5, books 6, books 7.

Posted in Personal, Planet, Uncategorized | Tagged , , , , | Comments Off

Amazon, Kindle, reading lists

I’m preparing a blog post listing the latest books I’ve read, and in the process I’ve been reminded of just how awful the Amazon experience can be.

For starters, I’d like to have better integration between my Kindle, Goodreads, calibre, and this blog, so that I can automate the generation of reading lists and manage all my books. Why doesn’t Amazon make this easier out of the box? It would surely drive ebook sales. Instead, Amazon seem to delight in making life harder.
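For the curious, the kind of automation I have in mind could start from calibre’s command-line tool. Here’s a minimal sketch of turning a library export into an HTML list for a blog post – note the `"title"`/`"authors"` field names, and the idea of feeding it JSON from `calibredb list --for-machine`, are my assumptions rather than a documented contract:

```python
import json

def format_reading_list(books):
    """Turn book records into an HTML list ready to paste into a post.

    `books` is a list of dicts with "title" and "authors" keys, e.g.
    parsed from a calibre JSON export (field names assumed, not verified).
    """
    items = []
    # Sort by author so series by the same writer sit together.
    for book in sorted(books, key=lambda b: b.get("authors", "")):
        title = book.get("title", "Untitled")
        authors = book.get("authors", "Unknown")
        items.append(f"  <li><em>{title}</em> by {authors}</li>")
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

# Example records standing in for a real library export:
sample = json.loads("""[
  {"title": "Wolf Hall", "authors": "Hilary Mantel"},
  {"title": "Reamde", "authors": "Neal Stephenson"}
]""")
print(format_reading_list(sample))
```

From there it would only take a Goodreads or WordPress API call to close the loop – which is exactly the sort of glue Amazon could provide but doesn’t.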

For example: Amazon used to list Kindle orders separately, and if you look under your account, they still have a menu item for “View Kindle Orders & Charges”:

Screenshot amazon kindle orders

But when you follow the link, instead of a custom list of all your Kindle purchases, Amazon now throws Kindle orders into the list of all other orders, with a banner message at the top:

Screenshot amazon all orders banner

This means that you have to search through all your Amazon orders to find the books. Perhaps you can get a list of your books by searching for “Kindle”:

Screenshot amazon order search kindle

That doesn’t seem too bad at first glance, so why doesn’t Amazon simply point the “View Kindle Orders & Charges” link at https://www.amazon.co.uk/gp/css/order-history/search=kindle? It would be a better user experience. Here’s why: not all books are tagged “kindle”, so that search won’t find all of your Kindle book orders. This is brain-dead stupid. The only way to track all of your books is through the “Manage Your Kindle” link, which provides a horribly limited view of your library (15 items at a time, no book covers, and only a few columns of information per book).

It’s strange that Amazon boasts that even the most basic Kindle holds “up to 1,400 books – take your library wherever you go” whilst at the same time making the library management experience so awful.

I’d like to see future Kindles automatically organise books by author. I’d like to see the Kindle website provide a better library browsing experience. And right from the library, I’d like to see where other books from the same author are available to buy. If Amazon improved the experience just a little, I’m sure book sales would jump significantly. I guess the problem is they are selling well already, and no-one else in the marketplace is really challenging them and giving them a reason to innovate.

Speaking of user experience – there are a number of sites popping up that help with the discoverability of books. They don’t have quite the same feel as browsing through the sci-fi section of the local library, breathing in that slightly musty book smell, feeling the crackle of the plastic book covers, picking up books with cool-looking cover art. But they do let you do things like browse books by the awards they’ve won, or see what your like-minded friends are reading. Of particular interest:

Are there any better ways to manage Kindle content? Are there any better sites for book discovery?

Posted in Mobile Tech, Personal, Planet | Tagged , , , , , | Comments Off