So say we all!

[This should be up on the BBC News website soon]

There is a famous and hilarious episode of Whatever Happened to the Likely Lads, the BBC sitcom from the 1970s, in which our eponymous Geordies, Bob and Terry, spend an anxious day trying to avoid hearing the result of a crucial football match because they will be watching it later that night on Match of the Day.

It was one of my mum’s favourites, partly I suspect because she came from Hebburn and had grown up among men who resembled the Likely Lads in many ways, my dad among them.

Despite many mishaps and near-misses, all goes well until just before they settle down in front of the TV. But the ending is both funnier and less predictable than you might expect, as it usually was in this fine old British comedy, so I won’t give it away here.

I thought of Terry and Bob this weekend, since I found myself avoiding Facebook and other social network sites, refraining from reading much email and staying far away from any of the many manifestations of Twitter for fear that someone would give me even the smallest hint of the denouement of Battlestar Galactica, which has just come to an end after seventy-three episodes and numerous webisodes and extras.

The ‘reimagining’ of this 1970s television series has been running since 2003, and I’ve been a fan since I was introduced to it by one of my geekier friends, Simon, the following year.  I have the box set of the entire series up to the current one, and I’ll buy that as soon as it comes out.

In telling the story of the 32,000 or so survivors of a catastrophic attack on human civilisation by a race of cybernetic organisms, the Cylons, Galactica has explored issues of politics, relationships, philosophy and international relations in a way rarely seen in film or on television.

So there was no way I was going to let a careless tweet or status update spoil the final two hours, after I’d invested so much time and energy in these characters and their stories, even if it meant cutting myself off from the social graph for a day.

I managed it, and I was able to enjoy the shock, surprise and delight that the finale had to offer without that dull ache that comes from knowing where the story is going, or even knowing that there’s a plot twist just before the end. And I’m not saying anything here, because some of you may still be waiting to see it.

But being cut off from the network for a day was surprisingly hard work.  I let myself read emails, because I was pretty sure nobody would be cruel or cunning enough to send me an apparently innocent message with a spoiler in it, but everything else was effectively offline.

I’m lucky enough to have a laptop, a smartphone and almost constant net access whenever I want it, and as a result I’ve built much of my daily life around the assumption that I’ll be plugged into the real-time web. If a news story breaks I expect to be alerted to it by one of the various feeds that come in, and although I’ll close things down for an hour or two to do some work, most of the time the burble of the network thinking out loud is as much a part of the background to my life as Radio 3 or 4 are when I’m working at home.

However, I missed Twitter and Facebook most of all, and it made me reflect on the special status that these two services have, and how they emerged from the many other Web 2.0 startups to achieve their current dominance.  I think it has something to do with the physics of the cloud.

The metaphor of cloud computing, popularised by writers like Nick Carr in his book ‘The Big Switch’ and promoted by companies large and small across the network, is compelling and persuasive. We are moving to a world of utility computing where data and services will be spread between desktops, servers and mobile devices according to the needs of the moment, and the obstacles which stand in its way will be overcome.

But we should not think of the cloud as a fluffy cumulonimbus floating in the blue sky of the network, water vapour produced by the collision of the weather fronts of server farms and high-speed network access that could evaporate if the cold front dissipates.

A better analogy is the cloud of debris spewed into space by a supernova, the exploding stars whose runaway nuclear reactions produced most of the elements above oxygen in the periodic table and whose dust we all are.  This cloud is made up of particles of matter, swirling around in the hard vacuum of interstellar space, disturbed by the passage of Battlestar Galactica and the odd Cylon raider, and subject to all the forces of nature, including gravity.

Our observations of the universe tell us that stars gradually come together from this cosmic debris as random eddies and currents create clumps of matter that slowly pull more dust to them, and that the debris from this process undergoes its own accretion to form small and large planets.

The computing cloud was created when the old internet, the one that supported dialup users and file transfer, went supernova with the arrival of fast broadband and smart devices.
The sun of Web 2.0 has already sputtered into life, the first nuclear reactions taking place to make it a self-sustaining system that can illuminate and warm the environment around it.

Now we’re starting to see planets emerge, major nodes in the new social graph like Facebook and MySpace and Twitter.  We’re at a very early stage, of course, and there will be collisions and realignments, so some of these proto-planets may fall out of orbit and into the sun, or end up as an asteroid belt of small-scale sites. Success is never guaranteed, in business or when making a solar system.

And there’s also a chance that the bright new sun of Web 2.0 will itself prove unstable, perhaps collapsing into a black hole and exploding as a supernova in its own right.

Bill’s Links

Likely Lads:

Battlestar Galactica finale (spoiler..)

