Devolo power-based internet: dreamy

We had a loft conversion built last summer. The signal from the Belkin wireless router two floors below was not sufficiently powerful to be reliable up there, so I introduced an interim LinkSys wireless router on the intervening floor. The idea was that when connecting from the top of the house, you’d use the wireless signal from the LinkSys, which in turn connected wirelessly to the Belkin which went straight out to the internet. Rob was hugely helpful in setting this up.

But while the connection to the LinkSys was strong and reliable, the onward wireless link to the Belkin proved unstable. In short, relying on two wireless hops was asking too much.

I was directed by Steve (megastar) to use a power-based connection. The idea is that the copper wires that carry your house’s electricity are also used to transmit data. I’m not sure this is what Edison had in mind when he pioneered electric power, but by golly it’s a fabulous idea.

I went ahead and bought the Devolo dLAN 200 AV Wireless-N Starter Kit from Amazon for £89.99. It arrived on Wednesday and I installed it on Saturday.

The device consists of what look like two regular electric plugs, each with a small transformer-sized pack on it. An Ethernet cable, an Ikea-esque word-free instruction page and a redundant CD complete the box’s contents.

It’s sublime. You plug in one of the plugs near your router and connect it using the Ethernet cable. You plug in the other plug in the troublesome area of the house. The second device emits a wireless signal, the password for which is on the back of the device. You connect to this wireless network, and the house’s copper wiring links that device to the first one, which connects on to the internet via the aforementioned Ethernet cable.
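
Once everything is plugged in, a quick sanity check is to ping each leg of the chain in turn. Here’s a minimal sketch in Python, run from a machine on the new wireless network; the router address and external host are assumptions, so substitute your own:

    # Minimal connectivity check, run from a laptop on the dLAN wireless
    # network. Both addresses are assumptions; adjust for your own LAN.
    import subprocess

    hops = {
        "router / gateway": "192.168.1.1",  # assumed address of the Belkin router
        "the internet": "8.8.8.8",          # any reliable external host
    }

    for name, host in hops.items():
        # "-c 1" sends a single ping (use "-n 1" on Windows).
        ok = subprocess.run(["ping", "-c", "1", host],
                            capture_output=True).returncode == 0
        print(f"{name}: {'reachable' if ok else 'UNREACHABLE'} ({host})")

If the first hop fails, the powerline link is the suspect; if only the second fails, the problem lies upstream of the router.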

So far so good. The only issue that had me worried for a while was that the 16-character password didn’t seem to work on Apple devices, the iPhone and iPad specifically. It turns out that the hyphens separating each quartet of numbers, which Windows didn’t require, were required by iOS. Odd UX fail by Apple there.
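
To illustrate, here’s a sketch with a made-up key (the real one stays on the back of the device): Windows accepted the bare digits, while iOS insisted on the hyphens as printed.

    # Hypothetical key, for illustration only.
    printed_key = "1234-5678-9012-3456"

    windows_form = printed_key.replace("-", "")  # Windows accepted the bare 16 characters
    ios_form = printed_key                       # iOS required the hyphens as printed

    print(windows_form)  # 1234567890123456
    print(ios_form)      # 1234-5678-9012-3456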

Traffic calming makes me angry

There used to be a mini-roundabout in Clapham. It was simply a solid white circle painted in the middle of the intersection, with curvy white arrows painted around it in a clockwise direction. If the tarmac was raised at all, it was hardly noticeable.

This meant that vehicles used to zoom across it, often flouting traffic laws and ignoring the zebra crossing that immediately followed it on one of its tributaries. In hindsight, it was a bit dangerous, mainly to cyclists and pedestrians.

A few weeks ago, two workmen dug a big hole where the white circle used to be. And over the two weeks that followed, they built a proper roundabout, meticulously cutting the edge stones and the cobblestones that form its centre. It was lovely to watch them mark out the road, calculating the arc of the new arrows that would need to be painted and how far apart they should be to ensure they were uniformly spaced.

A couple of weeks after its inauguration, a big black line appeared, cutting a chord across it. It’s either rubber or some nasty fluid that has emanated from a vehicle. Judging by its angle, that vehicle was going way too fast and didn’t realise that a hump now furnished the road. Its driver, I suspect, was an idiot.

Since last week, the roundabout has supported four new ornaments in the form of traffic cones. I suspect these are there to warn approaching motorists that a perfectly legal road feature lies in their path. I also suspect that their introduction might be a reaction to possible legal action against the local council by the aforementioned alleged idiot, and a bid to head off any future claims.

What will happen next, I don’t know. I expect additional signage will be introduced to cater for the speeding idiots. I only hope that the cones are not a permanent feature.

Human Resources and management in government

There is fear in the public sector around the impact that the imminent staff reductions will have on the effectiveness of its operations. Yet I have first-hand evidence that there is also fear that the staff reductions are not happening fast enough, and that the approach to reductions will not yield the most effective outcome. There is concern that the voluntary redundancy process will result in the good people leaving, as they have better opportunities to secure work elsewhere, while the dross remains. So the civil service will be both slimmer and more ineffectual.

I’ve worked with government for the majority of the last ten years. During that time, I’ve never been aware of anyone being fired for poor performance. I’ve heard of people being fired for gross misconduct (backhanders and the like), but never for poor performance.

People management needs to improve. People need to be set real, achievable targets. People need to be measured against those targets. People who fail to meet those targets should be introduced to the performance management process. And if they don’t come out of it performing well, they should be fired. HR departments need to become more visible and work with the business to embed this new regime.

By taking out the poor performers, the civil service can become slimmer while retaining its effectiveness. With the current model, yes, it will become slimmer. But the average effectiveness of each civil servant will diminish.

How much should a DNS change cost?

Ten or so years ago, the trend in government was to outsource IT. It was perceived that government should focus on its remit—policy-making and serving the UK people—as opposed to worrying about the IT systems needed to support these tasks.

So now, the majority of Whitehall Departments have a largely single-sourced model. HMRC is supported by Capgemini (under its Aspire branding); Defra by IBM; the Cabinet Office by Fujitsu; and so on. And these deals can last anywhere from five years upwards.

This is great. It means that the civil servants can focus on the task at hand, while IT support is at the end of a phone line, and the blame for glitches in high-profile IT problems can be outsourced to the private sector. Except it’s not that great.

A single-source model sounds wonderful. There’s never any doubt as to who to go to when you have a problem or indeed a requirement. But this comes at a price. These suppliers, largely Tier 1 players, enjoy the luxury of operating in what is effectively a monopoly for the term of the contract. And for this reason, prices go up. I heard only the other day of a large IT provider charging £28,000 (twenty-eight thousand pounds) to make two DNS changes on behalf of a Department. No matter how you do the maths, there is no rational way to get anywhere near this number.
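
To put that figure in perspective, here’s the back-of-envelope arithmetic; the day rate is my assumption, and a deliberately generous one:

    # Back-of-envelope maths on the reported charge.
    total_charge = 28_000   # £, reportedly for two DNS changes
    changes = 2
    day_rate = 1_000        # £/day: an assumed, generous contractor rate

    per_change = total_charge / changes   # £14,000 per change
    implied_days = per_change / day_rate  # 14 days of effort per change

    print(f"£{per_change:,.0f} per change, or {implied_days:.0f} days' work at "
          f"£{day_rate:,}/day, for what is typically a one-line edit to a zone file")

Even allowing for change control, testing and sign-off, fourteen days of effort per change doesn’t survive contact with reality.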

In the event that the commercial model allows the Department to award business to other providers, competition is introduced. But system integration almost always rests with a lead (Tier 1) supplier, and that Tier 1 supplier has a monopoly over this piece of the pie. So the lower price that might be secured by awarding the business to the competition will likely be counterbalanced by an elevated price for integrating that work into the Department’s IT estate, and so the Department loses out either way.

I wonder whether shame is the answer to all of this. If the general public were made aware of how much the Tier 1 suppliers were charging for some of the basic IT building blocks, such as DNS changes and password resets, would they be shamed into charging reasonable fees for such work? Or would the cost simply shift to other, less commoditised elements of their portfolio, less easily dissected through the Freedom of Information Act?

To start the ball rolling, who wants to ask the FOI question: Please tell the public how many DNS changes have been made in each Whitehall Department in the last twelve months, and how much was charged for each?

avast: loyalty in spite of problems

Last night, avast suffered a rather large issue. All of a sudden, it started blocking seemingly every HTTP web page. I still had email access (over HTTPS), but even attempts to Google the problem had their results blocked.

avast’s blog post of yesterday indicated that this was the result of a false positive in one of their virus definition updates. By the time I’d read this, I’d already downloaded AVG’s free offering just to get myself up and running again. avast’s mistake here was not to advertise the issue clearly on their website, particularly given its crippling impact. (I’ve just watched a Guy Kawasaki video in which one of his messages is: Deliver Bad News Early.)

As a paying avast customer, I will go back to them. Not *because* I’m a paying customer, but because the product is all kinds of awesome.

I’ve been burnt by horrendous anti-virus software in the past. Namely: Norton. Norton is a horrendous application, a virus in and of itself. It sucks the life out of users. It announces and advertises its presence at every opportunity. And it makes you want to throw your laptop through the next available window.

avast is the opposite of Norton. Its spinning disc sits innocently in the tray at the bottom of the screen, there purely to inspire confidence. On occasion, it tells me that the virus definitions have been updated. But beyond that, I know not of its existence.

I’ve been an avast customer since 31 May 2006. And I will continue to be their customer long after our fifth anniversary.