Wednesday 30 March 2011

Handbuilt by Richards

There was a TV commercial in the 1980s for the Fiat Strada (or Ritmo if you are reading from outside the UK) with the strapline "Handbuilt by Robots". The Not the Nine O'Clock News team did a brilliant satire of it with the alternative strapline "Handbuilt by Roberts". Watch the two in that order and you'll get the joke.

On Tuesday this week, I felt like I was on the Not the Nine O'Clock News set. I had a meeting with our IT service partner Appichar. The four people in the room were me (Richard), my CEO (Richard Craig), their CEO (Richard Ferriman), and our account manager (Richard Eynon).

CTT IT, Handbuilt by Richards!

This is why I get out of bed

There's a horrible feeling on the Monday after the clocks go forward, when you get out of bed and the alarm says 6am but your body is telling you it's 5am, and you wonder why you do this. So it's really timely when something comes along that reminds you why (OK, so it actually arrived on Friday, but it really helped).

The New Shoes Theatre company in Kent are one of the beneficiaries of the Huddle donation programme that CTX runs. Their story reminded me that what I do really does make a difference.

Wednesday 23 March 2011

Cloud Hype

I guess I first heard the term "cloud computing" about three years ago, but in the last six months just about everything seems to be tagged "cloud". My old colleagues at Gartner have a long-standing model for this phenomenon called the "Hype Cycle". It runs through five phases, from a "Technology Trigger" through to a "Plateau of Productivity". Right now, cloud computing must be right at the top of the "Peak of Inflated Expectations"!

What follows this peak is the "Trough of Disillusionment", as organisations dive into "cloud" technology and discover that half of what's out there is only superficially "cloud", and that lots of things don't really deliver. We're already seeing lots of debates on what really constitutes "cloud", which marks the start of the decline.

The model of delivering discrete applications over the web has been around for over ten years - variously called application service provision (ASP) or software as a service (SaaS). Those applications that have developed and survived over this time have jumped on the "cloud" bandwagon. Expect them to jump off again as the term becomes seriously devalued. They will go back to selling the benefits of their applications and their delivery mechanism (over the web), and all talk of cloud will disappear from their vocabulary.

Meanwhile, large organisations are adopting the flexible processing power models that have also been labelled "cloud". Most charities will only run into this as their web hosting providers adopt it. Even then it will be pretty transparent, except perhaps in some of the contractual terms.

Perhaps the most interesting developments to survive and thrive will be the ability to access many more applications from anywhere on the web (or mobile device), and the advent of the "app" model of downloading applications when I need them to run on my particular platform (mobile or static). There are some gamechangers in there that will have their own hype cycles.

Anyway, for me, the term "cloud computing" is dead for any useful purpose.

There's Always Something!

Last week, we upgraded all the office PCs to Windows 7 and Office 2010. Most were running XP and Office 2003. To say things went really well would be an understatement. My experience of mass upgrades is that there are usually a whole host of things that crawl out of the woodwork when you do them, however well you plan.

A couple of nasty little problems cropped up: a PC that attached to the network, but steadfastly refused to see the internet (Apple's fault); and a very large mailbox that wouldn't download to a PC (fixed by running it in online mode). But otherwise I was really happy.

UNTIL: Monday morning one PC can't get into the finance system. We installed the finance software on two others, and they worked like a dream. This one refuses point blank. Actually, it's the user account that's the problem, because it all works when signed in as administrator. No one seems to be able to tell me why!
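For what it's worth, when an application works as administrator but not under a normal user account, a classic culprit is folder permissions on the application's install or data directories. A minimal sketch of the kind of check I'd run first - note the folder paths here are entirely hypothetical, not the actual finance system's:

```python
import os

def check_access(paths):
    """Report whether the current user can read and write each folder."""
    results = {}
    for p in paths:
        if not os.path.isdir(p):
            results[p] = "missing"
        elif not os.access(p, os.R_OK | os.W_OK):
            results[p] = "no read/write access"
        else:
            results[p] = "ok"
    return results

# Hypothetical install/data folders - substitute the real ones when diagnosing.
print(check_access([r"C:\FinanceApp", r"C:\FinanceApp\Data"]))
```

If a folder comes back without read/write access for the problem user, granting it (or comparing permissions against one of the working PCs) is usually the quickest route to an answer.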

Friday 18 March 2011

Is Social Media More Powerful than Governments?

This was a question posed for a spectrogramme at the recent TechSoup Global Contributors Summit in California. If you've not come across the idea before, a spectrogramme is a line from one end of the room to the other, with one end labelled 'agree' & the other end labelled 'disagree'. Everyone in the room is asked to stand somewhere along the line in response to a specific question.

The summit took place post Tunisia and just as things in Egypt were coming to a head. Of the 50 people in the room, the vast majority were hanging around between ‘don’t know’ and ‘agree’. I was firmly in the ‘disagree’ camp. When I was challenged on this, I noted the age demographic of the people in the room and observed that I was one of those who was old enough to have been there when the Berlin Wall fell (I wasn’t actually in Berlin, but some of my friends were).

In 1989 the web hadn't been invented, nor had text messaging or Social Media. The latest technology was a mobile phone that needed a suitcase to carry the battery! Yet the Berlin Wall still fell. Why? Fundamentally it was because the dictatorships of Eastern Europe (starting with the Soviet Union) had lost the appetite, or the ability, to command their armies to suppress their people by force. When the people rose up, no one was willing to do what it took to stop them.

Contrast Eastern Europe with China in 1989. The Chinese leadership was prepared to crush the Tiananmen Square demonstrators with force, and they were able to rely on the support of their army to do it.
Fast forward 22 years. Popular uprisings topple long-standing regimes in Tunisia and Egypt with remarkably little bloodshed. But then we come to Libya. A brutal dictator prepared to crush all opposition with a military that remains loyal to him. Sound like a familiar story?

Social Media has many benefits in helping people communicate. But it is people on the street that change regimes, and Social Media doesn’t help them against tanks, planes and machine guns.

How IT is Set Up to Fail – because it’s trying to do the wrong job!

I've been working in the 'IT industry' for 25 years now (that's a scary thought in its own right!) and, as Martha Heller points out in her Computerworld article “How IT is Set up to Fail”, the same conversations go round and round: why doesn't IT deliver; why, in so many organisations, is IT seen as a barrier/difficult/a problem? Yet IT is part of everyday life in most organisations and for an increasing number of people outside of work as well - that's why there is so much focus on digital inclusion.

So why does the "CIO Paradox", the latest trendy term for this problem, persist? I don't believe the problem is due to CIOs' intellect or business acumen, and I don't believe it’s just because CIOs aren't invited onto the boards of organisations either! I think it’s because they are persistently expected to do the wrong job!
I think it comes down to one major factor: it’s not about technology, it’s what we do with it. CIOs are expected to be able to revolutionise their organisations with IT while running day-to-day operations. The baseline is that the IT an organisation has deployed actually works. IT has a history of failure here. If a production line failed as often as many IT systems do, and was so difficult to use, it would be ripped out and rebuilt. There's also a constant cacophony of change that most people outside of the IT world don't care about - most people I know were perfectly happy with Windows XP & Office 2003. IT is hardly likely to be invited to the senior table to discuss strategy if it hasn’t got the basics right.

However even when IT is delivering to a reasonable level operationally, the “what do we do with it?” question remains. And the problem is that we in the IT departments think we can answer that question! In Martha’s article she talks about overcoming the paradox by “moving beyond enablement” as a result of becoming a “competitive capabilities expert”. This is a grand idea, except it’s been around for as long as I have, if not longer, in one guise or another. It’s a tantalizing idea that would miraculously catapult CIOs into the inner circle of organisational executive management. Except for one small problem. CIOs and IT organisations are in the wrong place to succeed at it or even to get the buy-in to try to do it from the rest of the organisation.

Let me share a conversation I overheard on a train the other day to illustrate what I mean. Two people were sat behind me discussing a piece of work that was being undertaken by someone brought into their department specifically for the task. This person was, apparently, documenting the processes of the department with a view to identifying how technology could improve them. The conversation revolved around how the person doing the work had no understanding of their industry and the specific, detailed complications and issues surrounding their niche within it. These two people concluded that the exercise was inevitably going to be a waste of time as the person doing it was unlikely to come up with any practical insights into where improvements could be made.

Now, notwithstanding the innate hostility that always accompanies a stranger turning up and telling you they could do your job better than you can, my experience tends to support their analysis. Don’t get me wrong, there are some very good business analysts out there in many organisations. People who can analyse a process, identify where IT can be deployed to increase efficiency and effectiveness, and drive a system specification that leads to a sound application. However, that isn’t being a “competitive capabilities expert”. In my experience, the real revolutions come from people who are so steeped in a sector niche that they intuitively understand where “the little things that make a big difference” are. Or else they are complete outsiders who are far enough removed to see the wood for the trees and launch new organisations to exploit major technological and sociological discontinuities. For example, spotting that the internet could catapult mail order into mainstream shopping (Amazon).

So I don’t think the CIO paradox is going to be solved anytime soon. The organisations that are going to exploit IT most effectively and strategically aren’t the ones who focus on finding a CIO who can turn the IT function into a collection of competitive capabilities experts. They are going to be the ones who can develop the IT savvy of people across their organisations, so that they see how IT could impact what they do and then drive the projects that deliver those changes. Yes, CIOs are set up to fail, but only if they persist in trying to do the wrong job!

I Have Seen the Future and the Future is Flat!

So the dust is starting to settle on the much hyped iPad 2 launch. As with all things Apple there are those who immediately love it and those, like me, who think: what’s all the fuss about? After all, tablet PCs have been around for quite a while. However, Apple’s incursion into the market, with their flair for design and ability to make the iPad the “cool” gadget to have, opens up an intriguing possibility for the future of IT.

The iPad may have some flaws in its capabilities when it comes to replacing my notepad (see the ComputerWorld article Six reasons you want an iPad, six reasons you don’t for a far better critique than I could give), but it is a whole lot cooler than the traditional, slightly clunky, tablets I’m familiar with. Which means that in a couple of years or less, I could have a slim, ergonomic device that provides a simulated on-screen keyboard and display, is light and easy to carry around, and delivers as much as, if not more than, my notebook. What’s more, the advances in mobile networking mean that it is always connected to the internet.
It’s that combination of always on and highly practical that excites me.

You see, the major shifts in technology more often than not arise out of the collision of trends rather than a single new innovation, and there is another trend that is rapidly growing in importance in the way organisations deliver their applications: Cloud Computing. At its simplest level, Cloud Computing delivers applications using internet technologies to a web browser. No need for locally installed software, local servers and everything that goes with traditional applications. Large companies are building their own “cloud” infrastructures to deliver internal applications on web technologies, and there are more and more commercial offerings like Salesforce.com’s CRM solution. Initiatives such as Google Docs and Microsoft’s Office 365 are starting to deliver word processors, spreadsheets and presentation software using this model.
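To make the "delivered to a web browser" idea concrete, here's a toy sketch of the model using nothing but Python's standard library (the page content and port are made up for illustration). All the application logic lives on the server; the user needs nothing but a browser:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class AppHandler(BaseHTTPRequestHandler):
    """A toy 'cloud' application: every request is answered by the
    server, so nothing needs installing on the user's machine."""
    def do_GET(self):
        body = b"<html><body><h1>Hello from the cloud</h1></body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# To serve it, point any browser at http://localhost:8000 -
# no local software beyond the browser itself:
#   HTTPServer(("", 8000), AppHandler).serve_forever()
```

Real offerings like Salesforce or Google Docs are vastly more sophisticated, of course, but the delivery mechanism is essentially this: an application living entirely on someone else's servers, reached through a browser.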

So if I don’t need any special software on my PC/iPad to get all the applications I want; my device of choice uses mobile network technology to ensure I’m connected wherever I go at a decent speed; and all my applications are accessed using the internet, then why do I need any other infrastructure? Why do I need a desktop hardwired to a server in the office? Why do I need local WiFi? I just get out my iPad (or equivalent) wherever I am (office, train, home, beach) and I’m fully operational!

What this really means is that my need for lots of infrastructure, network, local servers etc. goes away along with all the cost associated with it!

The future? I’m sure there are a hundred and one issues to overcome in getting there - I could rattle off half a dozen without trying (rural 3G coverage anyone?) - but as a model, for me, it’s compelling and I can’t wait!