Monthly Archives: January 2010

Private clouds are (not) the future

Google (well, James Hamilton) has weighed in on the question of private clouds. As expected from a large cloud provider, James takes the position that private clouds make no sense. His reasoning is straightforward: private clouds will never have the scale of public clouds, and therefore can never achieve the same price point as their public brethren. Ergo, there’s no point in building private clouds.

As I’ve pointed out before, there’s a lot more to cloud than simply reducing costs. The biggest benefit is probably the agility that cloud can bring to your IT estate: a cloud platform’s ability to codify and automate many management practices, create a target platform that works across a range of deployment options, and streamline hardware provisioning. Companies also increasingly have to deal with the realities of political boundaries, where the best technical solution might not be acceptable due to legal requirements (such as privacy legislation). Developing a private cloud can be a sensible move in this context.

Of course, if you want to compete purely on cost then private cloud will never hit the same price point as public cloud. But this misses the point that for many companies IT flexibility/agility is more important than cost.

Note: I was going to post this as a comment on James’ post, but comments appear to be broken.

Posted via web from PEG @ Posterous

Is “agile enterprise IT” an oxymoron?

Have we managed to design agility out of enterprise IT? Are the two now incompatible? Our decision to measure IT purely in terms of cost (ROI) or stability (SLAs) means that we have put aside other desirable characteristics, like responsiveness, making our IT estates more like the lumbering airships of the 1920s. While efficient and reliable (once we got the hydrogen out of them), they are neither exciting nor responsive to the business. The business ends up going elsewhere for its thrills. What to do?

LZ-127 Graf Zeppelin

An interesting post on jugaad over at the Capgemini CTO blog got me thinking. The tension between the managed chaos that jugaad seems to represent and the stability we strive for in IT nicely captures the current tension between business and IT. Business finds that opportunities are blinking in and out of existence faster than ever before, with dramatically reduced windows of opportunity leaving IT departments unable to respond in time, prompting the business to look outside the organisation for solutions.

The first rule of CIOs is “you only have a seat at the strategy table if you’re keeping the lights on”. The pressure is on to keep the transactions flowing, and we spend a lot of time and money (usually the vast majority of our budget) ensuring that transactions do indeed flow. We often complain that our entire focus seems to be on cost and operations, when there is so much more we can bring to the leadership team. We forget that all departments labour under a similar rule, and all these rules are really just localised versions of a single overarching rule: the first rule of business, which is to be in business (i.e. remain solvent). Sales needs to sell, manufacturing needs to manufacture, … By devoting so much of our energy to cost and stability, we seem to have dug ourselves into a bit of a hole.

There’s another rule that I like to quote from time to time: management is not the art of making the perfect decision, but of making a timely decision and then making it work. This seems to be something we’ve forgotten in the West, and particularly in IT. Perfection is an unattainable ideal in the real world, and agility requires a little chaos/instability. What’s interesting about jugaad is the concept’s ability to embrace the chaos required to succeed when resource constraints prevent you from using the perfect (or even simply the best) solution.

Vickers F.B.5 Gunbus

Consider a fighter plane. The other day I was watching a documentary on the history of aircraft which showed how the evolution of fighters is a progression from stability to instability. The first fighters (and we’re talking the start of WWI here, all fabric and glue) were designed to float above the battlefield, where the pilots could shoot down at soldiers or even lob bombs at them. They were designed to be very stable, so stable that the pilot could ignore the controls for a while and the plane would fly itself. Or you could shoot out most of the control surfaces and still land safely. (Sounds a bit like a modern, bulletproof IT application, eh?)

The Red Baron: Manfred von Richthofen

The problem with these planes is that they are very stable. It’s hard to make them turn and dance about, and this makes them easy to shoot down. They needed to be more agile, harder to shoot down, and the solution was to make them less stable. The result, by the end of WWI, was the fairly unstable triplanes we associate with the Red Baron. Yes, this made them harder to fly, and even harder to land, but it also made them harder to hit.

Whizz forward to the modern day and we find that all modern fighters are unstable by design. They’re so unstable that they’re unflyable without modern fly-by-wire systems. Forget about landing: you couldn’t even get them off the ground without their fancy control systems. The governance of the fly-by-wire systems lets the pilot control the uncontrollable.

The problem with modern IT is that it is too stable. Not the parts, the individual applications, but the IT estate as a whole. We’ve designed agility out of it, focusing on creating a stable and efficient platform for lobbing bombs onto the enemy below. This is great if the landscape below us doesn’t change, and the enemy promises not to move or shoot back, but not so good in today’s rapidly changing business environment. We need to be able to rapidly turn and dance about, both to dodge bullets and to pounce on opportunities. We need some instability, as instability means that we’re poised for change.

Jugaad points out that we need to allow in a bit of chaos if we want to bring the agility back in. The chaos jugaad provides is the instability we need. This will require us to update our governance processes, evolving them beyond simply being a tool to stop the bad happening, transforming governance into a tool for harvesting the jugaad where it occurs. After all, the role of enterprise IT is to capture good ideas and automate them, allowing them to be leveraged across the entire enterprise.

Managing chaos has become something of a science in the aircraft world. Tools like Energy-Maneuverability (E-M) theory are used during aircraft design to make informed tradeoffs between weight, weapons load, amount of wing (i.e. ability to turn), and so on. This goes well beyond most efforts to map and score business processes, which take an inherently static, pieces-and-parts, cost-driven approach. Our focus should be on using different technologies and delivery approaches to shape how our IT estate responds to business change: optimising our IT estate’s dynamic, change-driven characteristics as well as its static, cost-driven ones.
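To make the analogy concrete, the central quantity in E-M theory is specific excess power: how much energy an aircraft has in reserve to climb, accelerate or turn. A minimal sketch of the standard formula follows; the function name and the illustrative numbers are mine, not taken from any real aircraft.

```python
def specific_excess_power(thrust_n: float, drag_n: float,
                          velocity_ms: float, weight_n: float) -> float:
    """Specific excess power, Ps = (T - D) * V / W, in metres per second.

    A positive Ps means the aircraft has surplus energy to climb or
    accelerate (i.e. to manoeuvre); a negative Ps means it is bleeding
    energy and losing its options.
    """
    return (thrust_n - drag_n) * velocity_ms / weight_n

# Illustrative numbers: 20 kN of surplus thrust at 200 m/s on a
# 100 kN aircraft leaves 40 m/s of climb-rate potential.
print(specific_excess_power(50_000, 30_000, 200, 100_000))  # → 40.0
```

Designers trade this figure off against weight and weapons load; the business analogue would be trading an IT estate’s capacity for change off against its running cost.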

This might be the root of some of the problems we’re seeing between business and IT. IT’s tendency to measure value in terms of cost and/or stability leads us to create IT estates optimised for a static environment, which are at odds with the dynamic nature of the modern business environment. We should be focusing on the overall dynamic business performance of the IT estate, its energy-maneuverability profile.

Time for a new covenant between business and IT

Gartner have suggested that by 2012, 20% of companies will own no IT assets. At the same time we have Forrester predicting a boom in IT. I think both of them are right, and what we’re seeing is a breaking of the old covenant between business and the IT services industry (which includes internal IT departments). The old relationship was founded on the development and maintenance of IT assets (networks, applications, desktops …). The new one will be founded on something different. The new IT industry is going to be a different beast (i.e. no more strategic transformation or infrastructure projects), and we’ll need to radically reconfigure our organisations if we want to play a part.

Posted via web from PEG @ Posterous

Innovation [2010-01-18]

Another week and another collection of interesting ideas from around the internet.

As always, thoughts and/or comments are greatly appreciated.

Is Generation X/Y/Z irrelevant?

Generational distinctions seem to make less and less sense every year. While my grandmother never learnt to drive a car, my mother happily uses a computer and the Internet. Yes, the pace of change has sped up, but it appears that so have we. Age is a very crude factor, and as we shift to increasing personalisation, age looks less and less relevant as a driver for change.

Why then do we persist in reporting on how each generation’s habits and predilections will transform the workplace, school or retirement village, when in reality these institutions seem to be growing closer together rather than further apart? Competition in the workplace is the main driver for change, with individuals adopting the tools and techniques they need to get the job done, whatever generation they are from.

There’s been a lot of talk about how the next generation (whichever that happens to be) is going to change the world. We had it with the Greatest Generation. We had it with the Pre-Boomers and Baby Boomers. We had it with Gen X. Now we have it with Gen Y. This might have made sense some time ago, when changes in social mores and practices took longer than a single generation. Change takes time, and if the pressure is only gentle then we can expect significant time to pass before the change is substantial.

I remember my grandmother, who never learnt to drive. Back in the day, before World War II, women driving was not the done thing. My grandmother never learnt to use a video recorder, computer, or the Internet, either. The pressure to change was gentle, and she was happy with her lot.

Sociologists now tell us that the differences between populations are often smaller than the differences within them. Or, put another way, on aggregate we’re all pretty much the same. The same is true for my grandmothers. While one never learnt to drive (among other things), my other grandmother charted a different course. No, she never learnt to use the Internet, but she did take the time, when her husband went off to war, to learn how to drive, and they both had a bit of a crush on Cary Grant.

If we whizz forward to the present day, we can see the same dynamics at work. My parents have, in the course of only a few years, leapt from a technology-free zone to being the proud owners of laptops, a wireless network, and a passion for doing their own video editing. Even my mother-in-law, who has zero experience with technology, bought a Wii recently. She also seems to have more luck with the Wii than with her video recorder, which she’s never been able to work.

The idea that technology adoption is generational seems to have eroded to the point of irrelevance. There was even a report recently (by Cisco I think, though I can’t find the link) where the researchers could find no significant correlation between new technology adoption and generational strata.

Why then do we persist in pigeonholing generations when it has proven to be counterproductive? Not all Gen Xers want to kill themselves. I’m a Gen Xer, I even like Nirvana, and I’ve yet to have that urge. Not all Gen Ys want to publish their lives on Facebook. And not all baby boomers want to be helicopter parents. The only thing these media stories accomplish by promoting stereotypes is massaging the egos of their target demographic. To divide people into generations and say that this generation likes certain tools and techniques, while that generation doesn’t and will never adapt, is naive.

If we must categorise people, then it makes more sense to use something like NEOs to divide the population into vertical groups based on how we approach life. Do you like change? Do you not? Do you value your privacy? Are you willing to put everything out in public? And so on…

The pace of change has accelerated to the point that everyone’s challenge, from Pre-Boomers and Baby Boomers through to Generation Z, is how to cope with significant change over the next ten years. If we are, as some predict, moving to an innovation economy, then it is the ability to adapt that matters most. Those betting their organisation on a generational change will be sadly disappointed, as no generation has a monopoly on coping with change.

A more productive approach is to seek out the people from all generations who thrive in change, and aim for a diverse workforce so that you can tap into the broad range of skills this diversity will provide. Ultimately competition in the workplace is the main determinant for change, with individuals adopting the tools and techniques they need to get the job done, whatever generation they are from.

Updated: Elliot Ross pointed out some interesting research and analysis by Forrester. Forrester coined the term Technographics in their Groundswell work, capturing how different people adopt social technologies. There’s even a nice tool which enables you to slice and dice the demographics. I’ve added the tool below, and highly recommend taking a look at Forrester’s work.

Updated: Mark Bullen over at Net Gen Skeptic does a nice job of bringing some evidence to the debate, with Six reasons to be sceptical.

Reducing costs is not the only benefit of cloud computing & SaaS

The wisdom of the crowd seems to have decided that both cloud computing and its sibling SaaS are cost plays. You engage a cloud or SaaS vendor to reduce costs, as their software utility has the scale to deliver the same functionality at a lower price point than you could do yourself.

I think this misses some of the potential benefits that these new delivery models can provide, from reducing your management overhead, allowing you to focus on more important or pressing problems, through to acting as a large flex resource or providing you with a testbed for innovation. In an environment where we’re all racing to keep up, the time and space we can create through intelligently leveraging cloud and SaaS solutions could provide us with the competitive advantage we need.

Samuel Insull

Cloud and SaaS are going to take over the world, or so I hear. And it increasingly looks that way, from Nicholas Carr‘s entertaining stories about Samuel Insull through to Google and Amazon‘s attempts to box up SaaS and cloud for easy consumption. These companies’ massive economies of scale enable them to deliver commoditised functionality at a dramatically lower price point than most companies could achieve with even the best on-premises applications.

This simple fact causes many analysts to point out the folly of creating a private cloud. While a private cloud enables a company to avoid the security and ownership issues associated with a public service, it will never realise the same economies of scale as its public brethren. It’s these economies of scale that enable companies like Google to devote significant time and effort to finding new and ever more creative techniques to extract every last drop of efficiency from their data centres, techniques which give them a competitive advantage.

I’ve always had problems with this point of view, as it ignores one important fact: a modern IT estate must deliver more than efficiency. Constant and dramatic business change means that our IT estate must be able to be rapidly reconfigured to support an ever-evolving business environment. This might be as simple as scaling up and down in line with changing transaction volumes, but it might also involve rewriting business rules and processes as the organisation enters and leaves countries with differing regulatory regimes, as well as adapting to mergers, acquisitions and divestments.

Once we look beyond cost, a few interesting potential uses for cloud and SaaS emerge.

First, we can use cloud as a tool to increase the flexibility of our IT estate. Using a standard cloud platform, such as an Amazon Machine Image, provides us with more deployment options than more traditional approaches. Development and testing can be streamlined, compressing development and testing time, while deployed applications can be migrated to the cloud instance which makes the most sense. We might choose to use public cloud for development and testing, while deploying to a private cloud under our own control to address privacy or political concerns. We might develop, test and deploy all into the public cloud. Or we might even use a hybrid strategy, retaining some business functionality in a private cloud, while using one or more public clouds as a flex resource to cope with peak loads.
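The last option, using public cloud as a flex resource, boils down to a simple sizing rule: serve steady-state load privately, and burst the overflow out during peaks. Here is a minimal sketch with made-up capacity figures and names; a real deployment would translate the result into calls to a provider API such as Amazon EC2’s.

```python
import math

# Hypothetical figures, measured in requests per second (rps).
PRIVATE_CAPACITY_RPS = 10_000   # what the private cloud can absorb
PUBLIC_INSTANCE_RPS = 500       # capacity of one public-cloud instance
HEADROOM = 0.8                  # burst once the private cloud is 80% loaded


def public_instances_needed(load_rps: float) -> int:
    """How many public-cloud instances to add for the current load.

    Below the headroom threshold everything stays on the private
    cloud; above it, the overflow is sized in whole public instances.
    """
    threshold = PRIVATE_CAPACITY_RPS * HEADROOM
    if load_rps <= threshold:
        return 0
    return math.ceil((load_rps - threshold) / PUBLIC_INSTANCE_RPS)


print(public_instances_needed(7_000))  # → 0 (private cloud copes alone)
print(public_instances_needed(9_000))  # → 2 (1,000 rps of overflow)
```

The headroom parameter is the interesting design choice: it trades the cost of idle private capacity against how quickly a peak forces you out to the public cloud.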

Second, we can use cloud and SaaS as tools to increase the agility of our IT estate. By externalising the management of our infrastructure (via cloud), or even the management of entire applications (via SaaS), we can create time and space to worry about more important problems. This enables us to focus on what needs to happen, rather than how to make it happen, and rely on the greater scale of our SaaS or cloud provider to respond more rapidly than we could if we were maintaining a traditional on-premises solution.

And finally, we can use cloud as the basis of an incubator strategy where an organisation may test a new idea using externalised resources, proving the business case before (potentially) moving to a more traditional internal deployment model.

One problem I’ve been thinking about recently is how to make our incredibly stable and reliable IT estates respond better to business change. Cloud and SaaS, with the ability to shape the flexibility and agility of our IT estate to meet what the business needs, might just be the tools we need to do this.

Information overload

We’re drowning in information, as I’ve written about before, both in the context of Business Intelligence and Innovation (whatever that is). An interesting blog post by Tim Kastelle over at his Innovation Leadership Network takes the somewhat contrarian view that we have always had this information overload problem. Quoting Stowe Boyd, he points out:

I suggest we just haven’t experimented enough with ways to render information in more usable ways, and once we start to do so, it will likely take 10 years (the 10,000 hour rule again) before anyone demonstrates real mastery of the techniques involved.

The problem is that our current tooling for information processing is not up to the task at hand. Unfortunately Tim, like most of us, is still trying to find the best way to manage the information load pressing down on us.

Any suggestions?

Posted via email from PEG @ Posterous

Security theater and the value of information

There’s an interesting post over at Bruce Schneier’s blog where he discusses where security did, and didn’t, work with the Christmas underwear bomber incident. As is his usual inclination, he points out that the threat wasn’t new, that security (on the whole) worked, and, of interest to us, that more information would not have helped prevent the threat.

After the fact, it’s easy to point to the bits of evidence and claim that someone should have “connected the dots.” But before the fact, when there are millions of dots – some important but the vast majority unimportant – uncovering plots is a lot harder.

This is a lot like the challenge we’ve been talking about under the banner of The value of information. How do we make sense of the weak, conflicting and voluminous signals we see in the environment outside our business, fuse them with strong signals from data inside the business, and create real insight? Granted, sometimes we’re aware of the signals (or at least the shape of their outline) we need to go looking for, much like Tesco’s decision to integrate weather forecasts and historical till information to predict customer demand. In other circumstances, we’re not so sure what we’re looking for. The business equivalent of predicting (and responding to) the underwear bomber might be managing exceptions in a complex, global supply chain, countering a competitor’s new product launch, or supporting a social case worker dealing with an unexpected crisis in a client’s domestic situation.

It’s tempting to create countermeasures – prescriptive workflows designed to resolve a problem – for each of these scenarios on a case-by-case basis. Or even just throw up our hands and continue with the tribal processes of old. But, as Bruce points out, this doesn’t work. The challenge with taking action against specific threats is that the terrorist will simply use a new tactic next time, or you’ll be confronted with yet another situation. Soon you’ll have overloaded your knowledge workers with exception scenarios which only address yesterday’s problems. You’ve started an arms race which you cannot win.

Bruce’s solution, in the context of security, is to integrate information into an operational decision making framework which wards against generic attacks.

What we need is security that’s effective even if we can’t guess the next plot: intelligence, investigation and emergency response.

This prompts me to think of two things:

First, we might need to add a third dimension, Precision, to that figure from Inside vs. Outside, to complement Inside/Outside and Information Age. (Here, the engineer in me is going to split hairs over the definitions of focus, precise and accurate.) This new dimension captures how precise our need is. The Tesco example from above prefers precise signals: signals which communicate a single message. The exception manager might require an imprecise signal: a derivative communicating a generic message, generated by correlating a number of (im)precise signals. (A note of caution, though: remember the recent impact of derivatives on the global financial markets.)

Second, we might want to rethink how we conceptualise and use information in our business. We currently have a very linear view, with information generation and consumption tightly connected to the stages of our value chain. It would be interesting to see how some of the ideas and frameworks behind the value of information could be fused with a decisioning framework like OODA. This would provide a tool to simplify the (potentially too complex) value of information framework, and realise it in operational work practices.

I’m not sure about the first point, but I expect the second will be fertile ground for further investigation.
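As one possible starting point for that investigation, an OODA-style loop can be sketched as a pipeline of pluggable stages. This is purely schematic; the stage functions are placeholders for whatever signal fusion and decisioning logic a business would actually plug in.

```python
from typing import Any, Callable


def ooda_cycle(observe: Callable[[], Any],
               orient: Callable[[Any], Any],
               decide: Callable[[Any], Any],
               act: Callable[[Any], Any]) -> Any:
    """Run one pass of Boyd's Observe-Orient-Decide-Act loop.

    Each stage is a pluggable function: signal fusion (the 'value of
    information' piece) lives in orient(), while operational work
    practices live in act().
    """
    raw_signals = observe()          # gather signals from inside and outside the business
    situation = orient(raw_signals)  # fuse weak and strong signals into one picture
    action = decide(situation)       # choose a timely (not perfect) response
    return act(action)               # execute, changing the environment for the next pass


# Toy example: observe some demand figures, orient by summing them,
# decide on double the total, and act by recording the order.
orders: list = []
ooda_cycle(observe=lambda: [1, 2, 3],
           orient=sum,
           decide=lambda total: total * 2,
           act=orders.append)
print(orders)  # → [12]
```

In practice the loop would run continuously, with each pass’s actions feeding back into the next pass’s observations.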

Posted via web from PEG @ Posterous

Innovation [2010-01-04]

Another week and another collection of interesting ideas from around the internet.

As always, thoughts and/or comments are greatly appreciated.

  • Cisco’s Patent Strategy: It’s More Than Numbers [BusinessWeek: NEXT]
Innovation—at least as measured by patents—seems to be fading in the U.S. For the first time, moreover, foreigners obtained more patents than U.S. residents.
  • Technology First, Needs Last [jnd]
    Don Norman has come to an interesting conclusion: design research is great when it comes to improving existing product categories but essentially useless when it comes to new, innovative breakthroughs.
  • Boyer Lectures [Radio National]
    General Peter Cosgrove, AC MC (ret’d) presented the Boyer Lectures, from 8 November 2009, with his 40 years of military experience and service to Australia placing him in a unique position to talk about the challenges and opportunities faced by society today and into the future.
  • Head to Head: Innovation in China and the US [Innovate on Purpose]
    A survey comparing the attitudes and expectations about the US and China in regard to innovation finds some relatively unexpected differences, and some safe assumptions.