Category Archives: Technology and its malcontents

Platforms are the new fool’s gold


I have a new post up on the Deloitte Strategy blog, which I wrote with Richard Millar.

Platforms are all the rage. In the modern digital economy many organisations are looking to create platforms, rather than simply building a traditional value-chain driven company (otherwise known as a ‘pipe’).

In this context, a platform is a business model designed to facilitate exchanges between interdependent groups; as opposed to a pipe, which is centred on the sourcing, production and distribution process. The successful companies of the past focused on controlling distribution (something which is increasingly difficult in our highly interconnected digital world), while it’s thought that successful companies in the future will focus on controlling access to customers (which they can do by creating a platform that attracts the best customers).

Platforms are where the smart money is going (particularly if your platform is seen as scalable). There’s even a Platform Strategy Summit where you can learn the tricks that will make your platform successful.

This recent obsession with platforms raises some concerns though, as it seems to confuse cause and effect.

You can find the entire text over at the Strategy blog.

Has Apple made NFC irrelevant?

In The future of exchanging value{{1}} I, along with Peter Williams and Ian Harper at Deloitte, pointed out that a successful retail payments strategy should be founded on empowering consumers and merchants to transact when and where they want to. Investing in technologies such as near-field communication (NFC) networks might allow you to shave a couple of seconds off the transaction time once the customer is at the till, but it ignores the fact that consumers are increasingly transacting away from the till, as mobile phones and ubiquitous connectivity let them buy when and where they choose.

[[1]]Peter Evans-Greenwood, Ian Harper, Peter Williams (2012), The future of exchanging value, Deloitte[[1]]

We are seeing a shift from technology acquisition to technology use. Rather than building a payment strategy around the acquisition of a new technology (such as NFC), a successful strategy needs to be based on streamlining the buying journey. While NFC might enable the consumer to save a few seconds at the till, it does not address the far longer time they spend waiting in the queue beforehand. A more valuable solution might avoid the need to queue entirely. This is a design-led approach, focused on the overall problem the customer is solving and the context in which they are solving it. Technologies are pulled into the payment strategy as needed, rather than the strategy being built around the acquisition of an asset or capability.

Amazon used this approach with the development of the company’s mobile application, one that allows you to snap an image of a barcode to purchase a product. Bricks-and-mortar retailers see this as showrooming and unsportsmanlike. Many consumers, however, love the idea.

As I pointed out in The destruction of traditional retail{{2}}:

[[2]]The destruction of traditional retail @ PEG[[2]]

If you’re standing in an aisle casually browsing products then Amazon’s till is closer to you than the one at the front of the store. You also don’t need to worry about carrying your purchase home.

The challenge for retailers (from The future of exchanging value) is to:

… manage a portfolio of technologies, from existing payment infrastructure through NFC to emerging tools, combining them to enable customers to transact when and how they need to.

The way for bricks-and-mortar retailers to fight showrooming is to use a range of low-cost consumer technologies to make it more convenient to transact with them than with an internet retailer.

Apple showed how this might be done during the What’s New in Core Location presentation at the company’s recent Worldwide Developers Conference.

Imagine you walk into Jay’s Donut Shop. iBeacons from Core Location are accurate enough for the retailer to be sure that you have walked in, while other location technologies (such as GPS or those based on Wi-Fi) could, at best, provide a list of guesses. You don’t even need to check in. You could order your donuts before you even enter the shop. When you reach the counter your iPhone would display a QR code that a clerk uses to verify the purchase. You grab your donuts and leave, the transaction charged to your iTunes account and your receipt already on your phone.
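How a beacon can tell the counter from the footpath comes down to radio signal strength. As a rough illustration (this is not Apple’s Core Location code; it’s the standard log-distance path-loss model that beacon ranging is typically built on, with assumed calibration values):

```python
# Illustrative sketch only: iBeacon-style ranging commonly rests on the
# log-distance path-loss model. A beacon advertises its calibrated RSSI
# at one metre ("measured power"); the receiver compares that with the
# RSSI it actually observes to estimate how far away the beacon is.

def estimate_distance(rssi: float, measured_power: float = -59.0,
                      path_loss_exponent: float = 2.0) -> float:
    """Approximate beacon distance in metres from an RSSI reading.

    The default measured_power and path_loss_exponent are assumed,
    environment-dependent values, not figures from Apple.
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

# A reading near the calibrated one-metre power puts the customer at the
# counter; a much weaker reading puts them out on the street.
print(round(estimate_distance(-59.0), 1))  # 1.0 (about a metre away)
print(round(estimate_distance(-79.0), 1))  # 10.0 (still outside)
```

Core Location hides this arithmetic behind coarse proximity buckets (immediate, near, far), which is why iBeacons can confidently answer “has this customer walked in?” where GPS and Wi-Fi positioning can only guess.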

As Mike Elgan points out in his post Why Apple’s ‘indoor GPS’ plan is brilliant{{3}}, it’s not much of a stretch to consider some much more interesting scenarios.

[[3]]Mike Elgan (14th September 2013), Why Apple’s ‘indoor GPS’ plan is brilliant, Computer World.[[3]]

A customer could scan the labels on clothing, process the transaction on the phone, then stroll out of the store with purchases in hand (the alarm would be de-activated for those items).

This is a solution that could be supported tomorrow on every iPhone from the 4S through to the new 5C. The hardware required to create an iBeacon is already available and cheap, often costing just tens of US dollars.

NFC continues to struggle, and it seems that Apple might have pulled together a solution that makes it irrelevant.

BPM over promised and under delivered

One Saturday night the other week I was typing away on a book that I’m working on (probably called The new instability: how cloud computing, globalisation and social media enable you to create an unfair advantage) and I let out what was probably a quite involved tweet without any context to explain it.

[blackbirdpie id=”59188145870213120″]

Recently I’ve been thinking about the shift we’re seeing in the business environment. The world seems pretty unstable at the moment. Most business folk assume that this is simply a transition between two stable states, similar to what we’ve seen in the past. This time, however, business seems to be unable to settle into a new groove. The idea behind the book is that the instability we’re seeing is now the normal state of play.

Since Frederick Taylor’s time we’ve considered business – our businesses – as vast machines to be improved. Define the perfect set of tasks and then fit the men to the task. Taylor timed workers, measuring their efforts to determine the optimal (in his opinion) amount of work he could expect from a worker in a single day. The idea is that by driving our workers to follow optimal business processes we can ensure that we minimise costs while improving quality. Lean and Six Sigma are the most visible of Taylor’s grandchildren, representing generations of effort to incrementally chip away at the inefficiencies and problems we keep finding in our organisations.

This is the same mentality – incremental and internally focused, intent on optimising each and every task in our organisations – that we’ve used to apply technology to business. Departmental applications were first deployed to automate small repetitive tasks, such as tracking stock levels or calculating payrolls. Then we looked at the interactions between these tasks, giving birth to enterprise software in the process. Business Process Management (BPM) is the pinnacle of our more recent efforts – pulling in everything from our customers through to our suppliers to create optimal straight-through processes for our organisation to rely on.

Some vendors have taken this approach to its logical extreme, imagining (and trying to get us to buy) a single technology platform which will allow us to programme our entire business: business operating platforms [1]. They’re aligning elements in the BPM technology stack with the major components found in most computers, under the (mistaken) assumption that this will enable them to create a platform for the entire business. Business as programmable machine writ large.

The problem, as I’ve pointed out before [2], is that:

Programming is the automation of the known. Business processes, however, are the management and anticipation of the unknown.

Business is not a computer, with memory, CPUs and disks, and the hope of creating an Excel with which we can play what if with the entire business is simply tilting at windmills.

The focus of business is, and always has been, problems and the people who solve them. Technology is simply a tool we’ve used to amplify these people, starting with the invention of writing through to modern SaaS applications and BPM suites. While technology has had a previously unimaginable impact on business, it can’t (yet) replace the people who solve the problems which create all the value. People collaborate, negotiate, and smash together ideas to find new solutions to old problems. Computers simply replicate what they are told to do.

We’ve reached Taylorism’s use-by date. Define the perfect task and fit the man to the task no longer works. The pace of business has accelerated to the point that the environment we operate in has become perpetually unstable, and this is pushing us to become externally focused, rather than internally focused. We’ve stopped worrying about collecting resources so we can focus on our reactions to problems and opportunities as they present themselves. Computing (calculating payrolls, invoices, or gunnery tables) is less important as it can be obtained on demand, and we’re more concerned with the connections between ourselves and our clients, partners, suppliers and even our competitors. And we’ve shifted our focus from collecting ever more data, as it becomes increasingly important to ask the questions which enable us to make the right decisions and drive our business forward.

Success in today’s unstable environment means matching the tactic – the process – to the goal we’re trying to achieve and to our current environment, with different tactics being used in different circumstances. Rather than supporting one true way, we need to support multiple ways.

There have been some half-steps in the right direction, with the emergence of Adaptive Case Management (ACM) [3] being the most obvious one. A typical case study for ACM might be something like resolving SWIFT payment exceptions. When the ACM process is triggered a knowledge worker creates a case and starts building a context, pulling data in and triggering small workflows or business processes to seek out data and resolve problems. At some stage the context will be complete, the exception resolved, and the final action is triggered. Contrast this with the standard BPM case study, which is typically a compliance story. (It’s no surprise that regulations such as SOX drove a lot of business process work.) BPM is a task-dependency tool, making it very good at specifying the steps in a required process, but unable to cope with exceptions.
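The contrast can be sketched in a few lines of entirely hypothetical Python (the names below are invented for illustration, not taken from any ACM product). Where a BPM engine executes a fixed graph of steps, an ACM case is just an accumulating bag of context: the knowledge worker chooses which small workflows to run, and it is their judgement that closes the case.

```python
# Hypothetical sketch of the ACM pattern described above. Unlike a BPM
# process definition (a fixed sequence of tasks), the case accumulates
# context; the knowledge worker decides which workflows to trigger and
# when the exception is resolved.

class Case:
    def __init__(self, subject: str):
        self.subject = subject
        self.context = {}        # data gathered so far
        self.resolved = False

    def run_workflow(self, name, workflow):
        """Run a small workflow and fold its result into the context."""
        self.context[name] = workflow(self.context)

    def resolve(self, is_complete) -> bool:
        """The worker's judgement, not the process graph, closes the case."""
        self.resolved = is_complete(self.context)
        return self.resolved

# Worked example for the SWIFT-exception scenario (illustrative names):
case = Case("SWIFT payment exception")
case.run_workflow("counterparty", lambda ctx: {"verified": True})
case.run_workflow("amount_check", lambda ctx: {"matches_invoice": True})
done = case.resolve(lambda ctx: all(v for d in ctx.values() for v in d.values()))
print(done)  # True
```

The design choice worth noticing is that nothing in `Case` fixes the order or number of workflows; that flexibility is exactly what a task-dependency BPM definition cannot offer when an exception falls outside the modelled paths.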

So what do we replace Taylorism’s catch-cry with? The following seems to suit, rooted as it is in the challenge of winning in a rapidly changing environment.

Identify the goal and then assemble the perfect team to achieve the goal.

Note: This was also posted on noprocess.org.

References

1. Ismael Ghalimi (2009), Introducing the Business Operating Platform, IT|Redux
2. Business is not a programming challenge @ PEG
3. Keith D. Swenson (2010), Mastering the Unpredictable, Meghan-Kiffer Press.

Open Data might have failed, but Open Government is still going strong.

It would seem that the shine is starting to wear off the Open Government movement, with a recent report to the U.S. Congress (The Obama Administration’s Open Government Initiative: Issues for Congress [PDF]) challenging some of the assumptions behind the directive from the U.S. Open Government Office forcing U.S. departments to publish their data sets. The report found that simply pushing out data has negative outcomes as well as positive ones (which should be no surprise), and that often the benefits of pushing out (and maintaining) a data set didn’t justify the cost. Most importantly, it raised the question of whether or not publishing these data sets was a good use of the public’s money.

So, has the business case behind Open Government been found lacking in the harsh light of day? Or is this one of those cases where some faith is required – as with the investment in the U.S. highway network – because the benefits of stepping into the unknown can’t be captured by the crude mechanism of ROI? The truth seems to lie somewhere between the two.

I wouldn’t confuse the investment in the U.S. road network post-WWII (or Australia’s current investment in the NBN) with Open Government. The former was an investment in an asset which the U.S. government of the time made largely on faith, an investment which is currently seen to be returning $14 billion to the U.S. economy annually. (Australia’s NBN might be heading on a similar journey; see The NBN wants to be free @ PEG.) The latter is actually a philosophical point of view about an approach to government.

The problem is that we confuse “Open Data” with “Open Government”. They’re related, but not the same. Open Government is a move to streamline service acquisition and delivery by exposing the bureaucracy of government and integrating it more tightly with other service providers, and has been progressing nicely for a decade or more now. Open Data is a desire to change the relationship between government and the population, reducing the government to a simple data conduit between the public (or corporations) providing services and the public consuming them.

Open Government has made government easier to deal with by making it easier to find and consume the services you need, and by fostering community. Everything from applying for the dole, or getting a grant, through to organising a council-supported street party is orders of magnitude easier than it was a few decades ago, mainly due to increased transparency. This has been delivered via a range of means, from publishing information online, through providing better explanations for the services offered, to promoting multi-channel access and self-service delivery. The latest wave of Open Government is seeing departments integrating external services with their own, putting even more data out in public in the process, as they move from being a service provider to a service enabler. Ultimately though, if government (as separate from politics) is focused on keeping folk fed and feeling safe then it’s doing its job. It’s basic Maslow (see Maslow’s hierarchy of needs @ Changing Minds).

Open Data, though, is based on the view that government should do as little as possible, hand over the data, and let individuals in the public get on with doing what they want. It’s claimed that this will provide transparency (the public has all the data, after all) as well as fostering entrepreneurs to provide innovative solutions to the many problems that confront us today.

It’s quite possible to have transparency and Open Government without needing to publish all your data sets, and maintain the published versions, as the Open Data proponents claim. People need to understand how the wheels of government turn if they want to trust it, and the best way of doing this is usually through key figures and analysis which build a story and name the important players. Drowning people in data has the opposite effect, hiding government operation behind a wall of impenetrable detail. Wikileaks was a great study in this effect, as it was only when traditional journalists became involved, with their traditional analysis and publication weaving together a narrative the broader public could consume, that the leaks started to have a real impact. (It’s also interesting that the combination of the anonymous drop boxes being created by conventional media, and Open Leaks‘ anonymous mass distribution to conventional media, looks to be a more potent tool than the ideologically pure Wikileaks.)

Nor is treating government as an integration medium the only way to solve the world’s problems. While entrepreneurs and VCs might be the darlings of the moment, there are many other organisations and governments which are also successfully chipping away at these problems. For every VC-backed Bloom Box that has mastered marketing hype, there’s a more boring organisation that might have already overtaken it (New Solid Oxide Fuel Cell System Provides Cheap Grid Energy From CNG and Biogas @ IB Times UK). The entrepreneur model will be part of the solution, but it’s not the silver bullet many claim it to be.

The problem is that Open Data is the result of a libertarian political mindset, rather than being a solution to a pressing need. Forcing government to publish all its data sets does not provide or guarantee transparency, nor does it have a direct impact on the services offered by the government. It can also consume significant government resources that might be better spent providing services that the community needs. Publish a data set of no obvious value, or build a homeless shelter? Invest in Semantic Web-enabling another data set few use, or pay for disaster relief? These are the trade-offs that people responsible for the day-to-day operation of government are forced to make. Claims by folk like Tim Berners-Lee that magic will happen once data is out there and ontology-enabled have proven to be largely wrong.

However, Open Data does align with a particular political point of view. Open Data assumes that we, as a population, want such a small-government model, an assumption which is completely unjustified. Some people trust, and want, the government to take responsibility for a lot of these services. Some want to meet the government somewhere in the middle. Open Data tries to force a world that works in shades of grey into a black-or-white choice driven by a particular world view.

Deciding what and how much the government should be responsible for is a political decision, and it’s one that we revisit every time we visit the ballot box. Each time we vote we evolve, by a small amount, the role government plays in our lives (see What is the role of Government in a Web 2.0 world? @ PEG). (Occasionally we avoid the ballot box and revolt instead.) Should government own the roads? The answer appears to still be yes. Should government own power stations? Generally, no. Should they own the dams? We’re still deciding that one.

It’s in the context of the incremental and ongoing evolution of government’s role in our lives that we can best understand Open Data. Forcing Open Data onto government through mandate (as Obama did) was a political act driven by a desire to force one group’s preferred mode of operation on everyone else. You might want Open Data, but other people have differing priorities. Just because they disagree doesn’t make them wrong. The U.S. congressional report is the mechanism of government responding by documenting the benefits Open Data brought, the problems it caused, and the cost. The benefits (or not) will now be debated, and its future decided at the ballot box.

Open Government is alive and well, and is driving the evolution of government as we know it. Services are being improved, governments are increasingly integrating their services with those of the private sector, and more data will be released to support this. The assumption that all government data should remain secret unless proven otherwise has been flipped, and many public servants now assume that data should be made public unless there’s a good reason not to publish. Government is investing in moving specific information assets online, where it makes sense, and departments are opening up to social media and much closer involvement with (and scrutiny by) the public. The mechanism of government is evolving, and this is a good thing.

Open Data, though, as an expression of a political point of view, looks like it’s in trouble.


Social media: bubble, definitely not; revolution, probably not; evolution, absolutely

Is social media in general (and mobility in particular) a bubble or a revolution? Is it a powerful and disruptive force that will transform governments and social organisations? Or is it neither? There seem to be a few{{1}} people{{2}} pondering this question.

[[1]]The video above is less than a minute long. Please … @ bryan.vc[[1]]
[[2]]Is The Mobile Phone Our Social Net? @ AVC[[2]]

Mobile phones are interesting as they are addressable. Two-way radios made communication mobile a long time ago, but it wasn’t until mobile phones (and cheap mobile phones, specifically) that we could address someone on the move, or someone on the move could address a stationary person or service.

The second and third world showed us the potential of this technology over ten years ago, from fishermen using their phones to market and sell their catch while still on the boat, through to distributed banking based on pre-paid mobile phone cards. Image and video sharing is just the latest evolution in this.

The idea that this might be a revolution seems to be predicated on the technology’s ability to topple centrally planned and controlled organisations. Oddly enough, central planning is a bad enough idea to fall over on its own in many cases, and the only effect of mobile technology is to speed up a process which is already in motion. The Soviet Union might well be the poster child for this: collapsing under the weight of its own bureaucracy with no help from social media (or mobile phones, for that matter). Even modern democracies are not immune, and the US energy regulation policies leading up to deregulation in the late 70s are a great example of the failures of central planning{{3}}. The (pending) failure of some of today’s more centralised and authoritarian regimes would be more accurately ascribed to the inability of slow-moving, centrally managed bureaucracies to adapt to a rapidly changing environment. Distributed planning always trumps central planning in a rapidly changing environment.

[[3]]The Role of Petroleum Price and Allocation Regulations in Managing Energy Shortages @ Annual Review of Energy[[3]]

If we pause for a moment, we can see that governments do a few distinct things for us.

  • They provide us with what are seen as essential services.
  • They create a platform to enforce social norms (policies and laws).
  • They engage with the rest of the world on our behalf.

The reality is that many of the essential services government provides are provided by government because it’s too difficult or expensive for citizens (and, to some extent, corporations) to access the information they need to run these services themselves. Mobile phones (and social media) are just the latest in a series of technologies that have changed these costs, enabling companies and citizens to take responsibility for providing services which, previously, were the sole domain of government. From energy, water and telecoms, through FixMyStreet and the evolving use of social media in New Orleans, Haiti and then Queensland during their respective natural disasters, we can see that this is a long-running and continuing trend. Government is migrating from a role of providing all services to one where it helps facilitate our access to the services we need. Expect this to continue, and keep building those apps.

As a platform for agreeing and enforcing social norms, it’s hard to see anything replacing government in the short to mid term. (As always, the long term is completely up for grabs.) These social norms are geographical – based on the people you interact with directly on a day-to-day basis – and not virtual. Social media provides a mechanism for government to broaden the conversation. Some governments are embracing this; others, not so much. However, while people like to be consulted, they care a lot more about results. (Think Maslow’s Hierarchy of Needs{{4}}.) Singapore has a fairly restrictive and controlling government, which has (on the whole) a very happy population. China is playing a careful game of balancing consultation, control and outcomes, and seems to be doing this successfully.

[[4]]Maslow’s Hierarchy of Needs @ Abraham-Maslow[[4]]

Finally we come to the most interesting question: government as a means for us to engage with the rest of the world. In this area, government’s role has shrunk in scope but grown in importance. Globalisation and the Internet (as a communication tool) have transformed societies, making it cheaper to call friends across the globe than it is to call them around the corner. We all have friends in other countries, cross-border relationships are common, and many of us see ourselves as global citizens. At the same time, the solutions to many of today’s most pressing issues, such as global warming, have important aspects which can only be addressed by our representatives on the global stage.

So we come back to the question at hand: is social media a bubble, a revolution, or an evolution of what has come before?

It’s hard to see it as a bubble: the changes driven by social media are obviously providing real value so we can expect them to persist and expand. I was particularly impressed by how the Queensland government had internalised a lot of the good ideas from the use of social media{{5}} in the Victorian fires, Haiti et al.

[[5]]Emergency services embrace Social Media @ Social Media Daily[[5]]

We can probably discount revolution too, as social media is (at most) a better communication tool and not a new theory of government. (What would Karl Marx think?) However, by dramatically changing the cost of communication it is having a material impact on the role of government in our lives{{6}}. Government, and the society it represents, is evolving in response.

[[6]]The changing role of government @ PEG[[6]]

The challenge is to keep political preference separate from societal need. While you might yearn for the type of society that Ayn Rand only ever dreamed about, other people find your utopia more akin to one of Dante’s nine circles of hell. Many of the visions for Gov 2.0 are political visions – individuals’ ideas for how they would organise an ideal society – rather than views of how technology can best be used to support society as a whole.

China is the elephant in this room. If social media is a disruptive, revolutionary force, then we can expect China’s government to topple. What appears more likely is that China will integrate social media into its toolbox while it focuses on keeping its population happy, evolving in the process. As long as they deliver the lower half of Maslow’s Hierarchy, they’ll be fairly safe. After all, the expulsion of governments and organisations – the revolution that social media is involved in – is due to these organisations’ inability to provide for the needs of their population, rather than any revolutionary compulsion inherent in the technology itself.

A prediction: many companies will start shedding IT architects in the next six to eighteen months

Business is intensely competitive these days. Under such intense pressure strategy usually breaks down into two things: do more of whatever is creating value, and do less of anything that doesn’t add value. This has put IT architecture in the firing line, as there seems to be a strong trend for architects to focus on technology and transformation, rather than business outcomes. If architects are not seen as taking responsibility for delivering a business outcome, then why does the business need them? I predict that business will start shedding the majority of their architects, just as they did in the eighties. Let’s say in six to eighteen months.

I heard a fascinating distinction the other day at breakfast. It’s the difference between “Architects” and “architects”. (That’s one with a little “a”, and the other with a large one.) It seems that some organisations have two flavours of architect. Those with the big “A” do the big thinking and the long meetings, they worry about the Enterprise, Application and Technology Architectures, and are skilled in the use of whiteboards. And those with the little “a” do the documenting and some implementation work, with Microsoft Visio and Word their tool of choice.

When did we start trying to define an “Architect” as someone who doesn’t have any responsibility for execution? That’s a new idea for me. I thought that this Architect-architect split was a nice nutshell definition of what seems to be wrong with IT architecture at the moment.

We know that the best architects engage directly with the business and take accountability for providing solutions and outcomes the business cares about. However, splitting accountability between “Architects” and “architects” creates a structure and operation we know is potentially inefficient and disconnected from what’s really important. If the business sees architects (with either a big or little “a”) as not being responsible for delivering an outcome, then why does the business need them?

There’s a lot of hand wringing around the IT architecture community as proponents try to explain the benefits of architecture, and then communicate these benefits to the business. More often than not these efforts fall flat, with abstract arguments about governance, efficiency and business-technology alignment failing to resonate with the business.

“Better communication” might be pragmatic advice, but it ignores the fact that you need to be communicating something the audience cares about. And the business doesn’t care about governance, efficiency of the IT estate or business-technology alignment. You might: they don’t.

In my experience there are only three things that business does care about (and I generally work for the business these days).

  • Create a new product, service or market
  • Change the cost of operations or production
  • Create new interactions between customers and the company

And this seems to be the root of the problem. Neither IT efficiency, governance nor business-technology alignment is on that list. Gartner even highlighted this in a recent survey when they queried more than 1,500 business and technology executives to find out their priorities going forward.

Top 10 Business and Technology Priorities in 2010

Businesses need their applications — and are willing to admit this — but do they need better technical infrastructure or SOA (whatever that is)? How does that relate to workforce effectiveness? Will it help sell more product? Eventually the business will reach a point where doing nothing with IT seems like the most pragmatic option.

There are a few classic examples of companies that get by while completely ignoring the IT estate. They happily continue using decades-old applications, tweaking operational costs or worrying about M&A, and making healthy profits all the while. Their IT systems are good enough and fully depreciated, so why bother doing anything?

So what is the cost of doing nothing? Will the business suffer if the EA team just ups and leaves? Or if the business lets the entire architecture team go? The business will only invest in an architecture function if having one provides a better outcome than doing nothing. The challenge is that architecture has become largely detached from the businesses it is supposed to support. Architects have forgotten that they work for a logistics company, a bank or a government department, and not for “IT”. The tail is trying to wag the dog.

Defining Architecture (that’s the one with a big “A”) as a group who think the big technological thoughts, and who attend the long and very senior IT vendor meetings, just compounds the problem. It sends a strong message to the business that architecture is not interested in helping the business with the problems it is facing. Technology and transformation are seen as more important.

It also seems that the business is starting to hear this message, which means that action can’t be far behind. Unless the architecture community wakes up and reorganises around what’s really important — the things that the business cares about — then we shouldn’t be surprised when businesses start shedding the IT architecture functions that they see as adding no value. I give it six to eighteen months.

Michelangelo’s approach to workflow discovery

Take any existing workflow — any people-driven business process — and I expect that most of the tasks within it could best be described as cruft.

cruft: /kruhft/
[very common; back-formation from crufty]

  1. n. An unpleasant substance. The dust that gathers under your bed is cruft; the TMRC Dictionary correctly noted that attacking it with a broom only produces more.
  2. n. The results of shoddy construction.
  3. vt. [from hand cruft, pun on ‘hand craft’] To write assembler code for something normally (and better) done by a compiler (see hand-hacking).
  4. n. Excess; superfluous junk; used esp. of redundant or superseded code.
  5. [University of Wisconsin] n. Cruft is to hackers as gaggle is to geese; that is, at UW one properly says “a cruft of hackers”.

The Jargon File, v4.4.7

Capturing and improving a workflow (optimising it, even) is a process of removing cruft to identify what really needs to be there. This is remarkably like Michelangelo{{1}}’s approach to carving David{{2}}. When asked how he created such a beautiful sculpture, everything just as it should be, Michelangelo responded (and I’m paraphrasing):

[[1]]Michelangelo Buonarroti[[1]]
[[2]]Michelangelo’s David[[2]]

Michelangelo's David
Michelangelo’s David

David was always there in the marble; I just carved away the bits that weren’t David.

Cruft is the result of the people — the knowledge workers engaged in the process — dealing with the limitations of last decade’s technology. Cruft is the work-arounds and compensating actions for a fragmented and conflicting IT environment, an environment that gets in the way more often than it supports the knowledge workers. Or cruft might be the detritus of quality control and risk management measures put in place some time ago (decades, in many instances) to prevent an expensive mistake that is no longer possible.

Most approaches to workflow automation are based on some sort of process improvement methodology, such as Lean or Six Sigma. These methods work: I’ve often heard it stated that pointing Six Sigma at a process results in a 30% saving, each and every time. They do this by aggressively removing variation from the process — slicing away unnecessary decisions, as each decision is an opportunity for a mistake. These might be duplicated decisions, redundant process steps, or unnecessarily complicated handoffs.

There are a couple of problems with this, though, when dealing with workflow. Looking for what’s redundant doesn’t create an explicit link between business objectives and the steps in the workflow, a link that would justify each step’s existence, so it’s hard to be sure we’ve caught all the cruft. And the aggressive removal of variation can strip a process’s value along with its cost.

Much of the cruft in a workflow is there for historical reasons. These reasons range from “something bad happened a long time ago” through to “we don’t know why, but if we don’t do that then the whole thing falls over”. A good facilitator will challenge seemingly obsolete steps, identifying those that have served their purpose and should be removed. However, it’s not possible to justify every step without quickly wearing down the subject matter experts. Some obsolete steps will always leak through, no matter how many top-down and bottom-up iterations we do.

We can also reach the end of the process improvement journey only to find that much of the process’s value — the exceptions and variation that make the process valuable — has been cut out to make the process more efficient or easier to implement. In the quest for more science in our processes, we’ve eliminated the art that we relied on.

If business process management isn’t a programming challenge{{3}}, then this holds even truer for human-driven workflow.

[[3]]A business process is not a programming challenge @ PEG[[3]]

What we need is a way to chip away the cruft and establish a clear line of traceability between the goals of each stakeholder involved in the process, and each step and decision in the workflow. And we need to do this in a way that allows us to balance art and science.

I’m pretty sure that Michelangelo had a good idea of what he wanted to create when he started belting on the chisel. He was looking for something in the rock, the natural seams and faults, that would let him find David. He kept the things that supported his grand plan, while chipping away those that didn’t.

For a workflow process, these are the rules, tasks and points of variation that knowledge workers use to navigate their way through the day. Business rules and tasks are the basic stuff of workflow: decisions, data transformations and hand-offs between stakeholders. Points of variation let us identify those places in a workflow where we want to allow variation — alternate ways of achieving the one goal — as a way of balancing art and science.

Rather than focus on programming the steps of the process, worrying if we should send an email or a fax, we need to make this (often) tacit knowledge explicit. Working top-down, from the goals of the business owners, and bottom-up, from the hand-offs and touch-points with other stakeholders, we can chip away at the rock. Each rule, task or point of variation we find is measured against our goals to see if we should chip it away, or leave it to become part of the sculpture.

That which we need stays, that which is unnecessary is chipped away.
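The traceability idea can be sketched in a few lines of Python (all of the step names and goals below are invented for illustration): each rule, task or point of variation carries an explicit link back to a stakeholder goal, and anything that supports no goal is a candidate for the chisel.

```python
# Hypothetical sketch: each workflow step keeps an explicit link back
# to the stakeholder goals it supports. A step with no supporting goal
# is cruft, and a candidate for removal.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    kind: str                                      # "rule", "task" or "point of variation"
    supports: list = field(default_factory=list)   # stakeholder goals this step serves

workflow = [
    Step("check credit score", "rule", supports=["manage risk"]),
    Step("fax confirmation", "task"),              # no goal: a legacy work-around
    Step("choose delivery channel", "point of variation",
         supports=["serve customer preference"]),
]

keep = [s for s in workflow if s.supports]         # part of the sculpture
cruft = [s.name for s in workflow if not s.supports]
print(cruft)
```

Each top-down or bottom-up iteration adds or removes links, and the cruft list shrinks as the sculpture emerges.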

Taxonomies 1, Semantic Web (and Linked Data) 0

I’m not a big fan of the Semantic Web{{1}}. For something that has been around for just over ten years — and which has been aggressively promoted by the likes of Tim Berners-Lee{{2}} — very little of substance has come of it.

Taxonomies, on the other hand, are going gangbusters, with solutions like GovDirect{{3}} showing that there is a real need for this sort of data-relationship-driven approach{{4}}. Given this need, if the flexibility provided by the Semantic Web (and, more recently, Linked Data{{5}}) were really needed, then we would have expected someone to have invested in building significant solutions using the technology.

While the technology behind Semantic Web and Linked Data is interesting, it seems that most people don’t think it’s worth the effort.

All this makes me think: the future of data management and standardisation is ad hoc, with communities or vendors scratching specific itches, rather than formal, top-down, theory driven approaches such as Semantic Web and Linked Data, or even other formal standardisation efforts of old.

[[1]]SemanticWeb.org[[1]]
[[2]]Tim Berners-Lee on Twitter[[2]]
[[3]]GovDirect[[3]]
[[4]]Peter Williams on the The Power of Taxonomies @ the Australian Government’s Standard Business Reporting Initiative[[4]]
[[5]]LinkedData.org[[5]]

The technologies behind the likes of Semantic Web and Linked Data have a long heritage. You can trace them back to at least the seventies when ontology and logic driven approaches to data management faced off against relational methodologies. Relational methods won that round — just ask Oracle or the nearest DBA.

That said, a small number of interesting solutions have been built in the intervening years. I was involved in a few in one of my past lives{{6}}, and I’ve heard of more than a few built by colleagues and friends. The majority of these solutions used ontology management as a way to streamline service configuration, and therefore ease the pain of business change. Rather than being forced to rebuild a bunch of services, you could change some definitions, and off you go.

[[6]]AAII[[6]]

What we haven’t seen is a well-placed SPARQL{{7}} query that makes all the difference. I’m still waiting for the travel website where I can ask for a holiday, somewhere warm, within my budget, and without too many tourists who use beach towels to reserve lounge chairs at six in the morning; and get a sensible result.

[[7]]SPARQL @ w3.org[[7]]

The flexibility we could justify in the service delivery solutions just doesn’t appear to be justifiable in data-driven solutions. A colleague showed me a Semantic Web solution that consumed a million or so pounds’ worth of taxpayer money to build a semantically driven database for a small art collection. All this sophisticated technology would let users ask all sorts of subtle questions, if they could navigate the (necessarily) complicated user interface, or if they could construct an even more daunting SPARQL query. A more pragmatic approach would have built a conventional web application — one which would easily satisfy 95% of users — for a fraction of the cost.

When you come down to it, the sort of power and flexibility provided by the Semantic Web and Linked Data could only be used by a tiny fraction of the user population. For most people, something that gets them most of the way (with a little trial and error) is good enough. Fire and forget. While the snazzy solution with the sophisticated technology might demo well (making it good TED{{8}} fodder), it’s not going to improve the day-to-day travail of most of the population.

[[8]]TED[[8]]

Then we get solutions like GovDirect. As the website puts it:

GovDirect® facilitates reporting to government agencies such as the Australian Tax Office via a single, secure online channel enabling you to reduce the complexity and cost of meeting your reporting obligations to government.

which makes it, essentially, a Semantic Web solution. Except it’s not, as GovDirect is built on XBRL{{9}} with a cobbled-together taxonomy.

[[9]]eXtensible Business Reporting Language[[9]]

Taxonomy-driven solutions such as GovDirect might not offer the power and sophistication of a Semantic Web solution, but they do get the job done. These taxonomies are also more likely to be ad hoc — codifying a vendor’s solution, or accreted whilst on the job — than the result of some formal, top-down ontology{{10}} development methodology (such as those buried in the Semantic Web and Linked Data).

[[10]]Ontology defined in Wikipedia[[10]]
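A minimal sketch of what such an ad hoc, taxonomy-driven approach looks like in practice (the element and agency names below are invented, not GovDirect’s or XBRL’s): a flat mapping routes each business figure to the report fields that need it, with no ontology or inference in sight.

```python
# Hypothetical sketch of a taxonomy-driven mapping: a flat, ad hoc
# taxonomy (all names invented here) fans one set of business figures
# out to several agencies' report fields. No reasoning engine required.
taxonomy = {
    "gross_wages":   ["tax_office.W1", "stats_bureau.wages_total"],
    "gst_collected": ["tax_office.1A"],
}

def build_reports(figures):
    """Route each business figure to every report field mapped to it."""
    reports = {}
    for name, value in figures.items():
        for target in taxonomy.get(name, []):
            agency, field = target.split(".")
            reports.setdefault(agency, {})[field] = value
    return reports

print(build_reports({"gross_wages": 120000, "gst_collected": 3400}))
```

When a new reporting obligation turns up, you add a line to the mapping; the taxonomy accretes attributes as needed, exactly the bottom-up style the post describes.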

Take Salesforce.com{{11}} as an example. If we were to develop a taxonomy to exchange CRM data, then the most likely source would be other vendors reverse engineering{{12}} whatever Salesforce.com is doing. The driver, after all, is to enable clients to get their data out of Salesforce.com. Or the source might be whatever a government working group publishes, given a government’s dominant role in its geography. By extension, we can also see the end of the formal standardisation efforts of old, as they devolve into the sort of information frameworks represented by XBRL, which accrete attributes as needed.

[[11]]SalesForce.com[[11]]
[[12]]Reverse engineering defined in Wikipedia[[12]]

The general trend we’re seeing is a move away from top-down, tightly defined and structured definitions of data interchange formats, as they’re replaced by bottom-up, looser definitions.

Vacuum flasks: fulfilling a need

As seen on a plaque at Scienceworks in the House Secrets exhibit.

James Dewar invented the vacuum flask in 1892 to keep laboratory gases cold. Twelve years later, Reinhold Burger manufactured the Thermos to keep our picnic drinks hot.

A nice demonstration of the third of Peter Drucker’s seven sources of innovation.

Innovation based on process need.

Or, put another way, James Dewar scratched an itch; though he did play Edison to Reinhold Burger’s Samuel Insull.

Posted via web from PEG @ Posterous

What I like about jet engines

Rolls-Royce{{1}} (the engineering company, not the car manufacturer) is an interesting firm. From near disaster in the 70s, when the company was on the brink of failure, Rolls-Royce has spent the last 40 years reinventing itself. Where it used to sell jet engines, now the company sells hot air out the back of those engines, with clients paying only for the hours an engine is in service. Rolls-Royce is probably one of the cleanest examples of business-technology{{2}} that I’ve come across, with the company picking out the synergies between business and technology to solve customer problems, rather than focusing on trying to align technology delivery with a previously imagined production process to push products at unsuspecting consumers. I like this for a few reasons. Firstly, because it wasn’t a green fields development (like Craig’s List{{3}} et al), and so provides hope for all companies with more than a few years under their belt. And secondly, because the transformation seems to have been the result of many incremental steps as the company felt its way into the future, rather than the result of some grand, strategic plan.

[[1]]Rolls Royce[[1]]
[[2]]Business-Technology defined @ Forrester[[2]]
[[3]]Craig’s list[[3]]

A Rolls-Royce jet engine

I’ve been digging around for a while (years, not months), looking for good business-technology case studies. Examples of organisations which leverage the synergies between business and technology to create new business models which weren’t possible before, rather than simply deploying applications to accelerate some pre-imagined human process. What I’m after is a story that I can use in presentations and the like, and which shows not just what business-technology is, but also contrasts business-technology with the old business and technology alignment game while providing some practical insight into how the new model was created.

For a while I’ve been mulling over the obvious companies in this space, such as Craig’s List or Zappos{{4}}. While interesting, their stories don’t have the impact that they could as they were green fields developments. What I wanted was a company with some heritage, a history, to provide the longitudinal view this needs.

[[4]]Zappos[[4]]

The company I keep coming back to is Rolls-Royce. (The engineering firm, not the car manufacturer). I bumped into a story in The Economist{{5}}, Britain’s lone high-flier{{6}}, which talks about the challenge of manufacturing in Britain. (Which is, unfortunately, behind the pay wall now.) As The Economist pointed out:

A resurgent Rolls-Royce has become the most powerful symbol of British manufacturing. Its success may be hard to replicate, especially in difficult times.

[[5]]The Economist[[5]]
[[6]]Britain’s lone high-flier @ The Economist[[6]]

With its high costs and (relatively) inflexible workforce, running a manufacturing business out of Britain can be something of a challenge, especially with China breathing down your neck. Rolls-Royce’s solution was not to sell engines, but to sell engine hours.

This simple thought (which is strikingly similar to the story in Mesh Collaboration{{7}}) has huge ramifications, pushing the company into new areas of the aviation business. It also created a company heavily dependent on technology, from running real-time telemetry around the globe through to knowledge management. The business model — selling hot air out the back of an engine — doesn’t just use technology to achieve scale; it has technology woven into its very fabric. And, most interestingly, it is the result of tinkering, small incremental changes, rather than being driven by some brilliant transformative idea.

[[7]]Mash-Up Corporations[[7]]

As with all these long-term case studies, the Rolls-Royce story does suffer from applying new ideas to something that happened yesterday. I’m sure that no one in Rolls-Royce was thinking “business-technology” when the company started the journey; nor would they have even heard of the term until recently. However, the story still works for me as, for all its faults, I think there’s still a lot we can learn from it.

The burning platform was in the late 60s and early 70s, when Rolls-Royce was in trouble. The company had 10% market share, rising labour costs, and fierce competition from companies in the U.S. Even worse, these competitors did not have to worry about patents (a hangover from the second world war), and they had a large domestic market and a pipeline of military contracts which put them in a much stronger financial position. Rolls-Royce had to do something radical, or face being worn down by aggressive competitors with more resources behind them.

Interestingly, Rolls-Royce chose to try to be smarter than the competition. Rather than focus on incremental development, the company decided to design a completely new engine. Using carbon composite blades and a radical new engine architecture (three shafts rather than two, for the aeronautical engineers out there), the engine would be a lot more complex to design, build and maintain. It would also be a lot more fuel efficient, suffer less wear and tear, and scale better to different aircraft sizes. This approach allowed Rolls-Royce to step out of the race for incremental improvements to existing designs (designing a slightly better fan blade) and create a significant advantage, one which would take the company’s competitors more than the usual development cycle or two to erase.

Most of the margin in jet engines, however, is in maintenance. Some pundits even estimate that engines are sold at a loss (though the manufacturers claim to make modest margins on all the engines they sell), while maintenance can enjoy a healthy 35%. It’s another case of giving them the razor but selling them the razor blades. But if you give away the razors, there’s always the danger that someone else may make blades to fit your razor. Fat margins and commoditised technology resulted in a thriving service market, with the major engine makers chasing each other’s business, along with a horde of independent servicing firms.

Rolls-Royce’s interesting solution was to integrate the expertise from the two businesses: engine development and servicing. Rather than run them as separate businesses, the company convinced customers to pay a fee for every hour an engine was operational. Rather than selling engines, the company sells hot air out the back of an engine. This provides a better deal for the customers (pay for what you use, rather than face a major capital expense), while providing Rolls-Royce with a stronger hold on its customer base.

Integrating the two business also enabled Rolls-Royce to become better at both. Maintenance data helps the company identify and fix design flaws, driving incremental improvements in fuel efficiency while extending the operating life (and time between major services) tenfold over the last thirty years. It also helps the company predict engine failures, allowing maintenance to be scheduled at the most opportune time for Rolls-Royce, and their customers.

Rolls-Royce leveraged this advantage to become the only one of the three main engine-makers with designs to fit the three newest airliners in the market: the Boeing 787 Dreamliner, the Airbus A380 and the new wide-bodied version of the Airbus A350. Of the world’s 50 leading airlines, 45 use its engines.

Today, an operations centre in Derby assesses, in real time, the performance of 3,500 jet engines, enabling Rolls-Royce to spot issues before they become problems and to schedule just-in-time maintenance. This means less maintenance and more operating hours, fewer breakdowns (and, I expect, happier customers), and the operational data generated is fed back into the design process to help optimise the next generation of engines.
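A toy sketch of the underlying idea (the thresholds and figures are invented; Rolls-Royce’s actual systems are far more sophisticated): compare each engine’s latest reading against its own recent baseline and flag drift early, so that maintenance can be scheduled rather than forced.

```python
# Hypothetical sketch of condition-based maintenance: flag an engine
# when a telemetry reading drifts past its own recent baseline.
# Parameter names, figures and the 5% tolerance are all invented.
from statistics import mean

def flag_for_service(history, latest, tolerance=0.05):
    """Return True when the latest reading drifts beyond tolerance."""
    baseline = mean(history)
    return abs(latest - baseline) / baseline > tolerance

# Turbine gas temperatures (degrees C) from one engine's recent flights.
readings = [905, 902, 908, 904]
print(flag_for_service(readings, 951))  # ~5% drift: schedule a check
print(flag_for_service(readings, 910))  # within tolerance: keep flying
```

The point isn’t the arithmetic; it’s that the same telemetry stream drives both the maintenance schedule and, over time, the design of the next engine.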

This photograph is reproduced with the permission of Rolls-Royce plc, copyright © Rolls-Royce plc 2010
Rolls-Royce civil aviation operations in Derby

This service-based model creates a significant barrier for any competitor who wants to steal Rolls-Royce’s business. Even if you could clone Rolls-Royce’s technology infrastructure (hard, but not impossible), you would still need to recreate all the tacit operational knowledge the company has captured over the years. The only real option is to recreate that knowledge yourself, which will take you a similar amount of time as it took Rolls-Royce, while Rolls-Royce continues to forge ahead. Even poaching key personnel from Rolls-Royce would only provide a modest boost to your efforts. As I’ve mentioned before{{8}}, this approach has the potential to create a sustainable competitive advantage.

[[8]]One of the only two sources of sustainable competitive advantage available to us today @ PEG[[8]]

While other companies have adopted some aspects of Rolls-Royce’s model (including the Joint Strike Fighter{{9}}, which is being procured under a similar model), Rolls-Royce continues to lead the pack. More than half of its existing engines in service are covered by such contracts, as are roughly 80% of those it is now selling.

[[9]]The Joint Strike Fighter[[9]]

I think this makes Rolls-Royce a brilliant example of business-technology in action. Rolls-Royce found, by trial and error, a new model that wove technology and business together to create an “outside in” business model, focused on what customers want to buy, rather than a more traditional “inside out” model based on pushing products that the company wants to sell out into the market. You could even say that it’s an “in the market” model rather than a “go to market” model. And the company did this with a significant legacy, rather than as a green fields effort.

In some industries and companies this type of “outside in” approach was possible before the advent of the latest generation of web technology, particularly if the offering was high value and the company already had a network in place (as with Rolls-Royce). For most companies it is only now becoming possible, with business-technology and some of the current trends, such as cloud computing, erasing many of the technology barriers.

The challenge is to figure out the “in the market” model you need, and then shift management attitude. Given constant change in the market, this means an evolutionary approach, rather than a revolutionary (transformative) one.