Monthly Archives: February 2010

Consulting doesn’t work any more. We need to reinvent it.

What does it mean to be in consulting these days? The consulting model that has evolved over the last 30 to 50 years seems to be breaking down. The internet and social media have shifted the way business operates, and the consulting industry has failed to move with it. The old tricks the industry has relied on — the “did it, done it” stories and the assumption that I know something you don’t — no longer apply. Margins are under pressure and revenue is on the way down (though outsourcing is propping up some firms) as clients find smarter ways to solve problems, or decide that they can simply do without. The knowledge and resources the consulting industry has been selling are no longer scarce, and we need to sell something else. Rather than seeing this as a problem, I see it as a huge opportunity: an opportunity to establish a more collaborative and productive relationship founded on shared, long-term success. Sell outcomes, not scarcity and rationing.

I’m a consultant, and have been for some time, working in both small and large consultancies. It seems to me that the traditional relationship between consultancy and client is breaking down. This appears to be true for both flavours of consulting: business and technology. And by consulting I mean everything from the large tier-one firms down to the brave individuals carving a path for themselves.

Business is down, and the magic number seems to be roughly a 17% decline year-on-year. One possible cause is that the lifeblood of the industry — the large multi-year transformation project — has lost a lot of its attraction in recent years. Dig around in the financials of the large publicly listed consultancies and vendors and you’ll find that revenue from IT estate renewal and transformation (application licenses, application configuration and installation services, change management, and even advisory) is sagging by roughly 17% around the globe.

SABER @ American Airlines

Large transformation projects have lost much of their attraction. While IBM successfully delivered SABER back in the 60s, providing a heart transplant for American Airlines’ ticketing processes, more recent stabs at similarly sized projects have met with less than stellar results. Many more projects are quietly swept under the carpet, declared a success so that everyone involved can move on to something else.

The consulting model is a simple one. Consultants work on projects, and the projects translate into billable hours. Consultancies strive to minimise overheads (working on customer premises and minimising support staff), while passing incidental costs through to clients in the form of expenses. Billable hours drive revenue, with lower grades providing higher margins.

This creates a couple of interesting, and predictable, behaviours. First, productivity-enhancing tooling is frowned on: it’s better to deploy a graduate with a spreadsheet than a more senior consultant with effective tooling. Second, a small number of large transactions is preferred to a large number of small ones, as it requires less overhead (sales and back-office infrastructure).

All this drives consultancies to create large, transformational projects. Advisory projects end up developing multi-year (or even multi-decade) roadmaps to consolidate, align and optimise the business. Technology projects deliver large, multi-million-dollar IT assets into the IT estate. These large business and IT transformation projects provide the growth, revenue and margin targets required to beat the market.

This desire for large projects is packaged up in what is commonly called “best practice”. The consulting industry focuses on “did it, done it” stories: standard, repeatable projects that minimise risk. The sales pitch is straightforward: “Do you want this thing we did over here?” This might be the development of a global sourcing strategy, an ERP implementation, …

Spencer Tracy & Katharine Hepburn in The Desk Set

This approach has worked for some time, with consultancy and client more-or-less aligned. Back when IBM developed SABER you were forced to build solutions from the tin up, and even small business solutions required significant effort to deliver. In 1957, when Spencer Tracy played a productivity expert in The Desk Set, new IT solutions required very specific skill sets to develop and deploy. These skills were in short supply, making it hard for an organisation to create and maintain a critical mass of in-house expertise.

Rather than attempt to build an internal capability — forcing the organisation on a long learning journey, a journey involving making mistakes to acquire tacit knowledge — a more pragmatic approach is to rent the capability. Using a consultancy provides access to skills and knowledge you can’t get elsewhere, usually packaged up as a formal methodology. It’s a risk management exercise: you get a consultancy to deliver a solution or develop a strategy because they did one just last week and know where all the potholes are. If we were cheeky, we would summarise the consultancies’ value proposition as: I know something you don’t!

It’s a model defined by scarcity.

A lot has changed in the last few years; business moves a lot faster and a new generation of technology is starting to take hold. The business and technology environment is changing so fast that we’re struggling to keep up. Technology and business have become so interwoven that we now talk of Business-Technology, and a lot of that scarce knowledge is now easily obtainable.

The Diverging Pulse Rates of Business and Technology

The scarce tacit knowledge we used to require is now bundled up in methodologies; methodologies which are trainable, learnable, and scalable. LEAN and Six Sigma are good examples, starting as more black art than science, maturing into respected methodologies, and arriving at today, where certification is widely available and each methodology has a vibrant community of practitioners spread across both clients and consultancies. The growth of MBA programmes also ensures that this knowledge is spread far and wide.

Technology has followed a similar path, with the detailed knowledge required to develop distributed solutions incrementally reified in methodologies and frameworks. When I started my career, XDR and sockets were the networking technologies of the day, and teams often grew to close to one hundred engineers. Today the same solution, developed on a modern platform (Java, Ruby, Python …), has a team in the single digits and takes a fraction of the time. Tacit knowledge has been reified in software platforms and frameworks. SaaS (Software as a Service) takes this to a whole new level by enabling you to avoid software development entirely.

The “did it, done it” stories that consulting has thrived on in the past are being chewed up and spat out by the business schools, open source, and the platform and SaaS vendors. A casual survey of the market usually finds that SaaS-based solutions require 10% of the installation effort of a traditional on-premises solution. (Yes, that’s 90% less effort.) Less effort means less revenue for the consultancies. It also reduces the need for advisory services, as provisioning a SaaS solution with the corporate credit card should not require a $200,000 project to build a cost-benefit analysis. And gone are the days when you could simply read the latest magazines and articles from the business schools, spouting what you’d read back to a client. Many clients have been on the consulting side of the fence, share a similar business school education, and read all the same articles.

I know and you don’t! no longer works. The world has moved on and the consulting industry needs to adapt. The knowledge and resources the industry has been selling are no longer scarce, and we need to sell something else. I see this as a huge opportunity: an opportunity to establish a more collaborative and productive relationship founded on shared, long-term success. As Jeff Jarvis has said: stop selling scarcity, sell outcomes.

Updated: A good friend has pointed out that one area of consulting — one which we might call applied business consulting — resists the trend toward commoditisation. This is the old-school task of sitting with clients one-on-one, working to understand their enterprise and what makes it special, and then using this understanding to find the next area or opportunity that the enterprise is uniquely qualified to exploit. There are no junior consultants in this area, only old greybeards who are too expensive to stay in their old jobs, but who are still highly useful to the industry. Unfortunately this model doesn’t scale, forcing most (if not all) consultancies into a more operational knowledge transfer role (think Six Sigma and LEAN) in an attempt to improve revenue and GOP.

Updated: Keith Coleman (global head of public sector at Capgemini Consulting) makes a similar case with Time to sell results, not just advice (via @rpetal27).

Updated: I’ve responded to my own post, tweaking my consulting page to capture my take on what a consultant needs to do in this day and age.

What does it take to be an expert?

How do we measure a guru’s worth? In this case, I’m specifically thinking about social media / communications gurus. Do you need 10,000 followers watching every tweet about the incremental progress of your haircut? Or is it enough to squeeze out one gem a day which is then shared across multiple social networks?

This is a variation of the old (in internet terms) chestnut: would you trust a social media expert who doesn’t use social media? It’s hard to see this as a black-and-white issue though, as I doubt there’s a communications professional out there who doesn’t use social media in some way. The real distinction is between someone who gorges at the social media trough and someone who picks and chooses their involvement.

I’m reminded of a comment from rec.food.cooking back in the early 90s:

Never trust a skinny chef.

The assumption was that a chef who didn’t enjoy food enough to over-indulge couldn’t be a good chef. This is just wrong, confusing quantity with quality. Just like the critic in Ratatouille, some chefs are more selective about what they consume, but that doesn’t mean they have less passion or ability than their more indulgent peers.


Anton Ego: I don’t LIKE food. I LOVE it. If I don’t love it, I don’t SWALLOW.

So the real question is: do we measure social media professionals by the volume of their engagement with the medium, or by the quality of their engagement? It’s McDonald’s (an international chain) vs. Vue de Monde (50 seats in Melbourne). Do you need to tweet 50 times a day to be considered a guru? Or will one well-placed tweet a day qualify you?

My preference is for the advice of someone who demonstrates knowledge and insight into the medium and an understanding of the problem I’m trying to solve, and I’ll measure that insight by what they publish. The volume of their engagement is a secondary concern.

Renovation

I’ve done a bit of spring cleaning of the blog on a quiet Sunday afternoon (plus the kids are monopolising the Wii, so I can’t play New Super Mario Bros).

There’s more to do, but the big change is to gather some of the article threads into categories. A couple of posts seem to have taken on a life of their own, and the resultant ping-pong between this blog and others has generated some interesting narratives on a couple of topics. Rather than leave them hidden in the threads, I’ve created a Focus category, and started to collect each thread in a sub-category.

So far:

  • The Value of Information. Starting with a simple observation that when we get information has as much impact as what we get, this thread generated some nice thoughts on how we might use information to create a more dynamic enterprise.
  • The Art of Random. Triggered by an invitation to present at InnoFuture — which unfortunately didn’t eventuate — I used the content to create a series of posts (the outline for the preso ran to around six pages, so it would have been too much for one post). It covers the idea that innovation seems random simply because you are not aware of the intervening steps from interesting problem through to novel solution.

There’s also a placeholder for Knowledge Worker of the Future, but more on that later.

Oh — and my favourite flying car is now in the header. Next I need to sort out the CSS colours to match.

One of the only two sources of sustainable competitive advantage available to us today

I stumbled onto a somewhat interesting post over at HBR, which discusses Garry Kasparov’s ideas in the business world. This is actually quite a relevant pairing, though an old one in the tradition of human-computer augmentation.

The idea is a simple one, which takes far fewer words to express than the article took:

Use information technology to augment users, rather than replace them.

IT is good at a lot of tasks, and less good at others. People, too, have their strengths and weaknesses. What’s interesting is that computers are weak where people are strong, and vice-versa. Computers excel as appliers of algorithms, with huge memories and an attention to detail; people are powerful, creative problem solvers who have trouble thinking of four things at once and like coffee breaks. Why not pair the two, and get the best of both worlds?

Rather than replace the users, why don’t we use technology to automate the easy (for technology) 80% of what they do? (This is something I’ve written about before.) In the chess example, the easy 80% is providing the user with a chess computer for the commoditised solution-space search, allowing them to focus on strategy. The performance improvement this approach provides can create a significant competitive advantage. As Garry Kasparov found, even a weak player with a chess computer can be impossible to defeat, by human or computer.
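To make the pattern concrete, here’s a minimal sketch of the augmentation loop in Python. It’s purely illustrative: the refund rule, the threshold, and the function names are invented for the example, not drawn from any real system.

```python
# Illustrative sketch: automate the easy (for technology) cases with
# reified rules, and escalate everything else to a person.

def auto_approve_small_refunds(request):
    # A routine case we understand well enough to encode as a rule.
    # (Hypothetical rule and threshold, for illustration only.)
    if request["type"] == "refund" and request["amount"] < 100:
        return "approved automatically"
    return None  # this rule doesn't apply; try the next one

def escalate_to_human(request):
    # The hard cases stay with people, who bring judgement and strategy.
    return f"queued for human review: {request}"

def handle(request, rules):
    """Apply each reified rule in turn; fall back to a person."""
    for rule in rules:
        decision = rule(request)
        if decision is not None:
            return decision
    return escalate_to_human(request)

rules = [auto_approve_small_refunds]
print(handle({"type": "refund", "amount": 40}, rules))    # handled by a rule
print(handle({"type": "refund", "amount": 5000}, rules))  # escalated
```

Each time another slice of the routine work is understood well enough to encode, it joins the rule list, freeing the user to spend more time on the hard 20%; that loop is what the rest of this post describes.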

This then provides us with two options:

  1. Take the improvement as a saving by reducing head count.
  2. Reinvest the improvement by providing our users with more time to focus on the hard 20%.

(I must admit, I much prefer the latter.)

If we continue to focus on automating the next easy 80%, we’ve created a platform and process for continual business optimisation. (Improvements in search efficiency would simply be harvested when appropriate to maintain parity.) Interestingly, this is one of only two sources of sustainable competitive advantage available to us today.

The competitive advantage in this approach rests with the users, in the commonplaces, the strategies, they use to solve problems. By reifying the easy 80% of these strategies in software (processes and rules), we move some of that competitive advantage into the organisation, where it can be leveraged by other users. By continually attacking the easy 80% of what the users are doing, we are continually improving our competitive position. We could even sell our IT platform (but not the reified problem-solving strategies) to our competitors — commoditising the platform to realise a cost saving — without endangering our competitive position, as they would need to go through the same improvement and learning process that we did, while we continue to race ahead.

Now that’s scary: as long as we keep improving our process, our competitors will never be able to catch us.


Innovation [2010-02-17]

Another week and another collection of interesting ideas from around the internet.

As always, thoughts and/or comments are greatly appreciated.

  • An innovation report card [The Conference Board of Canada]
    Countries with the highest overall scores not only spend more on science and technology but also have policies that drive innovation supply and demand.
  • Innovation: what’s your score? [McKinsey & Company: What Matters]
    Can companies measure the impact of their innovation activities? Can they benchmark their performance on innovation against that of their peers? Can the long-term effects of innovation strategies be tracked systematically? Yes, yes, and yes. In fact, not only can companies objectively assess innovation; we believe they must. Only then will they know how to select the right strategies and execute them well.
  • The Original Futurama: The Legacy of the 1939 World’s Fair [Popular Mechanics]
    Seventy years after the closing of the 1939 New York World’s Fair, The Daily Show writer Elliott Kalan looks back at its past vision of the World of Tomorrow.
  • Why private companies are more innovative [BusinessWeek: NEXT]
    Do privately held companies have an edge when it comes to long-term innovation? At least some of them seem to. Recently, Al Gore—former Vice-President and Senator and now Nobel Prize-winning environmental evangelist—declared S.C. Johnson & Son one of the most sustainable companies in the world.

What is the role of government in a Web 2.0 world?

What will be the role of government in a post-Web 2.0 world? I doubt it’s what a lot of us predict, given society’s poor track record in predicting its own future.

One thing I am reasonably sure of though, is that this future won’t represent the open source nirvana that some pundits hope for. When I’ve ruminated in the past about the changing role of government, I’ve pointed out that attempting to create the future by dictate is definitely not the right approach. As I said then:

You don’t create peace by starting a war, and nor do you create open and collaborative government through top down directives. We can do better.

There was an excellent article by Nat Torkington, Rethinking open data, posted over at O’Reilly Radar, which shows this in action. As it points out, the U.S. Open Government Directive has prompted datasets of questionable value to be added to data.gov, while many of the applications are developed because they are easy to build, rather than because they provide any tangible benefit. Many of the large infrastructure projects commissioned in the name of open data suffered the same fate as large, unjustified infrastructure projects in private enterprise: they’re hard for the layman to understand, they have scant impact on solving the problems society seems plagued with, and they’re overly complex to deliver and use due to technological and political purism.

A more productive approach is to focus on solving problems that we, the populace, actually care about. In Australia this might involve responding to the bushfire season. California has a similar problem. The recent disaster in Haiti was another significant call to action, and it was great to see the success of Web 2.0 in Haiti (New Scientist had an excellent article).

As Nat Torkington says:

the best way to convince them to open data is to show an open data project that’s useful to real people.

Which makes me think: government is a tool for us to work together, not the enemy to subdue. Why don’t we move government on from service provider of last resort, which is the role it seems to play today?

Haiti showed us that some degree of centralisation is required to make these efforts work efficiently. A logical role for government going forward would be something like a market maker: connecting people who need services with the organisations providing them, and working to ensure that the market remains liquid. Government becomes the trusted party that ensures that there are enough service providers to meet demand, possibly even bundling services to provide solutions to life’s more complex problems.

We’ve had the public debate on whether or not government should own assets (bridges, power utilities etc.), and the answer was generally not. Government provision of services is well down a similar road. This frees up dedicated and hard working public servants (case workers, forestry staff, policy wonks …) to focus on the harder problem of determining what services should be provided.

Which brings me back to my original point: why are we trying to drive government, and society in general, toward a particular imagined future of our choosing (one involving Open Government Directives, and complicated and expensive RDF infrastructure projects)? We can use events like the bushfires and Haiti to form a new working relationship. Let’s pick hard but tractable problems and work together to find solutions. As Nat (again) points out, there’s a lot of data in government that public servants are eager to share, if we just give them a reason. And if our efforts deliver tangible benefits, then everyone will want to come along for the ride.

Updated: The reports are in: data.gov has quality issues. I’ve updated the text with the following references.

Updated: More news on data.gov’s limitations, highlighting the problems with a “push” model of open government.

The benefits of SaaS (beyond low cost)

I’ve already written about why I think private clouds can be a good idea. Similar arguments can be made for SaaS, and then some. A friend and I did the email ping-pong thing and ended up with a (shortish) list of reasons to go with a SaaS solution over a traditional on-premises solution.

  • OPEX rather than CAPEX. The CAPEX gulp is minimised, and the ongoing costs are tied to your own operational costs (head count, etc.).
  • Faster provisioning. SaaS can be up to 90% faster to deploy than on-premises solutions. (Weeks/months rather than months/years.)
  • No more upgrades. You’re always on the latest version, and new features are rolled out organically rather than every few years as part of a change management process.
  • More focused vendor and community support. As there is only a single version in play, support efforts from the vendor and user community are focused on the version that you’re using. This also avoids the problem of getting left behind on a stale and unsupported platform (been there, done that, and have the scars to prove it).
  • SaaS provides a platform that scales organically with your organisation. You’re not required to invest in additional hardware, software, and provisioning processes, letting your business focus on the business.
  • Reduced IT involvement. IT resources can focus on specific business problems rather than the care and feeding of the system.
  • Try before you buy. Instead of a traditional big license gulp at risk, sign up for a handful of SaaS seats for a few weeks and try it out. (From @shermo1.)

Any more?


Managing personalisation is more important than managing change

Death, taxes, and now, change, are the eternal verities. As I said in another post:

The pace of change has accelerated to the point that everyone’s challenge, from Pre-Boomers and Baby Boomers through Generation Y to Generation Z, is how to cope with significant change over the next ten years. If we are, as some predict, moving to an innovation economy, then it is the ability to adapt that is most important. Those betting their organisation on a generational change will be sadly disappointed as no generation has a monopoly on coping with change.

While the youngest generation (whichever that is at a particular point in time) might have the advantage of coming unencumbered to the new ways of working, every generation has an unfortunate habit of treating what they learnt in their formative years (up to roughly age 24) as dogma once they hit their late 20s. Social research has shown that most people’s interest in novel ideas or experiences peaks in their mid-to-late 20s. (Tell me your favourite band and cuisine, and I’ll tell you what decade you grew up in.) Or, put another way, the 24-to-28-year-olds might have the advantage in a rapidly changing world, but once you grow out the top of that age bracket you’ll find yourself at a disadvantage.

However, as with all gross generalisations, the exceptions are more interesting than the rule; in this case the commonalities between groups are usually stronger than the differences between them. Research like Forrester’s Groundswell shows that it’s more productive to think in terms of personality types.

I prefer to focus on getting stuff done, and ensuring that each and every stakeholder has the tools and support they need to get their job done. This is not a static thing either, something we do once for each stakeholder, as someone’s needs and preferences can change month-by-month, week-by-week, day-by-day or even minute-by-minute.

And this is probably the most important mega-trend emerging at the moment: the drive to continually personalise communication/products/services/tools for each and every individual, rather than trying to divide people into coarse-grained, and increasingly unproductive, demographic groups with predefined needs. If you’re managing change, then you’re still thinking in terms of a static work/home environment that needs to be transformed (however regularly). If you’re managing personalisation, then you’re focused on creating a continually optimised environment for all your stakeholders, ensuring that they have the information and tools they need at that moment. Change isn’t an enemy to be managed; it’s a tool to help you achieve, and sustain, peak performance.


Finding the new white spaces

There’s quite a bit of noise in the blogosphere about the coming entrepreneurship boom, generating yet another pointless debate about the distinction between generations. What will really drive this new boom is the ability to find new white spaces, not access to resources or connections (people forget that Sergey & Larry had both a good idea and connections into the VC network in the Bay Area).

Twitpic is a case in point. Started on a spare server to scratch an itch, Twitpic is a poster child for how to build something new with little or no resources or connections.

  • In terms of traffic, Alexa says Twitpic is a top 100 site.
  • In 2009, the site did over $1.5 million in ad sales.
  • For every million in sales, the company keeps $700,000.
  • The site has about 6.5 million registered users.
  • Noah, the founder, was recently offered 8 figures for the business.
  • There are only 4 people working on the site (including Noah’s parents).

The common point with services like Twitpic and Craigslist is that this new generation of businesses is creating new white spaces, and the barriers to attacking these white spaces are now very, very low.


Innovation [2010-02-01]

Another week and another collection of interesting ideas from around the internet.

As always, thoughts and/or comments are greatly appreciated.