As Andy Mulholland pointed out in a recent post, all too often we manage our businesses by looking out the rear window to see where we’ve been, rather than looking forward to see where we’re going. How we use information to drive informed business decisions has a significant impact on our competitiveness.
I’ve made the point previously (which Andy built on) that not all information is of equal value. Success in today’s rapidly changing and uncertain business environment rests on our ability to take timely, appropriate and decisive action in response to new insights. Neither execution speed nor organizational intelligence is enough on its own: we need an intimate connection to the environment we operate in. Simply collecting more historical data will not solve the problem. If we want to look out the front window and see where we’re going, then we need to consider external market information, and not just internal historical information, or predictions derived from it.
A little while ago I wrote about the value of information. My main point was that we tend to think of most information in one of two modes—either transactionally, with the information part of current business operations, or historically, when the information represents past business performance—whereas it’s more productive to think of an information age continuum.
Andy Mulholland posted an interesting build on this idea on the Capgemini CTO blog, adding the idea that information from our external environment provides mixed and weak signals, while internal, historical information provides focused and strong signals.
Andy’s major point was that traditional approaches to Business Intelligence (BI) focus on these strong, historical signals, which is much like driving a car by looking out the back window. While this works in a (relatively) unchanging environment (if the road was curving right, then keep turning right), it’s less useful in a rapidly changing environment as we won’t see the unexpected speed bump until we hit it. As Andy commented:
Unfortunately stability and lack of change are two elements that are conspicuously lacking in the global markets of today. Added to which, social and technology changes are creating new ideas, waves, and markets – almost overnight in some cases. These are the ‘opportunities’ to achieve ‘stretch targets’, or even to adjust positioning and the current business plan and budget. But the information is difficult to understand and use, as it is comprised of ‘mixed and weak signals’. As an example, we can look at the signals that the rise of the iPod and iTunes sent to the music industry. There were definite signals in the market that change was occurring, but the BI of the music industry was monitoring its sales of CDs and didn’t react until these were impacted, by which point it was probably too late. Too late, meaning the market had chosen to change and the new arrival had the strength to fight off the late actions of the previously established players.
We’ve become quite sophisticated at looking out the back window to manage moving forward. A whole class of enterprise applications, Enterprise Performance Management (EPM), has been created to harvest and analyze this data, aligning it with enterprise strategies and targets. With our own quants, we can create sophisticated models of our business, market, competitors and clients to predict where they’ll go next.
Despite EPM’s impressive theories and product sheets, it cannot, on its own, help us leverage these new market opportunities. These tools simply cannot predict where the speed bumps in the market will be, no matter how sophisticated they are.
There’s a simple thought experiment economists use to show the inherent limitations in using mathematical models to simulate the market. (A topical subject given the recent global financial crisis.) Imagine, for a moment, that you have a perfect model of the market; you can predict when and where the market will move with startling accuracy. However, as Sun likes to point out, statistically, the smartest people in your field do not work for your company; the general market simply has far more resources than your company does. If you have a perfect model, then you must assume that your competitors also have a perfect model. Assuming you’ll both use these models as triggers for action, you’ll both act earlier, and possibly in the same way, changing the state of the market. The fact that you’ve invented a tool that predicts the speed bumps causes the speed bumps to move. Scary!
Enterprise Performance Management is firmly in the grasp of the law of diminishing returns. Once you have the critical mass of data required to create a reasonable prediction, collecting additional data will have a negligible impact on the quality of this prediction. The harder your quants work, the more sophisticated your models, the larger the volume of data you collect and trawl, the lower the incremental impact will be on your business.
Andy’s point is a big one. It’s not possible to accurately predict future market disruptions from historical data alone. Real insight depends on data sourced from outside the organization, not inside. This is not to diminish the important role BI and EPM play in modern business management, but to highlight that we need to look outside the organization if we are to deliver the next step change in performance.
Zara, a fashion retailer, is an interesting example of this. Rather than attempt to predict or create demand on a seasonal fashion cycle, and deliver product appropriately (an internally driven approach), Zara tracks customer preferences and trends as they happen in the stores and tries to deliver an appropriate design as rapidly as possible (an externally driven approach). This approach has made Zara the most profitable arm of Inditex, a holding company of eight retail brands, and one of the biggest success stories in Spanish business. You could say that Quants are out, and Blink is in.
At this point we can return to my original goal: creating a simple graphic that captures and communicates what drives the value of information. Building on both my own and Andy’s ideas we can create a new chart. This chart needs to capture how the value of information is affected by age, as well as the impact of externally vs. internally sourced information. Using these two factors as dimensions, we can create a heat map capturing information value, as shown below.
Vertically we have the divide between inside and outside: from information created internally by our processes; through information at the surface of our organization, sourced from current customers and partners; to information sourced from the general market and environment outside the organization. Horizontally we have information age, from information we obtain proactively (we think that a customer might want a product), through reactively (the customer has indicated that they want a product), to historical (we sold a product to a customer). Highest value, in the top right corner, represents the external market disruption that we can tap into. Lowest value (though still important) represents internal transactional processes.
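The two dimensions above can be turned into a toy scoring function. This is purely an illustrative sketch of the heat map’s shape (value rising as information becomes more external and more forward-looking); the category names and weights are my own invention, not taken from the chart.

```python
# Illustrative sketch only: the weights are invented to mirror the heat
# map's shape, with value rising toward external, proactive information.
SOURCE_SCORE = {"internal": 0, "boundary": 1, "external": 2}  # vertical axis
AGE_SCORE = {"historical": 0, "reactive": 1, "proactive": 2}  # horizontal axis

def information_value(source: str, age: str) -> int:
    """Toy value score for one (source, age) cell of the heat map."""
    return SOURCE_SCORE[source] + AGE_SCORE[age]

# External, proactive signals score highest; internal, historical lowest.
assert information_value("external", "proactive") > information_value("internal", "historical")
```

The additive scoring is the simplest choice that reproduces the corner-to-corner gradient; a real chart would weight the dimensions unevenly.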
As an acid test, I’ve plotted some of the case studies mentioned in the conversation so far on a copy of this diagram.
- The maintenance story I used in my original post. Internal, historical data lets us do predictive maintenance on equipment, while external data enables us to maintain just before (detected) failure. Note: this also applies to tasks like vegetation management (trimming trees to avoid power lines), as real-time data can be used to determine where vegetation is a problem, rather than simply eyeballing the entire power network.
- The Walkman and iPod examples from Andy’s follow-up post. Check out Snake Coffee for a discussion on how information drove the evolution of the Walkman.
- The Walmart Telxon story, using floor staff to capture word of mouth sales.
- The example from my follow-up (of Andy’s follow-up), of Albert Heijn (a Dutch Supermarket group) lifting the pricing of ice cream and certain drinks when the temperature goes above 25° C.
- Netflix vs. (traditional) Blockbuster (via Nigel Walsh in the comments), where Netflix helps you maintain a list of films you would like to see, rather than a more traditional brick-and-mortar store which reacts to your desire to see a film.
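The Albert Heijn case in the list above is the simplest to express as a rule: an external signal (the temperature) triggers a pricing change. A minimal sketch, where only the 25°C trigger comes from the example; the 20% uplift is an invented illustration.

```python
def ice_cream_price(base_price: float, temperature_c: float) -> float:
    """Adjust price on an external signal (temperature).

    Only the 25 degree C trigger comes from the Albert Heijn example;
    the 20% uplift is an invented illustration.
    """
    if temperature_c > 25:
        return round(base_price * 1.20, 2)
    return base_price

# A hot day lifts the price; a mild day leaves it alone.
assert ice_cream_price(2.00, 30) == 2.40
assert ice_cream_price(2.00, 20) == 2.00
```

The point is not the arithmetic but the wiring: the pricing decision consumes a live external feed rather than last season’s sales history.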
Send me any examples that you know of (or think of) and I’ll add them to the acid test chart.
An interesting exercise left to the reader is to map Peter Drucker’s Seven Drivers for change onto the same figure.
Update: A discussion with a different take on the value of information is happening over at the Information Architects.
Update: The latest instalment in this thread is Working from the outside in.
Update: MIT Sloan Management Review weighs in with an interesting article on How to make sense of weak signals.
This is turning into an excellent example of the whole 'new world' in the way that social networks allow people to share and develop ideas rapidly, and I am really enjoying this with Peter and others.
Let me try and move this forward on the chart above with my as-yet-unposted thoughts on moving from the 20/80 principle, where 20% of your activities could make you 80% of your revenue, and instead suggest that we need to focus on the 20% of opportunities that will yield 80% of the highest margins. The shift here is from stable products to short-term execution of intellectual property against market opportunity as the way to make money.
Look again at the Dutch ice cream example – the old stable world says sales and revenues go up when the temperature goes up, but in our unstable world fixed pricing tends to mean low margins to stay competitive against the choice people now have. But when temperatures go up, the opportunistic factor of the market and buyers increases, and the value of an ice cream increases with it.
Buyers and sellers win! How do you win as a buyer at a higher price? Because if we focus on the old game, the risk/reward of stocking and selling ice cream becomes unattractive, so the buyer can't get ice cream to hand when they want it; whereas in the new game the higher margin makes it attractive to sell ice cream, so the opportunity to benefit reaches both buyer and seller.
Peter, a great development of the idea. Age of information is important; as Andy pointed out, stretch it to five years and it becomes irrelevant again. There are so many external factors – the economy, etc. – that just twist it.
The other thought I had on this is the consistency of the measure. All too often we move from measuring one thing to another, and as a result lose the ability to accurately compare results over time. With internal insight this is relatively straightforward, but adding the more valuable external data may be more difficult. After all, no two days are the same (thankfully!). That said, the world changes and we often need to measure something different.
I'm also not sure I agree with the iPod example. Maybe I have the wrong end of the stick, but no one could predict that the way we listen to music would change until after the event. There were indeed signs that a new way or technology was coming that *could* revolutionise things, but no one knew for sure. There are/were many competing formats/options – remember Betamax, which was going to change the world. Only the historic info on declining CD sales in this case would give sufficient insight.
In your request for examples – the same could be said for the evolution from VHS >>> DVD >>> Blu-ray >>> Online…. Use Blockbuster or Netflix as an example. Blockbuster is your traditional video rental store, now changing to meet customer requirements in its delivery/approach. Netflix led in DVD rental delivery (and great white-label solutions) – select online but still have the actual disc delivered – through to where we are now, with the whole movie delivered online. I saw YouTube also announced today that movies may be part of their future – all of this made possible, of course, not just by the internet, but by the availability of a better, faster infrastructure to support higher speeds.
Another would be the Amazon recommendation engine. This to me offers a huge opportunity around internal & external data. My thought process: I log in and buy a book for myself and then one for my mum. It then presents me with “people who bought this also bought….” I may not want another Mills and Boon (sorry mum) book – that's internal information we can analyse and make recommendations on. However, taking this further and adding external data such as people profiles – now starting to tread on the toes of the over-discussed social graph, in this case “people like me” – and combining it with the internally analysed “people who bought x book”, you start to get “people like you also bought xyz”, which in my view is much more likely to be successful. The value here could be huge. Maybe they do this already…
Finally, most of this is product related – what about services? Say we buy a hotel room – how many people now rely on other people's reviews prior to booking – enter TripAdvisor data.
Ultimately the value of information is one thing, it’s what and how quickly we can and do act on it that matters.
Yep–it is a good example of distributed collaboration. And I think the collective result is becoming quite a powerful tool 🙂
I've been thinking that there are a few things that modulate information value, and which we haven't managed to factor in yet. The most obvious is the ability to action the information. Knowing a customer has a problem isn't worth much if you can't do much about it. Another factor might be the number of distinct threads of data involved. Is creating insight from two threads worth more or less than from four? (I'm currently assuming more, as I expect it would be rarer/harder, but then I haven't finished thinking it through.) I've been playing with diagrams on the tram, but haven't made much progress to date.
r.
PEG
I think this becomes a different discussion then – if we start to compare more data feeds vs. faster reaction to the initial one or two feeds. Logically more data should be better; however, how long on the horizontal axis do we give it before doing nothing becomes an action in itself? What benefits can be attained as a result of first-mover advantage, or listening to the customer (as in the Zara story) and making decisions quickly…
Consistency is an interesting idea. I've been thinking that half the battle will be how to select what data to observe, which is sort of the same thing. You don't need a lot of data, but you do need the right data. If you focus on volume, then the good data will be swamped by the noise.
Tying the iPod / Walkman example to Peter Drucker's drivers for change should make it a clearer story. Check out http://snakecoffee.wordpress.com/2006/04/30/peter-druckers-seven-sources-of-innovation/, which does this nicely for the iPod.
I like the examples. Especially the services angle. (I wasn't ignoring services–I just hadn't thought of factoring them in.) I'll factor them into an update of the final chart in this post.
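Nigel's “people like you also bought” idea could be sketched as a toy filter over co-purchase data – a minimal illustration only, with all names, data, and the scoring invented (this is not Amazon's actual approach):

```python
from collections import Counter

# Invented toy data: internal co-purchase history plus an external
# "similar profile" set standing in for the social-graph signal.
purchases = {
    "alice": {"book_a", "book_b"},
    "bob":   {"book_a", "book_c"},
    "carol": {"book_a", "book_c"},
}
similar_to_me = {"bob", "carol"}  # external data: "people like me"

def recommend(me: str) -> list[str]:
    """Rank items bought by similar profiles, excluding what I already own."""
    counts = Counter()
    for person in similar_to_me:
        counts.update(purchases[person] - purchases[me])
    return [item for item, _ in counts.most_common()]

# recommend("alice") yields ["book_c"]: both similar buyers own it,
# and alice's own books are filtered out.
```

The internal data supplies the co-purchase counts; the external data narrows *whose* purchases count, which is the whole “people like me” refinement.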
And finally, I completely agree with your last point. If you can't action the information, then what's the point? I think I was replying to Andy's post along these lines when you were posting 🙂
r.
PEG
Oops… this was meant to be a reply to Andy's comment 🙁
The feeds don't need to be distributed in time. It's really a question of our ability to make connections between bits of data and manufacture insight (i.e. knowledge synthesis). Given a collection of disparate data points, which ones matter the most? One challenge when soliciting customer feedback is knowing which feedback to take on board, and which to ignore.
“Should I wait, or should I go” is another interesting question 🙂 I see it as a separate one though, and one I hadn't thought of until you mentioned it.
Peter, I enjoyed your perspective on “Inside vs Outside” and the debate surrounding this discussion. Improving information outside of the organisation to take advantage of trends and look forward should be an important strategy of any organisation.
One of the challenges is identifying where you are at in relation to “inside” and “outside” information management. Are you making the best use of inside information? Are there quick wins that can be made by improving internal processes, or should this be abandoned in favour of a focus on external triggers to take advantage of opportunities? Maybe the answer is to consolidate internal initiatives in order to focus externally.
The importance of integrating and sensibly using external information is undeniable, and this effort needs to be juggled with concurrent activities relating to “inside” improvement … or you will fail when the opportunities come. Not every organisation feels they have an “A+” on inside information management.
It's an interesting question you raise: which information matters? How do we find those few factors which will drive the best decision/analytic from the thousands (if not millions) of factors available to us?
I think the inside vs. outside question is a bit of a distraction, because what we want is the best information, and it is the integration of this information, the synthesis of insight, that creates value. The Tesco example is a good one: fusing external weather data and internal sales history to predict what will drive sales. How do we determine which is the best information?
And I'm glad you liked the post. It's always nice to see ideas take on a life of their own in a debate 🙂
r.
PEG
The retail chain Zara's success is not simply about accurate business information. All modern chains have reasonable point-of-sale systems, and extracting sales information from these is hardly rocket science.
Zara has chosen to have very flexible production lines close to their major sales outlets, and consequently they are able to respond very rapidly to changes in consumer demand. Many of their competitors have relatively inflexible offshore production lines located a long way from their sales outlets. They are unable to respond rapidly to changes in consumer demand, even when they know what this change is. The key point is that knowing what is happening in the market place is not enough; you have to have the organizational capacity to respond to it.
Zara management proved that for some product lines, optimizing for flexible response is more profitable than cost minimization. Depending on your product mix there's probably room for both models.
Even the most rabid of BI vendors would accept that there are diminishing returns to collecting business information, but few would be comfortable admitting just how quickly they kick in.
Agreed; information which we cannot execute on has little value. Zara, as you point out, is an interesting case study in optimizing a business to minimize the cost of change, rather than the cost of operations. Putting acceleration over velocity, as it were. Also interesting is how Zara integrate external data (from fashion shows through to walking through stores and talking to customers) into their decisions. As Andy M. pointed out, this is the challenge of balancing strong internal signals (from the till) with weak and conflicting external signals.
It would be an interesting exercise to determine when the law of diminishing returns for data kicks in. You're probably right – it's a lot sooner than any of us expect.
[…] got me thinking about some work that PEG, a colleague of mine, has been doing on the value of information. PEG’s central premise is that time and distance drive the value of information. That is, […]
[…] is a lot like the challenge we’ve been talking about under the banner of The value of information. How do we make sense of the weak, conflicting and voluminous signals we see in the environment outside […]