Our big point is that AI doesn’t replicate human intelligence, it replicates specific human behaviours, and the mechanisms behind these behaviours are different to those behind their human equivalents. It’s in these differences that opportunity lies, as there’s evidence that machine and human intelligence are complementary, rather than in competition. As we say in the report, “humans and machines are [both] better together”. The poster child for this is freestyle chess.
Eight years later [after Deep Blue defeated Kasparov in 1997], it became clear that the story is considerably more interesting than “machine vanquishes man.” A competition called “freestyle chess” was held, allowing any combination of human and computer chess players to compete. The competition resulted in an upset victory that Kasparov later reflected upon:
The surprise came at the conclusion of the event. The winner was revealed to be not a grandmaster with a state-of-the-art PC but a pair of amateur American chess players using three computers at the same time. Their skill at manipulating and “coaching” their computers to look very deeply into positions effectively counteracted the superior chess understanding of their grandmaster opponents and the greater computational power of other participants. Weak human + machine + better process was superior to a strong computer alone and, more remarkably, superior to a strong human + machine + inferior process. . . . Human strategic guidance combined with the tactical acuity of a computer was overwhelming.1)Garry Kasparov, “The chess master and the computer,” New York Review of Books, February 11, 2010, www.nybooks.com/articles/2010/02/11/the-chess-master-and-the-computer/
So rather than thinking of AI as our enemy, we should think of it as compensating for our weaknesses.
There’s been a lot of talk about using technology to democratise trust, and much of it shows a deep misunderstanding of just what trust is. It’s implicitly assumed that trust is a fungible asset, something that can be quantified, captured and passed around via technology. This isn’t true though.
As I point out in the post:
Trust is different to technology. We can’t democratise trust. Trust is a subjective measure of risk. It’s something we construct internally when we observe a consistent pattern of behaviour. We can’t create new kinds of trust. Trust is not a fungible factor that we can manipulate and transfer.
Misunderstanding trust means that technical solutions are proposed rather than tackling the real problem. As I conclude in the post:
If we want to rebuild trust then we need to solve the hard social problems, and create the stable, consistent and transparent institutions (be they distributed or centralised) that all of us can trust.
Technology can enable us to create more transparent institutions, but if these institutions fail to behave in a trustworthy manner then few will trust them. This is why the recent Ethereum hard fork is interesting. Some people wanted an immutable ledger, and they’re now all on ETC as they no longer trust ETH. Others trust the Ethereum Foundation to “do the right thing by them” and they’re now on ETH, and don’t trust ETC.
Computers are at the heart of the economy, and coding is at the heart of computers. Australia’s prosperity depends on equipping the next generation with the skills they need to thrive in this environment, but does this mean that we need to teach everyone how to code? Coding has a proud role in digital technology’s past, but is it an essential skill in the future? Our relationship with technology is evolving and coding, while still important, is just one of the many new skills that will be required.
Tyler Cowen has an article over at MIT Technology Review, Measured and Unequal, that discusses how improved measurement of workers might be a fundamental driver of inequality in the workplace of the future.
Consider journalism. In the “good old days,” no one knew how many people were reading an article like this one, or an individual columnist. Today a digital media company knows exactly how many people are reading which articles for how long, and also whether they click through to other links. The exactness and the transparency offered by information technology allow us to measure value fairly precisely.
The result is that many journalists turn out to be not so valuable at all. Their wages fall or they lose their jobs, while the superstar journalists attract more Web traffic and become their own global brands. Some even start their own media companies, as did Nate Silver at FiveThirtyEight and Ezra Klein at Vox. In this case better measurement boosts income inequality more or less permanently.
The assumption behind this sort of piecework measurement is that all the value realised by an article is due to the sweat and toil of a more-talented-than-usual journalist. If your article gets the clicks, then it must be because you are so good at what you do.
Unfortunately the world is not so simple.
We might choose to build our organisations around this sort of idea (and indeed, BuzzFeed et al. work this way) but it tends to foster a short-term and overly transactional view of work that ignores a lot of the value that workers, or a community of workers, might create.
The first problem is the obsessive focus on outputs, on the assumption that the worker is responsible for all the value created. Outputs depend on inputs, and not just the worker’s skills. You can’t make a silk purse out of a sow’s ear, as the saying goes.
While the worker might be skilled, their work is also dependent on the quality of the materials they have to work with. Take the journalism example. A manager somewhere is splitting up the work, either by handing out the story ideas or by allocating topics to individuals. Not all ideas or topics are equal. It’s possible for someone to come from outside this system by finding a new approach—as Nate Silver did with a data-driven approach—but that’s the exception rather than the rule. It’s more typical for the value of the outputs to be bounded by the quality of the inputs, not the effort of the individual.
We see something similar in sales. It’s easy to sell in a rising market, and a booming market will see many sales people getting large commissions for no reason other than turning up. In a down market, though, it’s a different story, and we punish some of our best people for working hard just to bring anything into the business.
If we want to reward individuals based on their contribution then we need to quantify the amount of value they added, rather than the amount of value they lucked into. If we don’t then we’ll create a feeding frenzy for the juicy bits of work, while other, less attractive work (but possibly no less important in the overall scheme of things) gets ignored.
Unfortunately it’s surprisingly difficult to measure value-add for many workers, as it can be challenging to gauge the quality of the materials that they have to work with. A good example of this is the effort in the US to measure teachers on the value they add in the classroom, an effort which is struggling as it seems nearly impossible to objectively measure the quality of the students that they have to work with. There are just too many variables.
Second is the problem of cumulative advantage. Success typically brings more success for no other reason than you were successful. Consider the opportunities created when you win an Oscar. The Oscars are an annual competition, so they’re awarded even if the year’s releases aren’t particularly good (such as when there’s a writers’ strike during most of the past year).
It doesn’t matter how you win the Oscar—either by creating great art and a big box office success, or simply by being the best of a bad lot—the attention that the Oscars garner brings you to the attention of the world and the opportunities start flowing in. This improves the quality of the materials you can choose to work with. You might break the VW emissions story due to dumb luck, but it results in more story ideas flowing your way. You might not be the best journalist, you might not even be the journalist best positioned to make the most of the idea, but the idea is yours nonetheless.
Entire careers are built on the back of a lucky break followed by cumulative advantage. While this is good for the few lucky individuals, it’s not so good for the firm as it means that the firm might not be making the most of the materials at its command (though picking winners does make it easier for management). Nor is it much good for the equally talented individuals who weren’t quite so lucky.
Third is the problem of context. It’s rare, these days, to work in isolation. The context we’re in provides us with resources and connections that we couldn’t get elsewhere, or even just a boss that we can work with. While we might thrive in one environment, we struggle in others. One good example is star analysts, who often struggle when they leave the firm where they built their reputation. Some of the value in the outputs created might be the result of a productive work culture or an effective management structure and team, factors that are the result of everyone’s contributions, and not just those of the individual creating the deliverable.
Mr Cowen’s problem is that he has mistaken ease for cost. It’s cheaper than ever to measure all sorts of factors associated with work. At the same time, work has evolved, making it hard to know what to measure. While it might be cheap to generate all sorts of stats on worker activity, it’s not easy to tie these back to productivity.1)Except, that is, for work situations which are explicitly configured as piecework, such as Uber drivers.
The root cause of this is a recent shift (possibly sometime around 2005) from value being defined by the producer to value being defined by the consumer. The emergence of the consumer internet put the consumer in control, as it gave the consumer more information on a product than the merchant or producer had, and the ability to source the product from any merchant around the globe. This was followed by the more recent emergence of social media, enabling consumers to turn to their peers rather than to brands.
Value used to be defined in terms of product features and functions, and we could measure a worker’s productivity by their contribution to creating these features and functions. Frederick Taylor started the trend by measuring how long it took for a man to unload a cart. The modern version is the basis of Mr Cowen’s article: counting the number and reach of articles carrying a byline, or worker surveillance where everything a worker types at a computer, everything they do is logged, recorded, and measured.
Value today is defined by a customer’s relationship to a product. Value is relative and shifting because it is a function of an expanding choice space for consumers. While all your workers contribute to creating this value, it’s not always obvious how to quantify their contribution. Their contribution might also be different for each customer, as relative value means that each customer could conceive of value differently.
Any retailer who heads down the omnichannel path, for example, needs to deal with the challenge of aligning a salesforce measured on their sales with a strategy that has sales skipping across multiple channels and contact points as the customer learns about the firm, develops their own understanding of what value is created, and winds their way to a decision. When you consider this it’s not surprising that Apple’s stores (some of the most profitable in the world) are not measured on sales, and fall under the marketing budget.
In the meantime we have many firms racing to quantify and optimise the individual tasks that their workers undertake. This might drive improvements in a short-term and overly instrumentalist definition of productivity, and result in a few lucky individuals receiving large pay checks. In the longer term, though, the same strategy destroys the value created for the customer, possibly taking the firm’s future with it.
We used to be defined by what we knew. But today, knowing too much can be a liability.
Google, for example, is putting its trust in (potentially uncredentialled) “capable generalists” rather than “experts”.1)Laszlo Bock, Google’s Vice-President of People Operations, at The Economist’s Ideas Economy: Innovation Forum on March 28th 2013 in Berkeley, California. https://www.youtube.com/watch?v=wBRJ01NNKj8 Expertise still matters for narrowly focused highly-technical roles but Google has found that in most instances a capable generalist will arrive at the same solution as an expert, while in some cases they will come up with a new solution that is superior to those proposed by the experts.
Expertise, and being an expert, implies having the hard-won knowledge and skills that make you a reliable judge of what is best or wisest to do. It’s an inherently backwards-looking concept, ascribing value to individuals based on their ability to accumulate experience and then generalise from it, taking generic solutions that have worked in the past and applying them to specific problems encountered today.
This is an approach that worked well in the past when knowledge and skills were expensive and difficult to acquire, and the problems we tackled later in our career were similar to those encountered at the start. Society has spent centuries reorganising work and dividing it into ever more narrowly defined specialisations to enable individuals to focus on, and develop expertise in, specific jobs.
Take the case of the Brunels in the 1800s: Marc, who built the first tunnel under the Thames,2)Marc Brunel was, in the early 1800s, the engineer responsible for the first tunnel to be dug under a substantial river. and his son, Isambard, creator of the Great Britain.3)Isambard Brunel built the SS Great Britain, launched in 1843, the longest ship in the world at her time and the first iron steamer with a screw propeller. Both Marc and Isambard roamed across architecture, and civil and mechanical engineering, designing everything from buildings and manufacturing processes through railways to steam engines and ships, covering most of the technologies we associate with the industrial revolution.
Over time all these technologies became increasingly complicated and entailed, requiring you to acquire more and more knowledge and skills before you could be productive and contribute your own ideas and findings. The ground covered by the two Brunels has since been divided into a range of highly specialised disciplines, each with its own narrowly defined education and credentialing process.
Digital technology, however, is changing our relationship with knowledge and, consequently, with expertise. The pithy version of this is “it’s not what you know, it’s what you can google”. By allowing us to easily capture and transmit knowledge, and by providing new means of communicating with our peers, the growth of digital technology is tipping the balance of power from narrowly defined expertise to more broadly defined capability. Knowledge is available on demand via online resources and social media while skills are being captured in software packages, shifting what used to be stocks to flows.4)The shift from stocks to flows @ PEG The generalist is no longer at a disadvantage to the specialist, as most (if not all) specialist knowledge and skills are available on-demand.
I heard a nice example of this a while ago when I was listening to Film Buff’s Forecast.5)Film Buff’s Forecast @ RRR The show was interviewing a director who also lectured at a local university. The director opined that the current graduating class had a far more sophisticated understanding of film, and were more sophisticated in their approach to their work, than he and his class were back in the early seventies. In his view this wasn’t because the current class were inherently smarter. It was because the majority of their time at university was invested in exploring the possibilities provided by film as a medium, and developing an understanding of what they might do within the medium. This is in contrast to the director’s class back in the early seventies, when the majority of a student’s time was spent finding, accessing, and internalising knowledge stocks.
The example the director gave was of a student being directed to some technique that Alfred Hitchcock used.6)Unfortunately I don’t remember which technique was mentioned. Back in the seventies this would have implied many afternoons spent in the stacks at the library looking for film criticism that discussed the technique, so that the student could develop an understanding of it and know in which films the best (and worst) examples could be seen, followed by a search of the rep theatres to find screenings of key films.
That same understanding can be obtained via an afternoon on the couch browsing the internet with the following day spent streaming films from Netflix.
Today we invest our time exploring the problem we’re trying to solve, and the context we’re solving it in, rather than pouring most of our effort into finding the information we need.
We’re also increasingly finding ourselves asked to solve new problems, create new products and services, and, in some cases, even rethink how entire industries and sectors of the economy work. This is what we commonly refer to as digital disruption, even though that term fails to capture the full extent of the social change that is bearing down on us.
Take the construction industry for example. Technology has been used to streamline or automate many tasks making today’s construction industry a different beast to the construction industry of our grandparents, but it is still an industry that adheres to a fundamental craft-based paradigm, with skilled trades people working onsite to create bespoke buildings.
A range of technological and social changes are about to transform the construction industry from a craft-based paradigm to a flexible-manufacturing paradigm, skipping over the traditional industrial paradigm in the process.
My favourite example of this is Unitised Building,7)http://www.unitisedbuilding.com who have developed a new construction process (as opposed to a technology) that enables them to construct a mid-rise building in a fraction of the time, and with a fraction of the money, of a traditional approach. The building system is completely digitised: the building is designed in 3D modelling tools before the design is broken down and sent to numerically controlled machines for part fabrication and assembly on the shop floor. Assembled modules are trucked to the construction site, where one is lifted into place every eight minutes, after which the various connectors are snapped together and the gaps plastered over. A process that took months now takes weeks, and the cost is slashed in the process.
The shift from craft to flexible manufacturing has a dramatic impact on the skills required from the workforce, moving from deep expertise in building to general design, digital modelling and construction skills. The focus has shifted from needing people who can work within the established building system (people with deep expertise who can generalise experience and then apply these general solutions to specific problems) to people who can work to develop and improve a new building system (people with broader skills who can find new problems to be solved, and solutions to these new problems).
A similar trend can be seen across all sectors. We’re moving from working in the system that is a business, to working on the system. The consequence of this is that it’s becoming more important to have the general capabilities and breadth of experience that enable us to develop and improve the system in novel directions, than it is to have deep, highly entailed experience in working within the current system. There will always be a need for narrowly focused expertise in highly technical areas, but in the majority of cases the generalist now has an advantage over the specialist.
This raises an interesting conundrum. While you might not need to know as much as you did in the past, it’s not clear just how much you do need to know now. This is a particular problem for educators and firms as they want to arm the individuals under their care with the knowledge and skills required to be successful in the workplace. Teaching too little means that the individual will not be effective at what they do. Teaching too much implies that we are wasting the individual’s time (and money, in many cases).
Focusing on understanding how much to teach might be asking the wrong question though. In many cases the only person who can judge how much knowledge is enough will be the individual, as “how much is enough” will be determined by the problem that they are trying to solve and the context that they are trying to solve it in.
We need to break down the problem a bit more if we’re to understand what question we should be asking.
First, we do know that you need enough knowledge to be dangerous; to be conversant in the domain, to be able to understand and describe the problem, and to be able to interact and discuss what you are doing with the others who you are collaborating or working with. That film director mentioned above needs to be able to understand the criticism that they are reading, knowing the key concepts, technical terms and idioms that form the language of film. Similarly for our flexible-manufacturing building system, where you would need to understand the basic language of building, digital design, and flexible manufacturing if you expect to be productive and contribute.
Second, we need to equip the individual with the tools they need to manage their own knowledge and their access to knowledge. If the only person who can determine how much knowledge is enough is the individual, then we need to empower them by providing them with the tools they need to manage knowledge for themselves.
This can be further broken down into the following.
You need to understand the limits of your current knowledge (or, put another way, you need to know when to go looking for new knowledge). This may be as simple as coming across new terms and concepts that you don’t understand, through to having the sensitivity to realise that your lack of progress in a task is due to the knowledge (the ideas and skills) that you’re applying being insufficient, and that you need to find a new approach based on different knowledge.
You need to be aware of what additional knowledge you might draw on, so that you can reach out and pull it in as needed. This is a process of eliminating the unknown unknowns: reading blogs, going to conferences, participating in communities of practice, and even having conversations at the water cooler, so that you’re aware of the other ideas out there in the community, and of the other individuals who are working in related areas. You can only draw on new knowledge if you’re aware that it exists, which means you must invest some time in scanning the environment around you for new ideas and fellow travellers.
You also need the habits of mind – the attitudes and behaviours – that lead you to reach out when you realise that your knowledge isn’t up to the task at hand, explore the various ideas that you’re aware of (or use this awareness to discover new ideas), and then pull in and learn the knowledge required.
Finally, you need to be working in a context where all this is possible. Too many work environments are set up in a way that prevents individuals from investing time in exploring what is going on around them (and eliminating unknown unknowns), from taking time out from the day-to-day to learn what they need to learn on demand, or from taking what they’ve learnt and doing something different (deviating from the defined, approved and rewarded process).
So the question we asked at the start of this post – how much do you need to know? – is clearly the wrong question to be asking.
Rather than trying to know (or teach) everything that might be relevant (the old competence model), we need to move up a level and focus on metacognition. This means providing people with the tools they need to manage knowledge on their own: fostering the sensitivity required to know when their knowledge and skills have run out, creating time and space so that they can invest in their own knowledge management, and encouraging the habits of mind that give them the ability and attitude to do something about it.
Image: Isambard Kingdom Brunel preparing the launch of ‘The Great Eastern’ by Robert Howlett
When we did an Australian version of the Shift Index2)The Shift Index in Slides @ PEG we saw that while Australia has a pretty good digital foundation, and society seems to be adapting to the shift fairly well, we’re not realising as much value as we could. Or put another way, while we’re using digital technology to create new knowledge flows, we’re not as proficient at realising their value.
With the Shift Index complete we turned our attention to education, as it seemed logical that education would be the most effective fulcrum to use to improve our performance.
The major finding in the report is that our relationship with knowledge is changing, and consequently our relationship with education is changing. The snappy version of this is “Why remember what you can google?”. The longer story has interesting implications for the education sector, as changing what it means to be educated has all sorts of potential knock-on effects for education and educators.
The report is our attempt to move the current debate beyond pedagogy and edu-tech, funding and Australia’s ranking on international league tables, to consider whether our changing relationship to knowledge (the shift from knowledge stocks to knowledge flows, highlighted in the report) is changing the role and purpose of education and (by extension) the education sector.
He makes a good case for his main thesis, and I recommend reading the paper, but what I found interesting was a section early in the piece where he makes a strong distinction between learning and education.
Learning is something personal. It’s something that you, an individual, do. Learning has you pulling in new knowledge and skills, experimenting and testing them, before you adopt what works for you and reject what doesn’t.
Education, on the other hand, is something that’s done to you. It’s an intervention, a black box where the student enters on one side and leaves on the other changed in some way. This change may be the acquisition of knowledge, as it was so often in the past. It could be the acquisition of attitudes and behaviours that you couldn’t develop on your own. Or it might even be the development of an awareness of the intersection between what you’re good at and what you like.
Learning is (ideally) something that you do continually. Education is something that you seek out when you need help.
This leads us to the somewhat radical conclusion that schools – or any educational institution for that matter – are not places of learning. The place of learning is wherever the student is. Sometimes that place of learning is located at the educational institution. Often it’s not.
Calling any educational institution a place of learning is a bit silly, as it implies that learning is restricted to discrete and well-defined places (even if these places are virtual), when clearly it could and should happen anywhere. Indeed, during the development of a recent Centre for the Edge report on the changing nature of education (more on that in the weeks to come) we even heard one senior K-12 educator state that their school is not a place of learning – it’s a place of education – as the learning should occur wherever the student is.
Making a clear distinction between learning and education also leads us to the conclusion that a lot of the discussion about life-long learning is really talk of periodic, life-long re-education. I’m not sure that periodic, life-long re-education makes any sense for students who are quite capable of managing their own learning. It does, however, make a lot of sense for educational institutions who intend to charge students each time they return to the knowledge well.
Finally, the confusion between education and learning means that all the problems in the education sector are treated as learning problems. This worries me as it’s becoming clear that many of these problems stem from our inability to develop a shared understanding of what it means to be educated in this day and age.
It’s clear that the modern workplace is placing new demands on workers. Analysis skills used to be top of the list – the ability to pull problems apart, optimise the pieces, and then put them back together. Now it’s creativity that’s in demand – the ability to pull together disparate ideas and make something new. We used to work alone, sitting in our office. Now we work in teams, often with members drawn from different organisations and cultures. And so on.
Today your value to the firm is not based on what you can prove that you know or can do, but on what the firm expects you to achieve. Firms are looking for individuals who have a demonstrable interest in a problem the firm has; someone who has a track record of integrating new ideas from other disciplines and domains to create new, novel solutions; an individual who can effectively integrate into the firm’s team; and someone whose background and culture will help broaden as well as deepen the reach of the firm when searching for ideas.
This new generation of workers – Google calls them “smart creatives”2)Eric Schmidt & Jonathan Rosenberg (2014), How Google Works, Grand Central Publishing – have different educational needs.
As we say in the Centre for the Edge’s education report:
The goal of a formal education should be to prepare students for life after their formal education. In a world dominated by change it would be wise to define ‘being educated’ as having the ability ‘to adapt to whatever life might bring’. An increasingly important part of education – and intervention – will therefore be to instil in students the importance of continually updating and expanding their own knowledge stocks, as well as fostering within them the sensitivity to know when they need to do this. Doing this is a skill in and of itself. It is a skill built on habits of mind, the attitudes and behaviours that a student develops during their formal education.
Education and learning might have been synonymous in the past, primarily because educators had a virtual monopoly on knowledge. That is no longer true, as it’s not what you know but what you can google that matters.
What we think of as education is expanding and changing in response to the changing nature of society. Education and learning are now very different things, but we continue to view all problems in the education sector as learning problems, to our own detriment.
We’ve spent the last six months or so at the Centre for the Edge looking into how the trends we saw in the Australian Shift Index (i.e. the shift from knowing something, to being able to google it) might be changing the education sector.
Our hypothesis was that digital technology has changed our relationship with knowledge, and that this has, in turn, driven changes in business and society making the existing education sector (and the model behind it) increasingly irrelevant. This means that the problems confronting educational institutions don’t arise from a lack of technology or pedagogy (and MOOCs will not save the world), but from a mismatch between the perceived purpose and role of education, and the demands of the modern worker.
To help get the creative juices flowing we put some of our thoughts onto slides, which we then used to spark conversations with a wide range of folk within traditional education institutions, and elsewhere. Given that we’re in the process of pulling together a report that details our findings, we thought it would be worthwhile to share the slides; you can find them embedded below as well as on SlideShare.
Business men go down with their businesses because they like the old way so well they cannot bring themselves to change. One sees them all about – men who do not know that yesterday is past and woke up this morning with their last year’s ideas.
Henry Ford, My Life and Work (discussing the disruption that the mass produced car was bringing across so many sectors)
What is innovation? I don’t know, but then I’m not even sure that it’s an interesting question. The yearning so many companies have to be innovative often seems to prevent them from actually doing anything innovative. They get so caught up in trying to come up with the next innovation — the next big product — that they often fail to do anything innovative at all. It’s more productive to define innovation by understanding what it’s not: doing the same thing as the rest of the crowd, while accepting that there are no silver bullets and that you don’t control all the variables.
So, what is innovation? This seems to be a common question that comes up whenever a company wants to innovate. After all, the first step in solving a problem is usually to define our terms.
Innovation is a bit like quantum theory’s spooky action at a distance,1)Spooky action at a distance? @ Fact and Fiction where stuff we know and understand behaves in a way we don’t expect. It can be easy to spot an innovative outcome (hindsight is a wonderful thing), but it’s hard to predict what will be innovative in the future. Just spend some time browsing Paleo-Future2)Paleo-Future (one of my favourite blogs) to see just how far off the mark we’ve been in the past.
The problem is that it's all relative; what's innovative in one context may (or may not) be innovative in another. You need an environment that brings together a confluence of factors — ideas, skills, the right business and market drivers, the time and space to try something new — before there's a chance that something innovative might happen.
Unfortunately innovation has been claimed as the engine behind the success of more than a few leading companies, so we all wanted to know what it is (and how to get some). Many books have been written promising to tell you exactly what to do to create innovation, providing you with a twelve step program3)Twelve step programs @ Wikipedia to a happier and more innovative future. If you just do this, then you too can invent the next iPhone.4)iPhone — the Apple innovation everyone expected @ Fast Company
Initially we were told that we just needed to find the big idea, a concept which will form the basis of our industry shattering innovation. We hired consultants to run ideation5)Ideation defined at Wikipedia workshops for us, or even outsourced ideation to an innovation consultancy asking them to hunt down the big idea for us. A whole industry has sprung up around the quest for the big idea, with TED6)TED (which I have mixed feelings about) being the most obvious example.
The challenge when managing innovation is not in capturing ideas before they develop into market-shaping innovations. If we see an innovative idea outside our organization, then we must assume that we're not the first to see it, and ideas are easily copied. If innovation were a transferable good, then we'd all have the latest version.
Ideas are a dime a dozen, so the real challenge is to execute on an idea (i.e. pick one and do something meaningful with it). If you get involved in that ideas arms race, then you will come last, as someone will always have had the idea before you. As Scott McNealy of Sun likes to say:
Statistically, most of the smart people work for somebody else.
More recently our focus has shifted from ideas to method. Realising that a good idea is not enough, we’ve tried to find a repeatable method with which we can manufacture innovation. This is what business does after all; formalise and systematise a skill, and then deploy it at huge scale to generate a profit. Think Henry Ford and the creation of that first production line.
Design Thinking8)Design Thinking … what is that? @ Fast Company is the most popular candidate for method of innovation, due largely to the role of Jonathan Ive9)Jonathan Ive @ Design Museum and design in Apple’s rise from also-ran to market leader. There’s a lot of good stuff in Design Thinking — concepts and practices anyone with an engineering background10)Sorry, software engineering doesn’t count. would recognise. Understand the context that your product or solution must work in. Build up the ideas used in your solution in an incremental and iterative fashion, testing and prototyping as you go. Teamwork and collaboration. And so on…
The fairly obvious problem with this is that Design Thinking does not guarantee an innovative outcome. For every Apple with their iPhone there’s an Apple with a Newton.11)The story behind the Apple Newton @ Gizmodo Or Microsoft with a Kin.12)Microsoft Said to Blame Low Sales, High Price for Kin’s Failure @ Business Week Or a host of other carefully designed and crafted products which failed to have any impact in the market. I’ll let the blogosphere debate the precise reason for each failure, but we can’t escape the fact that the best people with a perfect method cannot guarantee us success.
People make bad decisions. You might have followed the method correctly, but perhaps you didn’t quite identify the right target audience. Or the technology might not quite be where you need it to be. Or something a competitor did might render all your blood, sweat and tears irrelevant.
Design Thinking (and innovation) is not chess: a game where all variables are known and we have complete information, allowing us to make perfect decisions. We can’t expect a method like Design Thinking to provide an innovative outcome.
Why then do we try and define innovation in terms of the big idea or perfect methodology? I put this down to the quest for a silver bullet: most people hope that there’s a magic cure for their problems which requires little effort to implement, and they dislike the notion that hard work is key.
This is true in many of life’s facets. We prefer diet pills and magic foods over exercise and eating less. If I pay for this, then it will all come good. If we can just find that innovative idea in our next facilitated ideation workshop. Or hire more designers and implement Design Thinking across our organisation.
Success with innovation, as with so many things, is more a question of hard work than anything else. We forget that the person behind P&G’s Design Thinking efforts,13)P&G Changes Its Game @ Business Week Cindy Tripp, came out of marketing and finance, not design. She chose Design Thinking as the right tool for the problems she needed to solve — Design Thinking didn’t choose her. And she worked hard, pulling in ideas from left, right and centre, to find, test and implement the tools she needed.
So innovation is not the big idea. Nor is it a process like Design Thinking.
For me, innovation is simply:
working toward a meaningful goal, and
being empowered to use whichever tools will be most beneficial.
If I was to try and define innovation more formally, then I would say that innovation is a combination of two key concepts: obliquity14)Obliquity defined at SearchCRM and Jeet Kune Do’s15)Jeet Kune Do, a martial art discipline developed by Bruce Lee @ Wikipedia concept of absorbing what is useful.
Obliquity is the simple idea that the best way to achieve a goal in a complex environment is to take an indirect approach. The fastest and most productive path to the top of the mountain might be to take the path that winds its way around the mountain, rather than to try and walk directly up the steepest face.
Apple is a good example of obliquity in action. Both Steve Jobs and Jonathan Ive are on record as wanting to make “great products that we want to own ourselves,” rather than plotting to build the biggest and most innovative company on the planet. Rather than trying to game the financial metrics, they focus on making great products.
Bruce Lee16)Bruce Lee: the divine wind came up with the idea of “absorbing what is useful”17)Absorbing what is useful @ Wikipedia when he created Jeet Kune Do. He promoted the idea that students should learn a range of methods and doctrines, experiment to learn what works (and what doesn’t work) for them, “absorb what is useful”, and discard the remainder. The critical point of this principle is that the choice of what to keep is based on personal experimentation. It is not based on how a technique may look or feel, or how precisely the artist can mimic tradition. In the final analysis, if the technique is not beneficial, it is discarded. Lee believed that only the individual could come to understand what worked, based on critical self-analysis and by “honestly expressing oneself, without lying to oneself.”
Cindy Tripp at P&G is a good example of someone absorbing what is useful. Her career has seen her investigate different topics and domains, making her more a sun-shaped individual than a T-shaped one.18)T-Shaped + Sun-Shaped People @ Logic + Emotion Starting from a core passion, she accreted a collection of disciplines, tools and techniques that she found beneficial. Design Thinking is one of these techniques (which she uses as a reframing tool).
I suppose you could say that I’ve defined innovation by identifying what it’s not: innovation is the courage to find a different way up the hill, while accepting that there are no silver bullets and that you don’t control all the variables.
Updated: Tweaked the wording in the (lucky) 13th paragraph in line with Bill Buxton’s comment.