Suleyman, Mustafa. The Coming Wave. First edition. New York: Crown, 2023.
This book describes itself as the work of ‘the ultimate insider’1. This seems rather apt, as it provides us with a glimpse of what the technocratic chattering class are saying about the current AI moment. Unfortunately it doesn’t provide us with insight into how this moment will play out, as the view from inside appears to be quite poor, lacking the perspective needed to really grapple with this question.
We might summarise the book along the following lines:
- technology comes in waves and is without bounds;
- many current technologies—but primarily Large Language Models (LLMs) and synthetic biology—represent the coming wave and will consequently reshape the world; and
- it’s important for the right people to be in charge, to ensure that technology is a force for good rather than evil, and we can do this by containment.
There’s a lot here to unpack, which is challenging to do while reading as the book is largely unreferenced (though there are a few endnotes), so the only way to distinguish between ambit claim and documented fact is via prior knowledge. Does 3-D printing, for example, enable us to slash the time and cost required to construct a home? (No it doesn’t, but it does enable us to create a nice curvilinear wall, should we want one.)2 Or will AI revolutionise drug discovery? (Unfortunately the choke points in that process are target choice and toxicology, not computational biology.)3
The first misconception in the book is likely the (unfortunately) common assumption that technology (application) follows science (theory), when the reverse is more usual.
The history of technology is a complex, messy, and fascinating story of practitioners trying new things and building on the work of others as they strive to address the problems in front of them, a story of steady, incremental development. The idea of the steam engine, for example, dates back to somewhere between 15 and 30 B.C., and it was made practical by the tinkering of Newcomen (an ironmonger), Watt (an instrument maker), and then Stephenson (a mining engineer), a long story of trial and error that resulted in the Industrial Revolution. Thermodynamics, the theory of steam (developed, in part, by Joseph Black, a scientist at the university that employed Watt), came afterwards, to explain why these machines worked.
One interesting implication of the bottom-up, emergent nature of technology is that technological waves—covered in the first part of the book—represent how we feel technology affecting us, rather than waves of technology development. Development is (more or less) constant, but the effects of newly developed technology are not. The power loom is a great example: the technology that triggered its creation, the flying shuttle, was nothing special when compared with previous tweaks to weaving, not much more than a couple of boxes and a spool, but it was the final piece of a puzzle that enabled looms to transition from human to mechanical sources of power. Moving to mechanical power improved productivity by a factor of 2.5, as each weaver could attend 2.5 looms rather than one, destroying the putting-out approach to production (and the weavers’ traditional way of life) in the process.
The current anxiety over technology in general (and AI in particular) is not a sign that technology development has accelerated or is out of control. Nor should it be taken as a sign that the technologies of our current moment are more potent than those of the past. The opposite might actually be the case, with measures such as productivity growth (which is driven by both population growth and the potency of newly developed technologies) persistently down.4 These technologies are surprising and new; we don’t know what to think of them, and it’s easy to let our imaginations get away from us. We forget that technology does have limits. In the case of AI, the narrow intelligence of LLMs is unlikely to sit on a continuum with general intelligence.5 LLMs are powerful tools, but history tells us that there is unlikely to be a smooth developmental path from the LLMs of today to Bostrom’s paperclip maximizer.6 The same holds for synthetic biology and the rest.
Perhaps the problem here is that we insist on engaging with technology as a noun, rather than a verb. Technology is best approached as a form of human action, rather than as a ‘thing’. For example, current concerns that workplace monitoring technology is driving us to a dystopia ignore the humans commissioning and developing this technology.7 It is the actions of these humans that should concern us, as it is their actions that will result in the dystopia we fear. New technology creates new possibilities, and it’s how we approach these possibilities that determines whether a technology is a force for good or evil. This is why the history of technology is also a story of how communities and societies negotiate among themselves how a technology should be used. While technology is an important factor in public issues, nontechnical factors always take precedence.8
Given that The Coming Wave assumes that technology comes in waves and that these waves are driven by insiders, the solution it proposes is containment: governments should determine (via regulation) who gets to develop a technology, and what uses it may be put to. The assumption seems to be that governments can control access to natural choke points in the technology. One figure the book offers is that around 80% of the sand used in semiconductors comes from a single mine—control the mine and you control much of that aspect of the industry. But choke points rarely work this way. Nuclear containment, for example, relies more on peer pressure between nation states than on regulation per se. It’s quite possible to build a reactor or bomb in your backyard.9 The more you scale up these efforts, the more likely it is that the international community will notice and press you to stop. Squeezing on one of these choke points is more likely to move the activity somewhere else than to enable you to control it.
The Coming Wave is interesting in that it enables us to peek into the conversations of AI industry insiders and get some sense of what they think and feel about the current AI moment. Unfortunately the view from inside is quite limited, it doesn’t match the well documented history of how societies develop and adopt technology, and the predictions the book offers are quite wrong. The prescriptions are also weak, and unlikely to hold back a coming wave.
At its heart this is a book by an insider arguing that someone is going to develop this world-changing technology, and it should be them.
- The book’s Amazon store page has it as “the ground-breaking book from the ultimate AI insider”. https://www.amazon.com.au/Coming-Wave-Technology-Twenty-First-Centurys/dp/1847927483 ↩︎
- Evans-Greenwood, Peter, Robert Hillard, Peter Williams, and Damien Crough. “Digitalizing the Construction Industry: A Case Study in Complex Disruption.” Deloitte Review, July 2019. https://www2.deloitte.com/insights/us/en/topics/digital-transformation/digitizing-the-construction-industry.html ↩︎
- Lowe, Derek. “AI and the Hard Stuff.” In the Pipeline, May 23, 2023. https://www.science.org/content/blog-post/ai-and-hard-stuff ↩︎
- See Gordon, Robert J. The Rise and Fall of American Growth: The U.S. Standard of Living since the Civil War. The Princeton Economic History of the Western World. Princeton: Princeton University Press, 2016. ↩︎
- Mitchell, Melanie. “Why AI Is Harder Than We Think.” arXiv, April 28, 2021. http://arxiv.org/abs/2104.12871. ↩︎
- https://en.wikipedia.org/wiki/Instrumental_convergence#Paperclip_maximizer ↩︎
- Evans-Greenwood, Peter, Pip Dexter, Claudia Marks, Peter Williams, and Joel Hardy. “The Trust Deficit between Workers and Organizations Isn’t Personal. It’s Systemic.” Deloitte Insights, August 23, 2023. https://www2.deloitte.com/us/en/insights/topics/leadership/workplace-monitoring-and-the-lack-of-trust-in-the-workplace.html ↩︎
- As per Kranzberg’s Fourth Law: “Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions”. See Kranzberg, Melvin. 1986. “Technology and History: ‘Kranzberg’s Laws.’” Technology and Culture 27 (3): 544–560. doi:10.2307/3105385. ↩︎
- https://en.wikipedia.org/wiki/David_Hahn ↩︎