The Coming Wave

"The Coming Wave" by Mustafa Suleyman and Michael Bhaskar spans over 370 pages. Here, I provide a summary of its key ideas.


Advance Praise (Pages 2-3): The book received accolades from prominent figures like Yuval Noah Harari, Nouriel Roubini, Al Gore, and others. They praise it as a crucial, well-researched work addressing the existential dangers of AI and biotechnology and their potential to reshape humanity's future.

Prologue (Pages 12-13): The prologue, partly written by AI, reflects on critical moments in human history, like the discovery of fire and electricity, and likens the rise of advanced AI and biotechnology to such transformative events. It discusses the vast potential benefits and dangers of these technologies and emphasizes the importance of the decisions humanity will make in the face of this new era.

Chapter 1: Containment Is Not Possible (Pages 15-22): This chapter discusses the concept of a 'wave' sweeping through history, reshaping civilizations. It introduces the idea of 'Homo Technologicus' - humanity driven by technological evolution. It reflects on how technology has become cheaper and more accessible, and argues that the rapid advancement of AI and biotechnology might make containment impossible.

Chapter 2: Endless Proliferation (Pages 29-31): The beginning of Chapter 2 illustrates the proliferation of technology using the example of the internal combustion engine and the automobile, highlighting how technology follows a trajectory of becoming more accessible and influential over time.


Large Language Models

Large Language Models (LLMs) like the Switch Transformer are highly advanced AI models used for understanding and generating human language. These models are built upon a foundation of neural networks, which are designed to mimic the way the human brain operates, though in a simplified and more structured manner. To understand how these models work and their scale, it's important to break down the concepts of Switch Transformer, parameters, and tokens:

  1. Switch Transformer: This is a specific kind of large language model developed by Google. It's a variant of the Transformer model, which has been a groundbreaking architecture in natural language processing. The key innovation in the Switch Transformer is its efficiency and scalability. It uses a technique called "sparsity" to manage the connections within the neural network more efficiently. This allows the Switch Transformer to scale up to a much larger number of parameters (the individual elements that the model learns and adjusts during training) compared to traditional models, without a proportional increase in computational requirements.

  2. Parameters: In the context of LLMs, parameters are akin to the knowledge and rules the model learns during its training. Each parameter can be thought of as a knob that adjusts how the model responds to certain types of input. In simple terms, more parameters mean the model has more capacity to learn and remember different aspects of language. Large models like GPT-3 or the Switch Transformer have parameters in the range of hundreds of billions, which allows them to generate highly nuanced and contextually relevant language outputs.

  3. Tokens: Tokens are the basic units of language that the model processes. In English, a token could be a word or a part of a word. For instance, the sentence "ChatGPT is helpful" would be broken down into tokens like "ChatGPT," "is," and "helpful." The model processes these tokens to understand and generate language. Each token is represented as a vector in a high-dimensional space, which allows the model to capture the nuances of meaning, context, and syntax.
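To make the idea of tokens concrete, here is a minimal sketch of tokenization in Python. This is my own simplification: the whitespace split and the growing vocabulary are invented for illustration, whereas real models use fixed, learned subword vocabularies (e.g. byte-pair encoding) and then map each id to a high-dimensional embedding vector.

```python
# A toy tokenizer: split text into word-level tokens and assign each an
# integer id. Real tokenizers split into subwords and use a pre-trained
# vocabulary; this word-level version is only illustrative.

def tokenize(text, vocab):
    tokens = text.split()  # crude word-level split; real models use subwords
    # Assign a new id the first time a token is seen, reuse it afterwards.
    return [vocab.setdefault(token, len(vocab)) for token in tokens]

vocab = {}
ids = tokenize("ChatGPT is helpful", vocab)
# The model would then look up each id in an embedding table, turning every
# token into a vector in a high-dimensional space.
```

Running this on the example sentence from above yields one id per token; repeated tokens in later text reuse the ids they were first assigned.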

In summary, the Switch Transformer and other large language models are massive neural networks with a high number of parameters, allowing them to process language at a highly advanced level. They work by processing tokens of language and using their extensive number of learned parameters to understand and generate nuanced human-like text. This combination of scale (through parameters) and efficient processing (like the techniques used in the Switch Transformer) represents a significant leap in AI's capability to handle complex language tasks.
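The sparsity idea behind the Switch Transformer can be sketched in a few lines. This is a hand-written illustration under my own assumptions, not Google's implementation: the `expert` and `router_scores` functions below are invented stand-ins for learned neural network components (a real router is a trained linear layer followed by a softmax). The point it demonstrates is top-1 routing: only one expert runs per token, so total parameters can grow with the number of experts while per-token compute stays roughly constant.

```python
# Sketch of top-1 ("switch") expert routing. All functions here are
# illustrative stand-ins for learned neural network components.

NUM_EXPERTS = 4

def expert(idx, x):
    # Stand-in for one expert feed-forward network; each expert transforms
    # the input differently (here, by a different scale factor).
    return [v * (idx + 1) for v in x]

def router_scores(x):
    # Stand-in for the learned router that scores each expert for a token.
    return [(sum(x) * (i + 1)) % 7 for i in range(NUM_EXPERTS)]

def switch_layer(x):
    # Top-1 routing: pick the single best-scoring expert and run only it.
    # Adding more experts adds parameters but not per-token compute.
    scores = router_scores(x)
    best = max(range(NUM_EXPERTS), key=lambda i: scores[i])
    return expert(best, x)

output = switch_layer([0.5, -1.0, 2.0])
```

Because only the selected expert's parameters are touched for each token, the model's total parameter count can be scaled far beyond what dense models of the same compute budget could afford.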

China's activities in the field of technology 
  1. China's National Strategy for AI Leadership (Page 117): China has an explicit national strategy to become the world leader in AI by 2030 through the New Generation Artificial Intelligence Development Plan. This plan aims to harness government, military, research organizations, and industry in a collective mission to make China a primary AI innovation center.

  2. China's Technological Advancements (Pages 118-120): The book details China's surge in fundamental technologies like AI and bioscience, with substantial investment and a growing number of highly cited AI papers. It notes China's achievements in areas like quantum technology and the significant growth in its scientific and engineering capabilities.

  3. Challenges and Strategies in Technology (Pages 233-234): The book discusses China's dependency on importing critical technology components, especially semiconductors, and its efforts to develop domestic semiconductor capacity. It mentions the U.S.'s export controls aimed at slowing technological development in China, particularly in AI and supercomputing.

  4. China's Response to Technological Threats (Page 233): Xi Jinping's concerns about China's dependency on imported technology are highlighted, including the strategic importance of controlling key technologies for China's future and geopolitical security.

  5. Global Impact and Technological Arms Race (Pages 122, 206, 184): The book reflects on the global proliferation of technology, including China's role in this trend. It mentions the impact of China's technological growth on global dynamics and the evolving nature of technological arms races, also touching on demographic challenges China will face that will require new technology for sustainability.

  6. Export Controls and Choke Points (Page 235): The book discusses the use of export controls and choke points as strategies to regulate the pace of technological development globally, not just in China. This approach is seen as a method to contain, but not entirely stifle, technological progress.



Other References

Bibliography | The Coming Wave Book (the-coming-wave.com)

India's Prospects in the Age of AI, Sramana Mitra, Ramakrishna Mission Institute, Kolkata, 11/3/23 (youtube.com)
