Recommended Reading
Nexus: A Brief History of Information Networks from the Stone Age to AI

Historian and philosopher Yuval Noah Harari has been a vocal critic of artificial intelligence (AI), so it's no surprise that his latest book, Nexus: A Brief History of Information Networks from the Stone Age to AI, takes a deep dive into the history of information networks. In his characteristic provocative style, Harari starts his story with the origins of human networks, follows with an account of inorganic networks (that is, computer networks), and concludes with a section on computer politics. He covers a lot of ground, traversing religion, authoritarianism, democracy, and social media.
Historically, information has not only revealed truth and led to wisdom but has also been used to create order and wield power, often sacrificing or at least bending the truth. Users of social media are acutely aware of how information, particularly misinformation, can exert power as it rapidly spreads across the network. Harari emphasizes that every human information network must simultaneously accomplish two crucial tasks: discovering truth and establishing order.
In terms of the sources and flow of information, our world today is so remarkably different from that of just a few decades ago that we may be tempted to view history as irrelevant. Reinforcing our need to respect history, Harari reminds us that, “History isn’t the study of the past; it is the study of change. History teaches us what remains the same, what changes, and how things change.” Despite the tremendous amount of information at our disposal, we are as susceptible as our ancient ancestors were to fantasy and delusion.
We live most of our lives online now, viewing information from a variety of sources. We don’t really control what we see first, or how the next post surfaces in our feed – an algorithm determines that. What we may view as information -> truth -> wisdom may also be information -> order -> power, bypassing the truth check. Harari implores us to develop self-correcting mechanisms in artificial intelligence networks, because he rightly worries that runaway misinformation driven by unchecked computer networks may start a war or trigger a major catastrophe.
We reside in a commendable scientific bubble, where peer review, corrections, and retractions are the norm. Harari writes: “The most celebrated moments in the history of science are precisely those moments when accepted wisdom is overturned and new theories are born.” However, this is not the case for most of the information we consume, which frequently lacks such rigorous self-correcting mechanisms. “The networks have also learned how to use information to maintain stronger social order among larger populations, by using not just truthful accounts but also fictions, fantasies, propaganda, and—occasionally—downright lies.” Harari’s warning serves as a timely call to action for all of us to gain a deeper understanding of how information networks shape our perception of the world. We must grasp both the potential benefits and the risks of these networks, whose inherent autonomy may allow them to evade self-correcting mechanisms altogether.
Drawing on historical lessons from the witch hunts and Stalin’s liquidation of the kulaks, Harari worries that artificial intelligence networks will wield unexpected dominion over humans. I will leave it to you, as consumers in an increasingly complex information age, to decide whether to read this book. There are, however, two compelling stories from it that I want to share. The first is how a Facebook algorithm contributed to the massacre of the Rohingya community in Myanmar; the other is that of a Microsoft chatbot, called Tay, which became racist and mean within 16 hours, reposting what it gleaned from its followers. It was quickly taken down.
—Salahuddin “Dino” Kazi, M.D.