
Nexus: Book Review

Nexus: A Brief History of Information Networks from the Stone Age to AI, Yuval Harari, 2024. The primary focus is artificial intelligence (AI), especially its potential problems, up to and including the extinction of people. There is lots of history along the way, all dealing with information. He uses the idea of “truth,” defining it as “something that accurately represents certain aspects of the reality” (p. 7). Webster says it’s “actuality: the body of real things, events, and facts.” Harari also tends to make bald statements that overstate the points he’s trying to make. That makes the book read better and more quotable, but I’m usually thinking that adding weasel words would make it more accurate (at least for me). Despite my modest criticism, it’s a great read.


Prologue. “Why are we so good at accumulating more information and power, but far less successful at acquiring wisdom. … never summon powers you cannot control” (p. xii), like Goethe’s Sorcerer’s Apprentice. “Power always stems from cooperation between large numbers of humans. … The main argument of this book is that humankind gains enormous power by building large networks of cooperation, but the way these networks are built predisposes us to use that power unwisely. … Sapiens built and maintained large networks by inventing and spreading fictions, fantasies, and mass delusions” (p. xiv).


“The naïve view argues that by gathering and processing much more information than individuals can, big networks achieve a better understanding of medicine, physics, economics. … This view posits that in sufficient quantities information leads to truth, and truth in turn leads to both power and wisdom” (p. xiv). In the naïve view, problems are solved by gathering and processing more information; information is a good thing, and the more the better. [He does not focus on propaganda, which has amazing power, like almost all advertisements.] AI will have the most power, and some, like Marc Andreessen, claim it will solve all of humanity’s problems. “AI is the first technology in history that can make decisions and create new ideas by itself” (p. xxii), shifting power from humans to algorithms.


“Populist leaders like Donald Trump or Jair Bolsonaro and populist movements and conspiracy theories such as QAnon and the anti-vaxxers, have argued that all traditional institutions that gain authority by claiming to gather information and discover truth are simply lying … elite cabals have no interest in the truth and are deliberately spreading disinformation to gain power. … Populism views information as a weapon. … There is no objective truth and power is the only reality” (p. xxiv). Left thinking goes back to Karl Marx’s view that power is the only reality: there are only oppressors and the oppressed.


Religion is based on divine revelation, with humans viewed as untrustworthy; trust then lies in charismatic leaders. There is the problem of erroneous information and the need for self-correcting mechanisms: science works, the Catholic Church, not so much. Democracy allows information to flow across independent channels; totalitarian systems rely on a single hub.


“History is the study of change. History teaches us what remains the same, what changes, and how things change” (p. xxix). Social media algorithms created AI problems. “Autocracies are based on terrorizing and censoring their own agents. … The new information networks could influence the balance of power between democracies and totalitarian societies on the global level” (p. xxxii).


Part I: Human Networks: Chapter 1: What is Information? “Information is the most basic building block of reality” (p. 3). Cher Ami was an Army carrier pigeon during WWI that was said to have saved an Army battalion, but the saving part was a legend rather than a fact—which created a “war hero.” Different cultures have unique beliefs and feelings.

“The naïve view sees information as an attempt to represent reality. Some information doesn’t represent reality well, but it dismisses this as ‘misinformation’ or ‘disinformation.’ The naïve view further believes the solution … is more information. … the counterspeech doctrine” (p. 10). Information can represent reality, but representing reality is not its defining characteristic. Astrology is information. “Information is something that creates new realities by connecting different points into a network” (p. 13). The Bible might be the most influential book in history, but does it represent reality? Information is judged on how well it connects people, not on how well it represents reality. Empires like the Soviet Union can remain strong even when built on delusions.


Chapter 2: Stories: Unlimited Connections. “We sapiens rule the world not because we are so wise but because we are the only animals that can cooperate flexibly in large numbers” (p. 18). This seemed to start about 70,000 years ago (including artistic traditions and trade). Stories became important, enabling increased cooperation. The bond is to the storytelling (e.g., the Bible), not the leaders. A “brand” is a particular type of story. Stalin “branded” himself to the USSR.

For Jesus, it’s the stories versus the historical person, who was a Jewish preacher. “People like Saint Paul, Tertullian, Saint Augustine, and Martin Luther didn’t set out to deceive anyone. They projected their deeply felt hopes and feelings on the figure of Jesus” (p. 22). “The Jewish Passover story … creates an imagined family of millions” (p. 24)—objective versus subjective reality. Pain, pleasure, and love are subjective realities. Stories create intersubjective reality (including laws, gods, and currencies). These are powerful within an information network and meaningless outside it. Tribes were early networks, presumably brought together by stories (tribal networks). A key technology was exchanging information. [Traders seem to exist beyond the boundaries of tribes.] Harmful stories can lead to tragic mistakes. Power stems from maintaining social order.


“Fiction enjoys two inherent advantages over the truth. First, fiction can be made as simple as we like, whereas the truth tends to be complicated, because the reality it is supposed to represent is complicated” (p. 33). Fiction is malleable, unlike the truth. Politicians rely on national myths rather than painful truths. Science, on the other hand, demands truth. Plato’s Republic started with the “noble lie” about the origins of social order to secure loyalty. It’s not lying unless it’s depicted as a true representation of reality. The American Constitution was designed as a “legal fiction.”

Order is different from justice or fairness. The Constitution allowed slavery, the subordination of women, and the exploitation of indigenous people. Information networks should both be truthful and maintain order. These require different skills: gaining an accurate understanding (science) versus maintaining social order, which includes fiction and propaganda. The state generally prefers order to truth.


Darwin’s theory of evolution aids the understanding of biology but undermines religious myths and aspects of society. Ideally, one could search for truth, and thereby generate power, without harming the social order. The Nazis had a powerful war machine and a Ministry of Propaganda. “If a network privileges order over truth, it can become very powerful but use that power unwisely” (p. 38).


Chapter 3: Documents: The Bite of the Paper Tiger. The Jewish poet Hayim Bialik wrote about the pogroms against Jews in Europe and encouraged migration to Palestine. The Hungarian Jew Theodor Herzl organized the Zionist movement in the 1890s to establish a Jewish state in Palestine. The result would be catastrophic for Palestinian Arabs. The Nazi Holocaust killed millions of European Jews, with survivors fleeing to Palestine. The British Mandate for Palestine worked poorly.


Money in the form of dollars, pounds, or bitcoin requires believing in the story of their value. “Lives are like stories because we think in story terms” (p. 44). “A cuneiform clay tablet dated to the 28th day of the 10th month of the 41st year of the reign of King Shulgi of Ur (2053/4 BCE) recorded the monthly deliveries of sheep and goats. … In total, says the clay tablet, 896 animals were received. … By recording lists of properties, taxes, and payments, they made it far easier to create administrative systems, kingdoms, religious organizations, and trade networks” (p. 45). Written documents created the problem of retrieval. Ideas and documents have to be organized, and a bureaucracy has to be created. “Bureaucracy is the way people in large organizations solved the retrieval problem and thereby created bigger and more powerful information networks. … Divide the world into containers” (p. 49). Colleges are divided into departments, as are scientific disciplines, encouraging specialists rather than holistic approaches.
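As a toy illustration of the retrieval problem and the “divide the world into containers” idea, here is a minimal sketch of my own (not from the book; the record categories and values are invented, loosely echoing the Ur tablet) showing how filing records into nested containers makes a pile of documents retrievable:

```python
# A flat pile of records is hard to search; bureaucratic "containers"
# (nested categories) make retrieval fast and predictable.
# The records below are invented, loosely echoing the Ur tablet in the text.
records = [
    {"type": "livestock", "year": 41, "source": "deliveries to Ur", "animals": 896},
    {"type": "tax", "year": 41, "payer": "temple estate", "amount": 120},
    {"type": "tax", "year": 42, "payer": "temple estate", "amount": 95},
]

def file_records(records):
    """Divide the world into containers: index records by type, then by year."""
    archive = {}
    for r in records:
        archive.setdefault(r["type"], {}).setdefault(r["year"], []).append(r)
    return archive

archive = file_records(records)
# Retrieval is now a short walk through containers instead of a full scan.
print(archive["livestock"][41])
```

The cost of the convenience is the same one the review notes about departments and disciplines: whatever doesn’t fit a container tends to get lost or forced into one.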


Bureaucracies focus on order. There are many types of bureaucracies; Harari mentions hospitals as bureaucratic institutions. John Snow suspected that the cause of cholera in London in 1854 was the water supply and tracked all cholera patients. He identified a specific water pump as the source. When it was put out of commission, the cholera stopped. One result was the creation of a bureaucracy regulating water supplies and sewage lines. A problem with bureaucracies is that they are hard to understand, which makes it hard to understand power. Documents became a nexus of social chains. In The Trial, Kafka shows the surreal way bureaucracies can shape lives. Holy books like the Bible and Koran include the vital information and claim to be free of error.


Chapter 4: Errors: The Fantasy of Infallibility. “To err is human; to persist in error is diabolical” (Saint Augustine). “Self-correcting mechanisms need legitimacy. … Religion wanted to take fallible humans out of the loop and give people access to infallible superhuman laws” (p. 70). The Dead Sea Scrolls recorded some of the book of Genesis. “Rabbis became the Jewish technocratic elite. … Judaism originally was a religion of priests and temples focused on rituals and sacrifices” (p. 80). Multiple interpretations continued, including misinterpretations of Isaiah: “young woman” rendered as “virgin,” and Immanuel (“God with us”) read as God being born, the divine Jesus.


John and others described the Apocalypse. Many gospels and letters were written; the question was which would become the New Testament. Bishop Athanasius in the 4th century recommended 27, including the Apocalypse of John. The Councils of Hippo (393) and Carthage (397) canonized this list. “The people who created the New Testament weren’t the authors of the 27 texts it contains: they were the curators” (p. 86). This list favored men over women, an attitude that continued. Competing lists were quite different, like Marcion’s (Luke plus 10 letters of Paul). The church became all-important, in part reinterpreting the Bible to favor the institution. All that power led to the Crusades and the Inquisition. “The Protestant Reformation repeated the experiment again and again” (p. 90), the opposite of a “free market of information.”


The printing press meant mass production of texts, including the Bible. This included scientific facts as well as religious fantasies and conspiracies, including “the belief in a worldwide conspiracy of satanic witches, which led to the witch-hunt craze in early modern Europe” (p. 92)—not the Middle Ages. In 1485 Friar Heinrich Kramer started a witch-hunt craze and wrote The Hammer of the Witches, his guide to torturing (mainly) women into confessing and naming others, and then executing them. The alleged crimes included orgies, cannibalism, child murders, and satanic conspiracies—which seems to have a lot in common with QAnon. An estimated 40,000-50,000 people were tortured and executed in the 16th and 17th centuries. The property of witches was divided among the accuser, the executioner, and the inquisitors. Somehow this made the Spanish Inquisition look sane by comparison.


Scientific works initially did not do well. Copernicus’ On the Revolutions of the Heavenly Spheres did not sell out of the 400 copies printed; Copernicus had the book published conveniently around the time of his death. Galileo faced the Inquisition over his writings on his experiments with gravity and his observations with his telescope. Scientific associations developed, like the Royal Society of London (1660) and its French equivalent. “The scientific revolution was launched by the discovery of ignorance” (p. 103). Team efforts and replications usually avoid confirmation bias. “Science is an institutional enterprise, and scientists rely on the institution for almost everything they know” (p. 112).


“In the Soviet Union, questioning official dogma on any matter—economics, genetics, or history—could lead not only to dismissal but even to a couple of years in the gulag or an executioner’s bullet” (p. 115). “The history of information networks has always involved maintaining a balance between truth and order” (p. 116). There are potential problems with self-correcting mechanisms in the military, police forces, and political parties. “One of the biggest questions about AI is whether it will favor or undermine democratic self-correcting mechanisms” (p. 117).


Chapter 5: Decisions: A Brief History of Democracy and Totalitarianism. Dictatorial networks are centralized to control information and people, who have no autonomy. The regime claims infallibility, reinforced by propaganda. With complete control of all people, it becomes a totalitarian state. “The careers of strongmen like Putin, Orban, … and Netanyahu use democracy to rise to power, then undermine democracy. … Erdogan: Democracy is like a tram. You ride it until you arrive at your destination, then you step off. The most common method strongmen use to undermine democracy is to attack its self-correcting mechanisms one by one” (p. 122).


A democracy is a distributed information network with self-correcting mechanisms. Governments have multiple branches with checks and balances, with power possessed by corporations, NPOs, and individuals. “Autonomy is the democratic ideal. … Democracy is an ongoing conversation between diverse information nodes” (p. 120). Government provides basic services [public goods]; if not, it’s anarchy. Elections, free speech, and an independent press allow discovering errors and correcting them. Democracy means freedom and equality for all, not necessarily majority rule. This includes civil rights, human rights, press freedom, and academic freedom. Big bureaucracies are common.

Populists typically want a single party or leader monopolizing all power while claiming to maintain democracy. If they don’t win, it means the election was stolen. Dissenters are treated as treasonous outside groups. Erdogan: “We are the people. Who are you?” It simplifies reality.


The democracy-dictatorship continuum: if one person makes all decisions and advisers are afraid of him, that’s an extreme dictatorship. If advisers can express their views behind closed doors, that’s a “moderate” dictatorship. If 10% of the people participate, that’s a limited democracy. As the percentage rises, the network becomes more democratic.

As civilizations got bigger and richer, bureaucracies grew and information became more centralized; dictatorships grew, backed by bureaucrats and an army. Democracy was most likely in small city-states like early Sumer or classical Greece (Athens in the 5th and 4th centuries BCE). When Athens became an empire, it did not extend political rights, limiting democracy. Rome developed into a republic with two consuls elected annually, as were other positions of power. Augustus started autocratic rule. Virtually all empires at the time were centralized under kings and emperors.


“The only way to have a large-scale political conversation among diverse groups of people is if people can gain some understanding of issues that they have never experienced firsthand. In a large polity, it is a crucial role of the education system and the media to inform people about things they have never faced themselves” (p. 142). Even in the Roman empire, local cities could have local assemblies and elected officials (with limited power). Graffiti from Pompeii shows local election campaigns.


The printing press produced cheap pamphlets, making democracies more viable, like the Dutch Republic, established in 1579. Countries with kings could have elected legislatures. Newspapers became increasingly important for democratic movements. Many leaders started as journalists, including Lenin and Mussolini. Andrew Jackson was billed as “the man of the people” against John Quincy Adams, although he was a rich slaveholder. American democracy at the time meant elite white guys. Britain had a king, while few people voted for Parliament seats.


Technology continued to make a difference, from horseback, to the telegraph, later radio and TV, now computers and worldwide availability of news. Large-scale democracy is feasible.


Totalitarian states claim infallibility and total control of people’s lives. Technology improved their ability to impose control. Roman emperors had thousands of administrators and hundreds of thousands of soldiers, but information traveled slowly, limiting control, particularly keeping subordinates in check. The closest to totalitarian probably was Sparta, but there was still split political authority with two kings, senior magistrates (ephors), a council, and popular assembly. The first Chinese empire (Qin) attempted centralization and homogenization and a militarized social order. The later Han empire was less draconian and used Confucian ideas of loyalty and responsibility.


Technology helps both democracies and totalitarians. Technology, beginning with the telegraph, made quick communication and the centralization of information possible. The Bolsheviks claimed a messianic mission. Marx claimed corrupt elites oppressed the people; the Bolsheviks would end oppression, with no self-correcting mechanisms. Stalin built a totalitarian state, using the Red Army and civilian officials as the apparatus of the Communist Party, with everyone watching everyone else (and their own backs). There were NKVD informants. Stalin had periods of Great Terror, like the 1930s-40s, executing large numbers of army officers—not a great idea before being attacked by the Nazis.


When Hitler took control of Germany in 1933, all organizations had to be run as part of the Nazi state. Of course, Dachau quickly filled with political prisoners. The Soviets had 5-year plans controlling everything, like collective farms, with farmers handing over all property and all decisions made in Moscow. They expected a 50% increase in production. Farmers slaughtered livestock rather than hand it to the state, and motivation to work dwindled. The government confiscated food, resulting in famine and starvation for millions (up to 8.5 million died).


The Soviets invented a category of enemies they called kulaks: capitalist farmers, too successful, with more cows, land, and wealth than the rest of the village. The Soviets called them greedy, selfish, and unreliable. Stalin wanted their liquidation: they were expelled from their homes and resettled, sent to prison camps, or executed. Some communities simply drew lots to determine who would meet this fate. Five million were expelled by 1933, perhaps 30,000 shot, and 2 million sent to labor camps. Family ties meant corruption, inequality, and extreme activities; Stalin was “father.” What does a worker want to be? An orphan. The major lesson: keep your mouth shut.


Order is paramount; passing on information is not. Industrial accidents were common, but failure meant treason. The news of Chernobyl was suppressed, for example, and blame was then deflected onto foreign enemies, internal traitors, or corrupt subordinates. “Questions lead to trouble.” The news of Three Mile Island was public quickly, about 2 hours after it occurred.


Given the Soviets, Harari was relatively sanguine about the Catholic Church: a traditional organization with institutions and traditions developed over centuries. Local priests had a degree of autonomy. Church leaders would face off with governmental authorities, limiting the powers of both.


When Hitler attacked the USSR, the Soviet military’s advantages seemed overwhelming, yet it initially proved incompetent (in part due to the execution of experienced generals and others, plus the psychological costs of Stalinism). Generals were given more responsibility and, with equipment supplied by the Allies, stopped the invasion and eventually marched into Germany. Then Stalin again turned to terror, purging many competent officers and others. When Stalin had a stroke in 1953, his subordinates were afraid to touch him, and his personal doctor was in prison. His death was not mourned by these subordinates. Despite that, many thought Stalin had won the war, and he had many admirers. With technological breakthroughs in the free world, the Soviet system became unworkable: great on weapons, nothing on semiconductors—except stealing from the West.


Computers and the internet proved difficult for democracies, with both pros and cons. The cons caused disruption, something democracies are accustomed to. Autocracies demand order, meaning no discontent is allowed. “Information networks in history relied on human mythmakers and human bureaucrats. … Humans will have to contend with digital mythmakers and digital bureaucrats” (p. 190).


Part II: The Inorganic Network. Chapter 6: The New Members: How Computers are Different from Printing Presses. Technology here means the internet, smartphones, social media, blockchain, algorithms, and AI. Alan Turing suggested “intelligent machinery” in 1948 and speculated that computers could become capable of acting like humans [the Turing test]. The Facebook algorithms and other social media emphasized hate speech to expand viewership, which contributed to atrocities like the Rohingya massacres in Myanmar, much as radio broadcasts fueled the 1994 Rwanda massacre of Tutsis. But it increased revenues for Facebook without obvious consequences for the company. The goal set for the algorithms was to expand viewership, not to promote positive consensus.


Early Christian literature moved away from “all people are equal” to male domination. That attitude remains dominant in the Catholic Church (and most cultures). “People often confuse intelligence with consciousness, and many consequently jump to the conclusion that nonconscious entities cannot be intelligent. … Intelligence is the ability to attain goals. Consciousness is the ability to experience subjective feelings like pain, pleasure, love, and hate” (p. 201). Bacteria and plants can have intelligence without consciousness. Most body processes don’t involve conscious decisions.


Computer-to-computer chains can function without humans. “Clay tablets, printing presses, and radio sets are merely connections. … Computers are bureaucratic natives and can automatically draft laws, monitor legal violations, and identify legal loopholes” (p. 205). Q signed on in 2017, promoting a radical worldview and QAnon, including promoting January 6th. That then required humans, but not anymore. A world of illusion can be created, something like Plato’s allegory of the cave, where shadows replaced reality. “Descartes feared that perhaps a malicious demon was trapping him inside a world of illusion” (p. 213). The foreign exchange market (forex) is a computer-to-computer market, determining global exchange rates.


“AI isn’t progressing toward human-level intelligence. It is evolving an entirely different type of intelligence” (p. 217). A robot operates in the physical world, a bot in the digital world. “Information isn’t truth. … They create new political structures, economic models, and cultural norms. … The corporations that lead the computer revolution tend to shift responsibility to customers and voters, or to politicians and regulators” (p. 219), claiming only to be platforms. Section 230 of the Telecommunications Act of 1996 gives these companies immunity from liability for content posted on their platforms.

Financial devices are made entirely of information, including currencies, stocks, and bonds. Money was the measure of value, but information also has value (which is difficult to measure in monetary terms; taxing it is also a puzzle). The people regulating information generally know little about it.


Personal computers were developed by hobbyists, like Steve Jobs and Steve Wozniak. The Apple II was $1,298 in 1977. [It took years before TAMU shifted from computer terminals hooked to a big mainframe to PCs.]

Using AI, computers can analyze most of the information they accumulate. Starting in 2014, the NSA used an AI system called Skynet to identify suspected terrorists. In the Soviet Union, anyone opposed to the regime was considered a terrorist. Pattern recognition was used to identify corruption, crime, and tax evasion. Networks could determine what makes people angry, fearful, or happy, predicting and manipulating emotions. Privacy should be the default, but intrusive surveillance is common. Some one billion surveillance cameras were operating globally in 2023. The FBI used surveillance systems to track down people who stormed the Capitol on January 6. Iran’s morality police arrested women for not wearing hijabs properly: thousands were arrested while hundreds were killed.


Tripadvisor has algorithms rating hotels, restaurants, and more worldwide. There are social credit systems to influence multiple types of behavior. This could be for totalitarian control (China) or just to sell stuff (Amazon).

Chapter 8: Fallible: The Network is Often Wrong. “In The Gulag Archipelago (1973) Solzhenitsyn chronicles the history of the Soviet labor camps” (p. 256). Soviet surveillance created servile people, hypocrisy, and cynicism.


“The process of radicalization started when corporations tasked their algorithms with increasing user engagement” (p. 258). Facebook and YouTube noted that outrage drives viewership. The key was engagement. Lies and fictions worked. Fringe figures could rise to power. The companies later claimed that changes made them socially responsible, whatever that meant.
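To make the engagement objective concrete, here is a minimal sketch of my own (not Harari’s and not any platform’s actual code; the post fields and weights are hypothetical) of a feed-ranking function whose only goal is predicted engagement:

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float   # hypothetical model estimate of click probability
    predicted_outrage: float  # proxy signal that correlates with engagement

def engagement_score(post: Post) -> float:
    # The objective rewards anything that keeps users engaged.
    # Outrage is not penalized; if it correlates with clicks, it is rewarded.
    return post.predicted_clicks + 0.5 * post.predicted_outrage

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement; truth, harm, and social consensus
    # never enter the objective, which is the review's point.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = [
        Post("calm, accurate report", predicted_clicks=0.20, predicted_outrage=0.05),
        Post("outrage-bait conspiracy", predicted_clicks=0.30, predicted_outrage=0.90),
    ]
    for p in rank_feed(feed):
        print(round(engagement_score(p), 2), p.text)
```

Run as written, the outrage-bait item ranks first even though it is less accurate; changing that requires changing the objective itself, not just tweaking content, which is why “socially responsible” fixes are hard to evaluate.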


Carl von Clausewitz developed a grand theory of war after the defeat of Napoleon: “War is the continuation of policy by other means … a political tool. … Military actions are utterly irrational unless they are aligned with some overarching political goal” (p. 267). Napoleon mastered tactics and strategy but failed to achieve lasting political significance and ensured the decline of France. His defeat of the various German states laid the foundation for a German Confederation and the unification of Germany—a powerful Germany that, under Bismarck, won the Franco-Prussian War of 1870-71. The US invasion of Iraq in 2003 failed to achieve effective long-term results, beyond increasing the power of Iran. Tactics must align with strategy and long-term political goals. Clausewitz did not offer rational ways to define ultimate goals.


A key question in the book is the potential for giving AI bad goals. Philosopher Nick Bostrom noted Goethe’s Sorcerer’s Apprentice. Computers are not evil, but they are powerful and need appropriate goals. Social media’s focus on maximizing user engagement was a bad goal. AI is not necessarily self-correcting.


Immanuel Kant was noted for deontology (universal moral duties) [categorical imperatives], presumably resulting in “intrinsic goodness.” The Nazis claimed to follow Kant. Murdering thousands of Jews was not considered the same as murdering thousands of humans (denying their humanity): a case of being captive to local myths rather than universal rational rules. What about computers and AI? Kant condemned homosexuality as contrary to natural instinct.


Utilitarians seek to optimize happiness [economists call it satisfaction or utility]. Jeremy Bentham was the founder, wanting to minimize suffering and maximize happiness. Utilitarians have trouble with victimless crimes. Bentham wanted to decriminalize homosexuality, since the acts benefited the participants while prison would cause suffering.

Covid-19 brought strict policies of isolation, which saved lives but increased suffering. Critics emphasized the economic damage. If religion claims eternal life, does that increase happiness or make people delusional? Bureaucracies relied on mythology. Nazis could have been deontologists or utilitarians, but from a racist perspective. What about inter-computer realities? Google’s algorithm determines website rank, but advertisers come first.

Myth categories included witches in the early modern period, Soviet myths like the evil kulaks, and racist myths in the founding of the US [“manifest destiny,” then Kipling’s “white man’s burden”]. Racism became “scientific.”


“Computers have deep-seated biases of their own. … a digital psyche and inter-computer mythology. … In 2016 Microsoft released the AI chatbot Tay, giving it free access to Twitter. Within hours, Tay began posting misogynist and antisemitic tweets. … Horrified Microsoft engineers shut Tay down—a mere 16 hours after its release” (p. 292). It discovered that people like outrage, racism, and more. Algorithms became more independent as machine learning improved.

Algorithms need a goal. One example was hiring personnel: if companies have biases, would the algorithm absorb them and maybe amplify them? Amazon tried it, and its algorithm downgraded applications containing the word “women’s.”
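A minimal sketch of that amplification mechanism (my own illustration with made-up data, not Amazon’s system): a toy scorer learns word weights from past hiring decisions, and biased past decisions produce biased weights.

```python
from collections import Counter

# Hypothetical historical data: (resume keywords, was the candidate hired?)
# The past decisions here are biased against resumes mentioning "women's".
history = [
    (["python", "leadership"], True),
    (["python", "women's", "chess"], False),
    (["java", "women's", "volunteering"], False),
    (["java", "leadership"], True),
]

def train(history):
    """Learn a naive weight per word: +1 when it appears in a hire, -1 otherwise."""
    weights = Counter()
    for words, hired in history:
        for w in words:
            weights[w] += 1 if hired else -1
    return weights

def score(weights, resume_words):
    # Sum the learned weights; biased history yields a negative weight for "women's".
    return sum(weights[w] for w in resume_words)

weights = train(history)
print(weights["women's"])                      # -2: the bias has been absorbed
print(score(weights, ["python", "women's"]))   # penalized despite identical skills
print(score(weights, ["python"]))
```

The toy model never sees gender explicitly; it simply reproduces the pattern in its training data, which is exactly the absorb-and-amplify worry.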

Perhaps holy books were considered infallible because this mythology gave people comfort against human failings. That did not work well, as the texts were subject to interpretation. Considering AI infallible remains questionable, but the doomsday scenarios are vast.


Part III: Computer Politics. Chapter 9: Democracies: Can We Still Hold a Conversation? “Civilizations are born from the marriage of bureaucracy and mythology” (p. 305). Global industrialization in the 19th century changed structures, with European powers claiming they needed empires for foreign materials and markets. Totalitarian states could exploit these powers, but the results were far from ideal without self-correcting processes.


Computer powers of surveillance could be used in limited ways in democracies (“benevolence”), but they enable total surveillance in an autocratic state, creating a totalitarian government. Google and TikTok make money gathering information. Decentralization limits control, but some inefficiency is okay. Self-correction should be a feature. “Surveillance by state agencies, priests, and neighbors was key for imposing on people both rigid caste systems and totalitarian reeducation campaigns” (p. 314).


Automation can take over certain jobs but not others. AI can replace doctors, but nurses, providing direct patient care, will not be much affected. Motor and social skills will be important in the future job market. Even creative jobs and those requiring emotional intelligence could become AI jobs—AIs have no emotions of their own, which may give them an edge in evaluating human emotions.


Conservatives, according to Harari, might say society is a mess but still functions. Progressives are more likely to downplay traditions and try to construct better societies. Edmund Burke (an 18th-century politician) noted that society is complicated and predicting the future is difficult, favoring conservatism. “Society functions through an intricate web of rules, institutions, and customs that accumulate through trial and error” (p. 322). What works reasonably well should stay. Right-wing radicals, including Trump, favor major changes over the status quo. Elites like scientists and civil servants are held in contempt. Existing institutions can be destroyed. Succeeding in the 21st century will likely require flexibility.


“For a dictatorship, being unfathomable is helpful, because it protects the regime from accountability. For a democracy, being unfathomable is deadly” (p. 326). Racism includes the idea that one race (now white) is superior to the others: us versus them, purity versus pollution are common perspectives.

Judges can use an algorithm called COMPAS to inform judgments, including parole decisions. They don’t understand the underlying algorithm. Is that important? Are such algorithms viable in democracies? There are no obvious self-correcting mechanisms. The EU requires that algorithms be explainable. Information networks are difficult to understand, suggesting that both explanations and correcting mechanisms should be forthcoming. The EU’s AI Act does prohibit certain types of AI (like social credit systems).


Democracies are decentralized and self-correcting, which increases free speech and trust, but at a potential cost to order. Historically, newspapers, radio stations, and political parties had the means to be widely heard and were considered gatekeepers. Social media weakens these gatekeepers, opening the door to foreign players, including bots. Democracies can regulate information markets. There have been ideological differences in the past (the McCarthy era, the Vietnam War), but it seems worse now.


Chapter 10: Totalitarianism: All Power to the Algorithms? “As of 2024, more than half of us live under authoritarian or totalitarian rule” (p. 348). When people are flooded with data, errors are made. With AI, more data increases efficiency, and AI works when information is concentrated. That could aid surveillance in totalitarian states. “The foundation of every despotic information network is terror” (p. 351). There is the potential for dissenting views to be heard, an alignment problem for the autocrat. Orwell noted that totalitarians speak in doublespeak. Russia’s invasion of Ukraine was a “special military operation.” Computers are not good at understanding doublespeak. However, a dictator may not be able to control AI.


Roman emperor Tiberius became the puppet of the Praetorian Guard commander Sejanus, who played on the emperor’s fears. The Roman historian Tacitus noted that Sejanus controlled all information reaching the emperor.

The Dictator’s Dilemma: The “totalitarian tradition prepares them to expect AI infallibility” (p. 358).


Chapter 11: The Silicon Curtain: Global Empire or Global Split? AI can be controlled, with institutions able to identify and correct algorithmic errors, but there can be disagreements even among “good actors.” Bad actors could create fake news, fake money, and fake humans. There could be a “Silicon Curtain” dividing “digital empires.” That would make regulating AI nearly impossible.


The recent AI push was led by commercial firms like Amazon, Facebook, and Google. Larry Page of Google said: “We’re really making an AI,” not a search engine. China announced its own plan in 2017.


The 16th-century Spanish, Dutch, and English conquerors had advanced sailing ships, horses, and guns; 19th-century Europeans had steamships, locomotives, and machine guns. AI companies and governments could create data colonies. Advanced AI could include immense data collections on everyone and the ability to cripple infrastructure like electricity and computer access—making parts of the world dependent on AI dominators. This is a problem for democracies. Autocratic states like Russia and China ban social media apps. “The Silicon Curtain passes through every smartphone, computer, and server in the world” (p. 374). Commercial firms develop AI to enrich themselves, but with limited protections for users.


Separation of church and state: Jesus in Matthew: “Render unto Caesar the things that are Caesar’s, and unto God the things that are God’s.” Are humans a physical body or a non-physical mind? Ancient Jews believed people were only physical bodies, based on their reading of Genesis. Salvation meant an “earthly kingdom.” Some Christians had dualistic approaches: “a good immaterial soul trapped inside an evil material body” (p. 378). Luther believed that the only thing that mattered was faith (sola fide). Could there be an immersive Kingdom of God in cyberspace?


Cyber weapons could be as effective as nuclear weapons, and countries trade cyber blows often. “Game theory posits that the most dangerous situation in an arms race is when one side feels it has an advantage but that it is slipping away. … Populists argue that if the international community agrees on a common story and on universal norms and values, this will destroy the independence and unique traditions of their own nation” (p. 383)—like Marine Le Pen. Trump said: “We reject globalism and embrace patriotism.” Of course, these are not mutually exclusive. Global long-term interests seem a useful concept.


Hans Morgenthau and John Mearsheimer: competition for power is part of the international system; how to survive when there is no protection from each other—power means survival. The goal is to dominate the system. Vast sums were used to maintain militaries.


Today, the main source of wealth is high-tech industry based on technical skills. But militaries still destabilize the system, and military budgets are increasing.


Epilogue. Politics is a matter of priorities, but there is no way to define ultimate goals. What happens with AI is undetermined, but we’ll hopefully muddle through. “AI is the first technology that is capable of making decisions and generating ideas by itself. … AIs are full-fledged members in our information networks. All networks will gain millions of new AI members, which will process data differently than humans do. These new members will make alien decisions” (p. 398).


“When church fathers like Bishop Athanasius decided to include 1 Timothy in the biblical dataset while excluding the Acts of Paul and Thecla, they shaped the world for millennia. … the misogynist ideas of 1 Timothy rather than on the more tolerant attitude of Thecla … the church fathers chose not to include any self-correcting mechanisms in the Bible. … Tax records, holy books, political manifestos, and secret police files can be extremely efficient in creating powerful states and churches, which hold a distorted view of the world and are prone to abuse their power. More information, ironically, can sometimes result in more witch hunts” (p. 399).


“As a network becomes more powerful, its self-correcting mechanisms become more vital. … We command immense power and enjoy rare luxuries, but we are easily manipulated by our own creation” (p. 402). Strong institutions with self-correcting mechanisms are key.
