“There are lots of corporations and governments that are at this moment busy trying to hack us. And the only way to stay in the game is to get to know ourselves better.”
Yuval Noah Harari uses his new book, "21 Lessons for the 21st Century," to do two things: marvel at the technological advancements disrupting our lives, such as artificial intelligence and the algorithms that power it, and warn about the dangers they pose in every facet of life. "The moment they know you better than you know yourself, then they can play on you, on your emotional system, on your biochemical system," Harari told me. "And you will never know because you will just identify with whatever noise they produce when they press this button or that button."
"The three biggest problems humanity now faces are nuclear war, climate change and technological disruption," Harari said. "Even if we somehow manage to prevent nuclear war and climate change, AI and biotechnology are still going to completely disrupt the job market, the political system and our own bodies and minds." This gets right at the heart of a paragraph in the introduction of "21 Lessons":
The merger of infotech and biotech might soon push billions of humans out of the job market and undermine both liberty and equality. Big Data algorithms might create digital dictatorships in which all power is concentrated in the hands of a tiny elite while most people suffer not from exploitation but from something far worse—irrelevance.
I discussed the merger of infotech and biotech at length in my previous book Homo Deus. But whereas that book focused on the long-term prospects—taking the perspective of centuries and even millennia—this book concentrates on the more immediate social, economic, and political crises. My interest here is less in the eventual creation of inorganic life and more in the threat to the welfare state and to particular institutions such as the European Union.
Our conversation goes more into the perils of AI, including who might own the data we produce and at what point the algorithms take over decision-making from humans. “Whoever controls these algorithms will be the real government,” Harari said. “It doesn’t matter if you call it a corporation, or you call it a government by any other name.”
We also get into a discussion of nationalism, which Harari said is “one of the least natural things for homo sapiens.” We talked about religion. “Lots of people go around with this ridiculous idea that all of morality comes from the Bible and the Ten Commandments, and, thank God, it doesn’t.” And we talked about the excesses and self-centeredness of both. “Everybody thinks that history revolves around them,” Harari told me, “that without them, the whole humankind will be living in some dark ignorance and in chaos and whatever.”
Listen to the podcast to hear more of my provocative conversation with Harari, including his belief that truth and power are not necessarily compatible. “Truth and power can go together only so far,” he said. “At one point or another they will have to split to go in different directions. If you really want to know the truth, at some point you will have to give up power. And if you really want to gain power at some point, you will have to give up truth.”
Follow Jonathan on Twitter: @Capehartj
Subscribe to Cape Up, Jonathan Capehart’s weekly podcast