Rachel Botsman looks at whether technology can restore trust in the government and companies. (Jabin Botsford/The Washington Post)

Rebecca MacKinnon is director of the Ranking Digital Rights project at New America and author of "Consent of the Networked: The Worldwide Struggle for Internet Freedom." From 1998 to 2004, she was a CNN bureau chief, first in Beijing and then in Tokyo.

"Without trust," writes Rachel Botsman, "society cannot survive, and it certainly cannot thrive."

Clearly, we are in trouble. Two-thirds of people surveyed last year in 28 countries for the 2017 Edelman Trust Barometer expressed low levels of trust in "mainstream institutions" of business, government, media and nongovernmental organizations. The national and geopolitical implications of this trust collapse are massive and underpin — among other things — the rise of extremism and demagoguery of all stripes.

In "Who Can You Trust?" Botsman, an Oxford lecturer and corporate consultant, offers a timely and accessible framework for understanding what trust is, how it works, why it matters and how it is evolving. It is an important primer to the obstacles and opportunities we face as a society if we are to repair and redefine trust across socioeconomic, political and cultural divides. The stakes are high. We all have a role to play in reforming or creating new mechanisms necessary for accountable governance and a just global economy.

Throughout human history, trust has evolved in three basic stages. Local trust was enough when people lived in small communities and everybody knew everybody else. Industrialization and urbanization required institutional trust, so that people could rely on complete strangers running governments, corporations, legal institutions, and the global frameworks and norms for international trade, commerce and finance. We are now living through a massive global devolution of trust from institutions to individuals: distributed trust facilitated by high-tech platforms, many (though not all) of which are run by the private sector.


“Who Can You Trust?” by Rachel Botsman (PublicAffairs)

This shift from institutional trust to distributed trust, from experts to individuals, is caused by three main factors. First, accountability is unequal. Stark proof of this was brought to light by coordinated investigative reports published by news organizations across the world based on leaked Panama Papers banking data in 2016: Rich, powerful and well-connected individuals have been able to amass vast quantities of often undocumented wealth by circumventing tax and anti-bribery laws (among many other types of laws), while ordinary people are likely to be caught and punished for lawbreaking. Second, elites and authority have lost credibility in the Internet age as hierarchies have flattened. People in power are no longer seen to deserve greater respect as the details of their lives are exposed. Internet-empowered leakers and investigative journalists uncover abuses of power that erode public faith in political leaders, financial institutions and major corporations, among others. Yet at the same time, our trust in the news media has fallen dramatically since its high in the 1970s. Third, segregated echo chambers have proliferated as social media makes it easy for us to share news and opinion directly among like-minded circles of friends and avoid facts or arguments that might challenge our worldviews.

As a result, Botsman writes, "we stand on the threshold of a chaotic and confusing period." Rather than mourn the loss of old mechanisms and cling to their broken remnants, we must build "a new era of hyper individual accountability . . . a new kind of vigilance and decision-making."

Botsman does not prescribe how we accomplish that. But if the old ways of bestowing and revoking trust, such as voting, markets and consumer choice, are no longer functioning, then we must overhaul or replace them. Systems must be "driven democratically and rationally," become more "transparent, inclusive, and accountable" and, most important, be designed to "put people first," which profit-driven platforms (and the governments that in theory regulate them) have failed to do sufficiently.

Billions of people across the globe have quickly entrusted sensitive personal information — in some cases even their lives — to platforms such as Facebook, Uber and Airbnb in exchange for the convenience, efficiency, cost savings, gratification and even empowerment that they offer. Unfortunately, we have delegated trust without first thinking clearly about what we will do when things go wrong: whom we will hold accountable and how we will do so. Uber has been exposed for (among other things) deprioritizing passenger safety and driver background checks, and misusing the vast amounts of data collected about drivers and users. Facebook has treated users as guinea pigs for experiments around their emotional sensitivities and allowed advertisers to target people with specific messages designed for maximum manipulation — political as well as commercial.

Tech executives are responding to the trust crisis mainly with promises of more and better technology: More powerful artificial intelligence will help identify and root out malicious behavior on their platforms. Ride-sharing companies are investing heavily in the development of self-driving cars as a solution to the untrustworthiness and unpredictability of human drivers. But Botsman warns that the responsibility for ensuring that the robots being deployed are trustworthy lies with the human beings who design and deploy them. We have not thought through how we hold those people accountable, let alone their robots. She warns against a natural tendency "to become over-reliant on machines." Ideally machines should be programmed to "understand" their own limitations and even seek human help or intervention.

A growing number of businesses, policymakers, entrepreneurs and even techno-activists hope that new trust mechanisms can be established through the use of exciting new technologies such as the blockchain. In essence, blockchains are digital public ledgers of transactions that cannot be quietly altered, thereby creating greater transparency and accountability — as well as transactional efficiency — and making corruption much harder. The technology for validating each "block" or record in the chain makes it practically impossible to alter one without recomputing all subsequent blocks. Blockchains are starting to be deployed to track items throughout business supply chains, establish records of property ownership, or most famously serve as ledgers for financial transactions — of virtual currencies such as bitcoin but also of more traditional banking transactions, eliminating intermediaries needed to verify sources and recipients. This design has raised hopes that the technology will boost trust by making foul play, theft and manipulation more difficult to hide.
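The tamper-evidence described here comes from each block incorporating a cryptographic hash of its predecessor. A minimal Python sketch (a toy illustration only — real blockchains add consensus, signatures and distribution across many machines) shows why altering one record invalidates every block that follows it:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents, including the previous block's hash,
    # so every record is cryptographically linked to its predecessor.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transaction):
    # Link the new block to the hash of the last block (or zeros for the first).
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"transaction": transaction, "prev_hash": prev_hash}
    block["hash"] = block_hash({"transaction": transaction, "prev_hash": prev_hash})
    chain.append(block)

def is_valid(chain):
    # The chain is valid only if every block's stored hash matches its
    # contents and points at the hash actually stored in the prior block.
    for i, block in enumerate(chain):
        expected = block_hash({"transaction": block["transaction"],
                               "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(is_valid(ledger))   # True

# Tampering with an early record breaks every later link.
ledger[0]["transaction"]["amount"] = 500
print(is_valid(ledger))   # False
```

The names and transaction fields here are invented for illustration; the point is only the structure: because each hash covers the previous hash, rewriting history requires redoing all subsequent work, which is what makes foul play hard to hide.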

However, Botsman warns that the blockchain is no panacea for human trust, any more than the Internet has fulfilled its inventors' dreams as the force that would obliterate authoritarianism with global free flows of information. Whether blockchain systems lead to more accountable governance and a more just global economy will depend on their design and the intentions of those who build them. There is no app for fixing trust.

It is difficult to write about this subject either here in the United States or in Britain, where Botsman lives, without taking sides in the raging political battles and controversies of our day. She manages not to go there, which is both a strength (more people from different backgrounds will read it) and a weakness of the book. A number of other recent books call for regulation and stronger government as the main answer to over-concentration of power by technology platforms. But with trust in government at historic lows and politics across the democratic world polarized to the point of dysfunction, Botsman does not address questions of how democracies might reform political processes and institutions to restore trust in democratic governance in particular or democracy in general.

Meanwhile, the world's most powerful authoritarian state is working seamlessly with corporate platforms to develop a system of "social credit" aimed at improving trust — and discouraging bad behavior — across digitally powered platforms that mediate communications and transactions of all kinds. Botsman describes in a chilling chapter how in China, users of major social media and commercial platforms are now given a social credit rating. The score not only aggregates such things as whether you have paid your credit card bills or mortgage, or have had "undesirable" or "illegal" content (as defined by the government) deleted from China's homegrown social media platforms, but it also incorporates information such as arrest records and location data from mobile devices. High scores bring perks like discounts, lower interest rates, shorter lines at airports and faster processing of passports for overseas travel. Low scores can result in a person being barred from certain professions (including journalism and education) and even from the right to travel overseas. The system, now experimental, will be fully implemented by the Chinese government and participating corporations by 2020.

That may sound fantastical to citizens of Western democracies, but Botsman warns, "Today China, tomorrow a place near you." Already companies across the democratic world are creating systems through which individuals are rated for various behaviors and characteristics. Some are used for social and commercial purposes, such as helping people without a credit history get loans. Others are used by government agencies including law enforcement, border authorities and courts to predict who might pose a greater threat to society. It is vital that we establish ways to make sure that online ratings and data are used "responsibly and with our permission." It is unacceptable for algorithms and scoring mechanisms to be opaque and closed to scrutiny.

Again, regulation — data protection laws and transparency requirements in particular — might help here, if citizens of the United States, home to the world's most powerful Internet platforms, actually trusted the officials who make, interpret, implement and enforce such laws. While Botsman insists that "we do have choices and rights we need to be exerting now," she stops short of calling for citizens to take matters into their own hands and drive reform movements that can impose real consequences on corporations, politicians, and other actors such as media and private foundations that are propping up the status quo.

"Who Can You Trust?" does make a clear case for why it is unquestionably in the long-term interest of companies, governments and other institutions to be much more transparent and subject themselves to new mechanisms that can credibly hold them accountable. It is the only way they can hope to earn and maintain trust in the future. But are they capable of voluntarily sharing and ceding power to the public in exchange for our trust? Historically, power has rarely been ceded without violence, or a credible threat of violence, or other massive crises with serious human costs.

One can only hope that citizens can make today's power-holders across all sectors see the light before it's too late. We face a daunting challenge: navigating a massive "trust shift" without the wars and violent revolutions that accompanied the last major shift, into the industrial age.

Who Can You Trust?
How Technology Brought Us Together and Why It Might Drive Us Apart

By Rachel Botsman

PublicAffairs. 322 pp.