“I said, ‘This was definitely written by someone who had a slant in mind,’ ” said the 36-year-old physicist, who goes by the name XOR’easter on the online encyclopedia and spoke on the condition of anonymity to avoid harassment. “I had to get in there and clean it out like a garbage disposal. Sometimes you just have to muck around.”
XOR’easter’s behind-the-scenes work is emblematic of the way information is shaped, colored and delivered in the current moment, when online snippets — true or not — are critical in influencing Americans’ views of fast-moving events. XOR’easter says he has no connection to the Biden campaign and that his sole motive is protecting the truth.
Hunter Biden served nearly five years on the board of Burisma, Ukraine’s largest private gas company, which was investigated by Ukrainian prosecutors. He was not accused of any wrongdoing in the probe, which then went dormant.
Joe Biden later pressured Ukraine to fire its top prosecutor, Viktor Shokin, who, in the judgment of the United States and other Western countries, was not sufficiently pursuing corruption in Ukraine.
The task of elucidating these details on Hunter Biden’s Wikipedia page — a political and epistemic minefield — has been shouldered by the physicist-editor. He began work on the entry shortly after news broke of the allegations, confirmed by a rough transcript released Wednesday, that Trump had brought up the former vice president’s son in a phone call with Ukraine’s president.
The page has been viewed nearly 230,000 times in the past 30 days, more than the page for Nancy Pelosi, the speaker of the House, or for Vice President Pence. Wikipedia dominates Google’s search results and helps supply the information spit out by Siri and Amazon Alexa.
Unlike social media platforms, however, the crowdsourced encyclopedia is governed by strict standards and style guides, which are enforced by volunteers allowed to remain anonymous.
“Wikipedia, demanding citation in reliable sources, is very resistant to fake news,” said Mike Dickison, a museum curator, zoologist and New Zealand’s first “Wikipedian-at-large,” meaning he recruits editors to add information about lesser-known topics.
Wikipedia has no need to goose page views for advertisers because there are none. There is no fear of alienating one side of the political spectrum because the platform has no mandate other than to inform. Any financial motive is marginal because the Wikimedia Foundation is a nonprofit entity.
XOR’easter began editing pages more than two years ago, and he spends an hour or two each day combing through the capacious online space. He set to work revising the information about Hunter Biden on Friday night after seeing an image of the page on Twitter.
The entry immediately set off alarms, he said, because of insinuations and “fly-by-night sources.”
In place of the pro-Trump citations, he added PolitiFact, Bloomberg and The Washington Post. Then he contacted an administrator and urged him to keep an eye on the Hunter Biden page in case it became a magnet for trolling.
He saw that a lengthy quote from Peter Schweizer, a senior editor-at-large at Breitbart News, had been restored after another user had removed it, so he axed it a second time. Nothing stops partisans from making new changes after someone like XOR’easter has gone through an entry.
Over several days, the academic physicist said, he helped take the page from a D-plus to a B-minus. “It’s hard because we’re tracking a moving target in some ways,” he said, noting that facts are still being unearthed.
Meanwhile, other users began exercising their own editorial oversight. Eliminating details about Biden’s relationships, one user wrote, “Cut it out with this tabloid stuff.” The information was later restored by another anonymous user, who deemed the detail a “non-salacious reference to his relationship with his sister-in-law.” (Hunter Biden at one point was romantically involved with his late brother Beau’s widow.)
Wikipedia’s rules of engagement have gradually accreted over the years. The guidelines are most stringent for living people, governed by three main principles: neutral point of view, verifiability and no original research.
Bots are employed to guard against basic disruption, and the automated software is responsible for as many as one-third of the edits to the site globally, and many more to its underlying data. A “recent changes patrol,” or RC patrol, is composed of individual users, who watch for more subtle intrusions and hash out disagreements about edits on a page’s “talk” section. Administrators oversee the process.
The setup has faced high-profile tests before. In the spring of 2018, anonymous editors detected suspect activity on a page for Maria Butina, a Russian woman accused of running a covert operation to gain influence with American conservatives. Some of the activity — an attempt to excise unflattering information — was traced to the university in Washington where Butina had been studying. The information was restored.
“It’s remarkable what Wikipedia editors get done, given that it’s a huge database and there are not that many of them,” said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University School of Law.
He compared that rigorous fact-checking to the limited oversight conducted by giant technology companies such as Facebook, which on Tuesday reaffirmed that it would not fact-check speech by politicians on its platform.
Nick Clegg, the company’s vice president of global affairs and communications, said it was not the platform’s role “to referee political debates.”
XOR’easter doesn’t see himself as a referee, either. His work is simpler, he said. Between routine tasks, or sometimes over a sandwich, he checks in on his “watch list” of pages that interest him, or on which he has previously worked.
Then he does what critics say has eluded multibillion-dollar technology companies: “adhering to a minimum standard of scientific, historical or journalistic respectability.”