What if the Web Looked More Like Wikipedia?

Posted by Rina Latuperissa


My first encounter with Wikipedia came in the form of an admonishment: a teacher’s warning that we shouldn’t trust anything we read on the site, because anybody could write on it and anybody could edit it. Of course, the first thing any teenager does when they’re told not to do something is exactly that thing, and so began a lifelong fascination with Wikipedia. (Much to the chagrin of my grandfather, who, around the same time, had gifted me the entire Encyclopedia Britannica on CD-ROM; I’m fairly confident those discs never spun, just as many sets of the printed version, given with the best of intentions, were never opened.)

More intriguing than Wikipedia itself was, and remains, the idea at its core: that the Internet can be a place not just for communication and entertainment, but also for collaboration and truth-seeking. It has rightfully been hailed many times as the pinnacle achievement of the philosophy of the “open web,” which has many definitions, but to me simply means: you can do almost anything here, together, without corporate influence. Today, it’s the open web’s last stand: Apple, Google and Amazon’s decision to ban the right-wing darling social media app Parler from their respective platforms and services was absolutely the right choice, but it also made abundantly clear that the days of hoping for a truly open web have long since passed.

Yet Wikipedia, which celebrates its 20th birthday on Jan. 15, lives on—and it’s not just surviving, but thriving. It’s the fifth most popular website among U.S. internet users, according to web tracking firm Semrush, with more than 15 billion visits every month, underscoring its evolution from distrusted upstart to one of the most dependable sites of its size on the Internet. Yes, it has its problems—generations of philosophers would scoff at the notion that any single article could possibly represent the entire “truth” of a thing, and it endures constant attacks from vandals attempting to change the historical record. But those assaults are generally sorted out quickly.

In part, Wikipedia is trusted because it’s open about what former Defense Secretary Donald Rumsfeld famously called “known unknowns.” On Facebook, Twitter or other social media sites, users often present fiction as fact, and uncertainty as confidence. Wikipedia, conversely, is up front about what it doesn’t know. “Our editing community does a phenomenal job being very transparent about what is known and unknown,” Katherine Maher, executive director of the Wikimedia Foundation, Wikipedia’s parent organization, told me in mid-December. “You see that in all breaking news articles, where they’ve got a little tag at the top that says, ‘this event is happening in real time, and some information may be changing rapidly.’ And it’s really a flag, it’s a warning to say ‘we don’t know all the facts here.’” That spirit, says Maher, permeates the site and its design. “You see this when Wikipedia says ‘this content is disputed,’ or ‘this article may not be neutral,’ or how it … presents different sides of controversy, so that the reader themselves has the opportunity to say, ‘now that I’ve reviewed this information, what’s the determination that I want to make?’”


Wikipedia was already on my mind heading into January after my conversation with Maher. But I really glommed onto it while grappling with last week’s horrific attack on American democracy. The episode can be understood through many valid lenses, including those of racism, anti-Semitism and neo-fascism, given the presence of Confederate battle flags and neo-Nazi regalia among the rioters. But it’s also the most tangible real-world result yet of our current epistemological crisis (following other disturbing episodes, like Pizzagate). Many of the participants in the attempted government coup believe that last year’s presidential election was stolen, thereby justifying their actions—even though that’s a lie ginned up by the President and his allies and repeated verbatim on social media platforms like Facebook, Twitter and YouTube (plus certain cable news channels, of course).

You’d be hard pressed, however, to find any of that bunk on Wikipedia; what it has are clear-eyed articles discussing the events as historical phenomena. That got me wondering: why does Wikipedia seem to have a general immunity to bullshit? And can we confer that immunity onto other social media platforms, as if receiving a vaccine?

Wikipedia’s most obvious answer to our crisis of truth is moderation. When the social media platforms began to crack down on President Trump and various conspiracy theories after last week’s attack, they were exercising their immense moderation power in a way they’ve been reluctant to do until now. (That change might have something to do with the fact that the end of Trump’s tenure is just days away). But Wikipedia has had a culture of moderation since day one, with contributors and editors—all volunteers—going back and forth on articles until they agree on some form of truth. (In fact, the more contested a given Wikipedia article is, the more accurate its contents are likely to be, says Maher).

Social media platforms hire moderators, but their focus is generally on the psychologically traumatizing work of removing illicit content like child pornography, rather than on Wikipedia-style debates sifting fact from fiction. So the work of countering falsehoods on social media falls to journalists, academics and other experts.


As valiant as those efforts are, they’re doomed to fail. As I spoke with Maher, it became clear just how focused Wikipedia is on giving its moderators not just the power to do their jobs, but the tools, too. That includes long-standing features like footnotes, citations and changelogs, none of which are available to those attempting to correct falsehoods on social media platforms. Maher also made clear that Wikipedia’s product roadmap is built around truth. For example, she says longtime editors who have earned the community’s trust now have access to technology that can identify and block or reverse vandalism by sophisticated attackers who use multiple Internet Protocol (IP) addresses to make it seem like multiple users are agreeing on a given change, thereby faking community consensus.

“This kind of—we call it P.O.V. pushing—is not just around misinformation, it can also be around whitewashing the articles of politicians, or it can be a state-sponsored initiative in order to do reputation management or political point of view pushing,” says Maher. “So we’ve worked with our communities to think about what tools they need to be able to more rapidly address these sorts of issues. The same tools that we use for this are the tools that we use for spam fighting. They’re the same tools that we use for identifying incidents of harassment.”
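
To make the idea concrete, here is a minimal, hypothetical sketch of the general pattern Maher describes: flagging a “consensus” in which many accounts endorse the same change but their edits trace back to only a handful of IP ranges. Nothing here reflects Wikipedia’s actual tooling; the Edit data model, the /24-prefix grouping and the thresholds are illustrative assumptions only.

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Edit:
        user: str      # account name endorsing the change (hypothetical model)
        ip: str        # IPv4 address the edit came from
        revision: str  # identifier of the contested change

    def flag_fake_consensus(edits, min_supporters=4, max_prefixes=2):
        """Flag revisions where many accounts agree but share few /24 prefixes."""
        by_revision = defaultdict(list)
        for edit in edits:
            by_revision[edit.revision].append(edit)
        flagged = []
        for revision, group in by_revision.items():
            users = {e.user for e in group}
            # Collapse each IP to its /24 prefix: "203.0.113.7" -> "203.0.113"
            prefixes = {e.ip.rsplit(".", 1)[0] for e in group}
            if len(users) >= min_supporters and len(prefixes) <= max_prefixes:
                flagged.append(revision)
        return flagged

    if __name__ == "__main__":
        edits = [
            Edit("AliceA", "203.0.113.5", "rev42"),
            Edit("BobB", "203.0.113.9", "rev42"),
            Edit("CarolC", "203.0.113.2", "rev42"),
            Edit("DaveD", "203.0.113.7", "rev42"),
            Edit("Erin", "198.51.100.1", "rev77"),
        ]
        print(flag_fake_consensus(edits))  # ['rev42']

In practice, a signal like this would only ever be one input among many for human moderators, who weigh context that no simple heuristic can capture.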

Put another way, Wikipedia can offer the truth because it was (and is still being) built for the truth. It attracts truth-minded volunteers by offering them the tools they need to do their job, a stark contrast to social media sites, where pretty much every new feature (messages! snaps! stories!) is designed to goose engagement, and often makes bullshit easier to spread and harder to check. They were built so users could share banal life updates or pictures, their founders never anticipating that their products would one day contribute to an attempted subversion of American democracy. Their user interfaces aren’t meant for truth, and the business models built up around those interfaces, based on hyper-targeted advertising and maximal engagement, can’t possibly accommodate it. Crack down on all the super-engaging bullshit, and your profits go out the window.

“Unlike social media platforms, [Wikipedia] editors don’t fight for engagement—the incentives that push content to the extremes on other platforms simply don’t exist on Wikipedia,” say Miles McCain and Sean Gallagher, students performing research with the Stanford Internet Observatory who jointly responded to my questions. “After all, Wikipedia has no incentive to maximize engagement: it’s a non-profit, and not ad-supported.” Social media executives, meanwhile, have long argued that they mostly shouldn’t be held accountable for the content posted on their platforms, and that policing that content opens a potentially dangerous Pandora’s box. “Having to take these actions fragment the public conversation,” said Twitter CEO Jack Dorsey on Wednesday night, reflecting on his company’s decision to ban Trump. “They divide us. They limit the potential for clarification, redemption, and learning. And sets a precedent I feel is dangerous: the power an individual or corporation has over a part of the global public conversation.”


As popular as Wikipedia is, its survival is not a given. Like all encyclopedias, it’s dependent on the existence of high-quality primary sources, including journalism—and my industry, for the most part, is not in great shape, especially at the local level. This isn’t just a question of the financial solvency of media outlets. If people don’t trust the sources upon which Wikipedia is based, why should they trust Wikipedia? Maher says the Wikimedia Foundation is cognizant of this, and is working on ways to support the knowledge ecosystem upon which Wikipedia relies.

“It does no good for people to highly trust Wikipedia, but then not trust the institutions of the free press or to trust academic inquiry or scientific research,” she says. “All of these things need to have an underlying public confidence in order for us to be able to face them and for people to then have confidence in what’s on Wikipedia. So we are looking at this as a broader ecosystem question and saying, ‘where are the ways in which we sit maybe as a point of introduction to knowledge, but then also, what are our obligations for how we think about how to support this?’”

If there’s a lesson to be learned from Wikipedia’s continued success, it’s this: build people the tools to effectively call out bullshit, and, like baseball-playing ghosts emerging from an Iowa cornfield, they will come. But what works for Wikipedia—an old-school Internet labor of love—is unlikely to work for a major corporate power with quarterly goals to meet and a market to appease. Maybe the answer, then, is for us as users to spend less time on Twitter, and more on Wikipedia.



