User:Durova/The dark side
Let's face facts: Wikipedia has become an important force on the Internet. If you've got a business to run or a belief to circulate there's a big temptation to hit that edit button and do your thing. You're on your honor here. Most people have honor, which is why Wikipedia is huge and (usually) pretty darn good, but then there's that thought—what if you could harness this site and make it work for you?
You aren't the first person to get that idea. And hello there: I'm a Wikipedia administrator.
The volunteer work I do here is a bit like solving detective mysteries. I'll speak candidly about some things I've learned while making over 19,000 edits at this site—things I've wanted to say to a lot of the people I've wound up blocking or banning. I've kept quiet until now because these words could get misinterpreted if I wrote them in the wrong context, so I'm setting this out in essay form.
If you're getting ideas to bend Wikipedia to your own purposes then you are almost certainly devising plots that I have read many times before. And if you follow through on those ideas you will leave a trail of mistakes that is very familiar to a wikisleuth. Plenty of users before you thought the same tricks would be foolproof. People like me foil those plots all the time. We even have our own slang for the tactics. Attempts to manipulate this site rarely stay in the live version very long, but those edits do get archived in site histories. That should make you stop and think. Some people who've tried to exploit this site have ended up very sorry that they acted rashly.
On 26 May 2005, a delivery service manager from Nashville, Tennessee, thought it would be a good joke to vandalize a page on Wikipedia. Six months later the two edits he made became national news. A Wikipedia critic traced the edits to the vandal's place of employment, and the press identified the individual who made them by name; the man resigned from his job and apologized. Fortunately for him, Rush Delivery rehired him and John Seigenthaler Sr. decided not to sue. In June 2007 another user decided to post a rumor on Wikipedia; after the rumor turned out to be true, the police seized that individual's personal computer as part of a homicide investigation.
Major news stories have resulted from other abuse where people embarrassed their employers or damaged their own careers because of bad decisions they made about how to participate at Wikipedia. Ill-judged decisions in other language versions of Wikipedia have created adverse nationwide news in their respective countries. In every one of those cases the negative impact of public exposure far outweighed any benefit to the people or organizations behind them. I'm just here to keep the site honest—to convince you to become a productive editor if that's possible or to show you the door otherwise. Nearly everything I use to perform onsite investigations is public information, disclosed by the individual who made the edits, and available to anyone on the planet who has an internet connection and the smarts to find it.
Dynamics of disruption
Despite Wikipedia's open-door editing, this place did not become one of the most popular sites on the Internet by being easy to abuse. The typical arc of a disruptive editor's career goes as follows: the wrong kind of person becomes aware of Exploit A and tries it. When Exploit A fails to stick, the person tries to get around the watchdogs with Exploit B—not considering that Exploit B is something Wikipedia's volunteers have dealt with many times before. Then the disruptive editor treads the well-worn path to Exploit C, imagining that he or she is bushwhacking a new trail to a brilliant discovery. We volunteers roll our eyes and chuckle, then implement our usual solutions. This continues until the disruptive user gets sitebanned, or until that user realizes it takes far more work to sustain a disruptive campaign than it takes the site's volunteers to revert edits, implement blocks, and protect articles.
All the obvious exploitation methods have already been tried. The seductive thing about them is that sometimes they seem to work just long enough to become major embarrassments for their perpetrators if and when they become generally known—and people who attempt to manipulate the site create a perilous situation for themselves because the trail of evidence is already public information.
As one of the world's ten most popular websites, Wikipedia is a big target. It is also big news. Soon the press will become more sophisticated about finding stories here—and trust me: people who know how to dig learn a lot of interesting things. Wikipedia doesn't set out to hurt anybody's career or their business. Courtesy services such as oversight and the right to vanish are available upon request. But a lot of problem editors place themselves at risk because they don't consider the big picture or they think they're too clever to get caught. Pretty much inevitably, some more attempts to manipulate this site are going to backfire on the individuals and companies who perpetrated them.
I write these words in April 2007. If my prediction holds true, then by the end of next year major news organizations will have research staffers who spend their working days reading Wikipedia edit histories. In a couple of years ordinary people won't just peruse your biography or your company's current Wikipedia entry when they decide whether to do business with you, they'll be clicking on alternate Google returns to see whether you've been in an arbitration case. Blogs and forking sites will chronicle ideological and corporate attempts to manipulate Wikipedia.
Some of the editors I block or siteban accuse me of being naive because I abide by conduct standards Wikipedians call civility and assume good faith. I hope people who are on the edge turn around and edit productively, but I've come to the conclusion that most of the people who abuse this site are the ones who are naive. So before you go over to the dark side, remember how easy it is to edit at Wikipedia the right way: read up on site policies, ask for mentorship, and contribute properly cited material. That sort of participation is much more durable and won't come back to haunt the contributor.
Update: August 2007
Wired published news of a Wikipedia IP scanner tool by Virgil Griffith. Within a day, dozens of major news outlets were publishing the results of IP searches run with that tool. In my opinion as a Wikipedian, this is only the beginning.