April 22, 2013
Over on Gawker, noted Reddit-warrior Adrian Chen just posted a short article discussing Reddit General Manager Erik Martin’s decision to apologize for the site’s role in the Boston bombing vigilante witch hunts. The apology is straightforward enough, but what interests me more is Chen’s response to a commenter asking about the legal implications of the Sunil Tripathi case (recall that Tripathi found himself at the top of Reddit’s suspect list). Essentially, does Tripathi’s family have legal grounds to sue Reddit for libel? Chen says no, stating that
I don’t think the family would have a case against Reddit, which is protected from liability over what its users post by the DMCA safe harbor provision. But they could go after individual Redditors.
The problem is that the DMCA (the Digital Millennium Copyright Act) is geared toward just that: copyright. I can see how Reddit wouldn't and couldn't be held legally responsible for copyrighted content posted to the site, but what happened in the wake of the Boston bombings isn't copyright infringement, it's libel. (Whether it would also qualify as slander isn't entirely clear to me; the content was posted in written form, but it also spread through that not-quite-writing, not-quite-speech gray area that is the internet, and in some cases was eventually published by actual news outlets.) If any statute shields Reddit from liability for its users' defamatory posts, it is more likely Section 230 of the Communications Decency Act, which covers user-generated content generally, than the DMCA's copyright-specific safe harbor.
Furthermore, in the case of copyrighted content, the platform avoids liability only if it expeditiously removes the infringing material once notified. Refusal to comply with a takedown notification forfeits any claim to safe harbor. Not only was Reddit not dealing with copyrighted content, they hosted the content for days, either because they were caught up in the craziness and simply preoccupied, or because they were drinking their own Kool-Aid, so enamored of the press they'd get if they DID solve the case that they didn't stop to consider what would happen if they failed. And all of this despite the fact that posting personal, identifying information to Reddit is in direct violation of their own terms of service:
You may not provide to or post on or through the Website any graphics, text, photographs, images, video, audio or other material that invades anyone’s privacy, or facilitates or encourages conduct that would constitute a criminal offense, give rise to civil liability, or that otherwise violates any local, state, federal, national or international law or regulation (e.g., drug use, underage drinking). You agree to use the Website only for lawful purposes and you acknowledge that your failure to do so may subject you to civil and criminal liability.
You know, details.
Yes, the initial racist whack-a-mole doxxing subreddit (r/findbostonbombers) was eventually deleted by the site admins (subsequent to posting the above article, Chen tweeted this link to Redditors' reactions to that decision), but not until two days into their ill-fated investigation. By that point, the damage had already been done. But it's not like people weren't already voicing their concerns about Reddit's vigilante amateur hour even while r/findbostonbombers was live. Some of the sanest people in the conversation proved to be other Redditors, who urged the vigilante hivemind to grow a brain and control itself. Given that, Reddit's admins could have (and arguably should have) deleted the misinformation honeypots as they were created, not after they became a problem slash liability.
The fact is, if Redditors were posting kiddie porn (and not jailbait pics, you know, per usual, but explicit content that meets the legal criteria of child pornography) and Reddit admins so much as debated whether or not to delete the content, this wouldn’t even be a conversation. Game over, goose cooked, etc. But the issue here is much fuzzier.
But even if you take the legal question off the table (I'm not even sure a crime was committed), the ethical question remains. At what point should site administrators (i.e. the people whose website it is) step in and preempt this sort of damaging content before it gets loose and begins terrorizing the online countryside? Yes I know, slippery slope, and FREE SPEECH (incidentally, many of the Redditors reacting to the admins' decision to can r/findbostonbombers were incensed, because THIS IS AMERICA), but imagine you were Sunil Tripathi's family. If you were given a choice between having your life turned upside down, again, and knowing that a few threads would be unceremoniously deleted off a pseudonymous online forum before anyone even noticed they were there (thus resulting in the nation NOT falsely accusing your missing son of being a terrorist), I don't think you, or anyone, would hesitate to push that button.
I don’t know any of the answers here, for the record.
April 20, 2013
There are some interesting conversations happening around the web regarding McGruff the Crime Dog-style e-sleuthing and its effectiveness. Or lack thereof, as seems to be the case. New York Magazine focuses on all the mistakenly identified bombing suspects; The Wall Street Journal asks if Reddit helps or harms during a crisis; The Guardian discusses the limits of crowdsourcing, using Reddit's interference in the Boston Bombing case as their primary case study; The New Statesman (which I linked to yesterday, but while we're laundry listing) chronicles the Sunil Tripathi angle; and the BBC argues that Reddit didn't just get the bombing case wrong, but very wrong.
Although I agree that Reddit should be taken to task over this (well, not ALL of Reddit, just the Redditors who blithely amplified innuendo and misinformation), they are hardly the only, or even the biggest, fish to fry. The worst offenders in the Boston Bombing story are the sloppy journalists who took this dubious information, did about as thorough a job of fact checking as the Redditors had done (in that they did not fact check), and then ran with the story as if the information were anything more than speculative fiction, which is one of the reasons this news cycle has been such an irredeemable clusterfuck.
Hopefully, the overarching narrative surrounding crowdsourcing doesn't elide the fact that in order for crowdsourced content to reach critical mass (thus risking the unfair exposure and exploitation of innocent parties), it has to be amplified by larger mainstream outlets. Meaning that the conversation doesn't and can't stop after wagging one's finger at the hivemind. Had the information Reddit generated stayed on-site, we wouldn't be having this conversation. But it did not stay on-site; it was picked up by outlets that, frankly, should have known better. In short, the most pointed criticisms should be directed not at the amateurs, who do what they do in their free time, but at the professionals, who do what they do for a paycheck, and with their editors' blessing. Without the latter, the worst offenses of the former would never see the light of day.
April 19, 2013
NBC.com’s Nidhi Subbaraman just posted an article discussing the public’s role in the Boston Bombing case. I am quoted at the end of the piece, and wanted to clarify my position, because what it says I said is not entirely what I meant (word limits!).
…Shirky and others, such as Whitney Phillips, an Internet scholar and lecturer at NYU who has studied the trolling behavior of 4chan, think caution is necessary at times like this.
“Strong moderation is just what you need to keep this kind of ridiculous gossip at bay,” said Phillips, but strong moderation is not the kind of thing usually seen on Reddit.
“I think crowdsourcing is a terrible idea during criminal investigations,” Phillips told NBC News.
I do think that crowdsourcing is a terrible idea during criminal investigations under certain circumstances, namely when the crowdsourcers are making up the rules as they go along, without any consideration for the serious personal and legal repercussions of their actions. Vigilante crowdsourcing, in other words. Once law enforcement has actual confirmed information and is looking for public feedback, great! I'm with Shirky; if you see something, say something. But until then, stop trying to solve the crime using MS Paint. As the Salaheddin Barhoum and Sunil Tripathi cases highlight, this sort of approach has the potential to do far more harm than good. In fact, nothing good has come of Reddit's involvement in the story. (Friday night update: see this detailed thread criticizing participating Redditors' haphazard sleuthing.)
As for the issue of moderation: my full quote explained that platform moderators have, and should exercise, the ability to quash falsely incriminating details and/or surreptitiously acquired personal information before that information gets snatched up by lazy journalists and splashed across the front page of The New York Post. This is not a violation of "free speech," as many Redditors might argue; rather, it is a preemptive protective measure. As Shirky explains in the NBC article, the cost of failure is very high. Above and beyond being the responsible thing to do, it is (or at least seems like it should be) in the platform moderators' best interest to prevent, for example, being sued for libel.
April 19, 2013
Alex Hern of The New Statesman just posted an excellent analysis of Reddit’s failed vigilante efforts. Silver lining: participating Redditors may suck at pretending to be the FBI, but they are extremely good at libeling random brown people! The essence of the story is: despite their best efforts to solve the case and change the world, participating Redditors failed to identify the bombing suspects, and then once the FBI released images of the actual suspects, they misidentified one of the men, thereby associating an apparently unstable missing college student with the Boston bombings. This is a good thing to read, particularly if you are a journalist on the hunt for a hot tip about the latest internet craze of being irresponsible and trigger-happy.
April 18, 2013
Earlier the FBI released images of the two suspected Boston Marathon bombers — not very shockingly, neither man was on Reddit’s list of likely suspects. Participating Redditors are now focusing on those two men, with explicit instructions not to post any personal information because it would be just awful to falsely accuse any innocent bystanders of being domestic terrorists twice in the same week.
Good job, internet!
April 18, 2013
Yesterday I linked out to this post written by The Atlantic's Alexis Madrigal, which handily criticizes the Redditors who think they can solve the Boston Bombing case by, like, looking at some pictures online and figuring out WHO DUNNIT. Given the participants' years of experience in law enforcement (read: watching television), what could possibly go wrong?? I mean, other than a false accusation or two, a point BoingBoing's Rob Beschizza just tweeted out, and which some Redditors are currently discussing here (hat tip to Chris for sending me those links). It's no big deal, after all, to be accused of domestic terrorism because you happened to be wearing a backpack while watching a marathon.
I'll allow for the possibility that some of the Redditors/anons in question mean well. I'm sure they think they're doing the FBI a favor; I'm sure these teenagers feel very pleased with themselves. But for god's sake, these are pseudonymous and anonymous tipsters, meaning there is no way to parse the well-intentioned from the trolls, who are probably cackling with glee knowing that outlets like Fox & Friends and the always-reliable New York Post (which ran the crowdsourced image of Reddit's prime suspects on its front page) are primed to eat this information right out of their hands.
On the other hand, doing your job carefully is hard, and there are deadlines to meet, and this ad space isn’t going to buy itself, so…
March 9, 2013
…it would have been very interesting indeed to observe the Reddit SXSW panel audience split between staunch supporters of Reddit (FREE SPEECH/YOU DON’T WANT TO END UP LIKE CHINA DO YOU???) and those who a) take issue with the many unsavory aspects of Reddit’s culture(s) and b) enjoy getting a rise out of Redditors because let’s face it, white guys unaware of their own privilege are lulzcows.
John Herrman’s article (linked above), which seems to side with the SXSW panelists (or at least, seems to side against the most vocal supporters of the site), sums it up thusly:
The best criticism of Reddit is that it can, on occasion, victimize people. Yet drawing attention to jailbait photos, creepshots and other forms of victimization — low-level, constant misogyny included — does not compel Reddit to look within itself. Instead, according to the site’s most vocal proponents, it makes Reddit the victim.
1,000% yes to the last part, but I would push the first half of that statement a bit further*, since the best criticism of Reddit is not just that Reddit (at least, its main subreddits, allowing for exceptions within smaller ones) is a cesspool of generalized misogyny, generalized racism and generalized homophobia, attitudes so pervasive they're kind of just there. Just as problematically, Reddit as a company is permissive of the cesspool, on the grounds that FREE SPEECH, obviously. Which, as Jezebel pointed out the other day, doesn't actually work the way most people on the internet seem to think it works.
But other than that, victim complex is pretty much exactly right. It is so hard to be an oppressed minority (read: a middle-class, educated white man) on the internet.
*Herrman’s use of the term “victimization” is somewhat confusing — “victimization” seems to imply a specific target, but then he also glosses low-level misogyny under the same umbrella, suggesting he means “victimization” in the broader sense, as synonymous with “oppression,” maybe. Consequently I’ve revised this paragraph several times trying to figure out what he means, and how best to respond.