Reddit’s Apology and the Question of Ethical Culpability
April 22, 2013
Over on Gawker, noted Reddit-warrior Adrian Chen just posted a short article discussing Reddit General Manager Erik Martin’s decision to apologize for the site’s role in the Boston bombing vigilante witch hunts. The apology is straightforward enough, but what interests me more is Chen’s response to a commenter asking about the legal implications of the Sunil Tripathi case (recall that Tripathi found himself at the top of Reddit’s suspect list). Essentially, does Tripathi’s family have legal grounds to sue Reddit for libel? Chen says no, stating that
I don’t think the family would have a case against Reddit, which is protected from liability over what its users post by the DMCA safe harbor provision. But they could go after individual Redditors.
The problem is, the DMCA (Digital Millennium Copyright Act) is geared toward just that: copyright. (The provision that actually shields sites from liability for user-generated content in general is Section 230 of the Communications Decency Act, not the DMCA's safe harbor.) I can see how Reddit wouldn't and couldn't be held legally responsible for copyrighted content posted to the site, but what happened in the wake of the Boston bombings isn't copyright violation, it's libel (though whether it would also qualify as slander isn't entirely clear to me: the content was posted in written form, but it also spread through that not-quite-writing, not-quite-speech gray area that is the internet, and in some cases was eventually published by actual news outlets).
Furthermore, in the case of copyrighted content, the platform escapes liability only if it promptly removes the infringing content once notified. Refusal to comply with a takedown notification forfeits any claim to safe harbor. Not only was Reddit not dealing with copyrighted content, they hosted the content for days, either because they were caught up in the craziness and were simply preoccupied, or because they were drinking their own Kool-Aid and were so enamored of the press they'd get if they DID solve the case that they didn't stop and consider what would happen if they failed. All this despite the fact that posting personal, identifying information on Reddit is in direct violation of the site's own terms of service:
You may not provide to or post on or through the Website any graphics, text, photographs, images, video, audio or other material that invades anyone’s privacy, or facilitates or encourages conduct that would constitute a criminal offense, give rise to civil liability, or that otherwise violates any local, state, federal, national or international law or regulation (e.g., drug use, underage drinking). You agree to use the Website only for lawful purposes and you acknowledge that your failure to do so may subject you to civil and criminal liability.
You know, details.
Yes, the initial racist whack-a-mole doxxing subreddit (r/findbostonbombers) was eventually deleted by the site admins (subsequent to posting the above article, Chen tweeted this link to Redditors’ reactions to that decision), but not until two days into their ill-fated investigation. By that point, the damage had already been done. And it’s not like people weren’t voicing their concerns about Reddit’s vigilante amateur hour even while r/findbostonbombers was live. Some of the sanest people in the conversation proved to be other Redditors, who urged the vigilante hivemind to grow a brain and control itself. Given that, Reddit’s admins could have –and arguably should have– deleted the misinformation honeypots as they were created, not after they became a problem slash liability.
The fact is, if Redditors were posting kiddie porn (and not jailbait pics, you know, per usual, but explicit content that meets the legal criteria of child pornography) and Reddit admins so much as debated whether or not to delete the content, this wouldn’t even be a conversation. Game over, goose cooked, etc. But the issue here is much fuzzier.
But even if you take the legal question off the table (I’m not even sure a crime was committed), the ethical question remains. At what point should site administrators (i.e., the people whose website it is) step in and preempt this sort of damaging content before it gets loose and begins terrorizing the online countryside? Yes I know, slippery slope, and FREE SPEECH (incidentally, many of the Redditors reacting to the admins’ decision to can r/findbostonbombers were incensed, because THIS IS AMERICA). But imagine you were Sunil Tripathi’s family, given a choice between having your life turned upside down, again, and knowing that a few threads would be unceremoniously deleted off a pseudonymous online forum before anyone even noticed they were there, with the result that the nation did NOT falsely accuse your missing son of being a terrorist. I don’t think you or anyone would hesitate to push that button.
I don’t know any of the answers here, for the record.