April 30, 2013
We’re coming up on the one year anniversary of my dissertation defense, which…how, but ok. This reminded me of some of the odd compulsive behaviors I normalized during those last few months of writing. I had this thing about music, and not just music but bad/weird/otherwise shameful music — I would ironically-ish listen to the same, say, Britney Spears song one hundred times on repeat, which I’m assuming kept the snark part of my brain occupied while I wrote. Whatever, here is one of the songs I must have listened to two thousand times during that summer (a bad lip read of this musical splenda-fest). I love this shit so much. It is playful, completely unnecessary to society, and not mean. I’m going to listen to it ten times right now!
Also, unrelated, here is another thing I like. Posting only things that don’t give me an ulcer is fun!
April 29, 2013
As evidenced by my not posting since the 24th, I have been struggling to internet since the Boston bombing. This burnout (I guess you could call it that?) connects in surprisingly well-timed ways to the online communication unit I’m currently teaching in my Intro to Human Comm class. Normally when I teach this sort of stuff, which can be tl;dr’d as THERE’S SO MUCH TEXTING GOING ON THESE DAYS AND NO COMMUNICATING, I’m like pfft whatever. The internet is good! Hyperconnectivity is great! I loooooove email and push notifications! But this time around I find myself nodding along with Sherry Turkle, not because my social media use makes me lonely (I’m not on Facebook, haven’t been to Google+ in years, rarely tweet, avoid gchat, and use Skype only when a face-to-face meeting is required but otherwise impossible due to geography — so the conversation about connected loneliness isn’t really applicable) but because the internet as a whole has been making me feel anxious and thinly spread. It’s interesting; this is the first time I’ve ever felt this way in relation to my own online engagement. So for the next few weeks, I’m going to cut back on everything. I’ll still post here, but for the time being will be focusing only on the stuff I like that doesn’t make my head explode. For example this masterwork, found via Chris (all the best things in my life are found via Chris).
Happy Monday everybody!
April 24, 2013
That he is riding a Roomba and chasing a duckling around the kitchen before being joined by a pit bull dressed as a hammerhead shark is an added bonus. Also, the moment at 1:48 is why god invented the internet.
April 22, 2013
Over on Gawker, noted Reddit-warrior Adrian Chen just posted a short article discussing Reddit General Manager Erik Martin’s decision to apologize for the site’s role in the Boston bombing vigilante witch hunts. The apology is straightforward enough, but what interests me more is Chen’s response to a commenter asking about the legal implications of the Sunil Tripathi case (recall that Tripathi found himself at the top of Reddit’s suspect list). Essentially, does Tripathi’s family have legal grounds to sue Reddit for libel? Chen says no, stating that
I don’t think the family would have a case against Reddit, which is protected from liability over what its users post by the DMCA safe harbor provision. But they could go after individual Redditors.
The problem is, the DMCA (Digital Millennium Copyright Act) is geared toward just that: copyright. I can see how Reddit wouldn’t and couldn’t be held legally responsible for copyrighted content posted to the site, but what happened in the wake of the Boston bombings isn’t copyright violation, it’s libel (though whether it would (also?) qualify as slander isn’t entirely clear to me — the content was posted in written form, but also spread in that not-quite-writing-not-quite-speech gray area that is internet, and in some cases was eventually published by actual news outlets). If anything shields Reddit from liability for what its users post, it’s Section 230 of the Communications Decency Act, not the DMCA; presumably that’s the provision Chen had in mind.
Furthermore, in the case of copyrighted content, a platform is shielded from liability only if it promptly removes the infringing material once notified. Refusal to comply with a takedown notification forfeits any claim to safe harbor. Not only was Reddit not dealing with copyrighted content, they hosted the content for days — either because they were caught up in the craziness and were simply preoccupied, or because they were drinking their own Kool-Aid and were so enamored of the press they’d get if they DID solve the case that they didn’t stop and consider what would happen if they failed. And all this despite the fact that posting personal, identifying information on Reddit is in direct violation of their own terms of service:
You may not provide to or post on or through the Website any graphics, text, photographs, images, video, audio or other material that invades anyone’s privacy, or facilitates or encourages conduct that would constitute a criminal offense, give rise to civil liability, or that otherwise violates any local, state, federal, national or international law or regulation (e.g., drug use, underage drinking). You agree to use the Website only for lawful purposes and you acknowledge that your failure to do so may subject you to civil and criminal liability.
You know, details.
Yes, the initial racist whack-a-mole doxxing subreddit (r/findbostonbombers) was eventually deleted by the site admins (subsequent to posting the above article, Chen tweeted this link to Redditors’ reactions to that decision), but not until two days into their ill-fated investigation. By that point, the damage had already been done. But it’s not like people weren’t already voicing their concerns about Reddit’s vigilante amateur hour even while r/findbostonbombers was live. Some of the sanest people in the conversation proved to be other Redditors, who urged the vigilante hivemind to grow a brain and control itself. Given that, Reddit’s admins could have (and arguably should have) deleted the misinformation honeypots as they were created, not after they became a problem slash liability.
The fact is, if Redditors were posting kiddie porn (and not jailbait pics, you know, per usual, but explicit content that meets the legal criteria of child pornography) and Reddit admins so much as debated whether or not to delete the content, this wouldn’t even be a conversation. Game over, goose cooked, etc. But the issue here is much fuzzier.
But even if you take the legal question off the table (I’m not even sure a crime was committed), the ethical question remains. At what point should site administrators (i.e. the people whose website it is) step in and preempt this sort of damaging content before it gets loose and begins terrorizing the online countryside? Yes I know, slippery slope, and FREE SPEECH (incidentally, many of the Redditors reacting to the decision to can r/findbostonbombers were incensed, because THIS IS AMERICA), but imagine you were Sunil Tripathi’s family. Given a choice between having your life turned upside down, again, and knowing that a few threads would be unceremoniously deleted off a pseudonymous online forum before anyone even noticed they were there, thus resulting in the nation NOT falsely accusing your missing son of being a terrorist, I don’t think you or anyone would hesitate to push that button.
I don’t know any of the answers here, for the record.
April 20, 2013
There are some interesting conversations happening around the web regarding McGruff the Crime Dog e-sleuthing and the effectiveness of such efforts. Or lack thereof, as seems to be the case. New York Magazine focuses on all the mistakenly identified bombing suspects; The Wall Street Journal asks if Reddit helps or harms during a crisis; The Guardian discusses the limits of crowdsourcing, using Reddit’s interference in the Boston bombing case as its primary case study; The New Statesman (which I linked to yesterday, but while we’re laundry listing) chronicles the Sunil Tripathi angle; and the BBC argues that Reddit didn’t just get the bombing case wrong, but very wrong.
Although I agree that Reddit should be taken to task over this (well, not ALL of Reddit, just the Redditors who blithely amplified innuendo and misinformation), they are hardly the only, or even the biggest, fish to fry. The worst offenders in the Boston bombing story are the sloppy journalists who took this dubious information, did about as thorough a job of fact checking as the Redditors had done (in that they did not fact check), and then ran with the story as if the information were anything more than speculative fiction — which is one of the reasons this news cycle has been such an irredeemable clusterfuck.
Hopefully, the overarching narrative surrounding crowdsourcing doesn’t elide the fact that in order for crowdsourced content to reach critical mass (thus risking the unfair exposure and exploitation of innocent parties), it has to be amplified by larger mainstream outlets. Meaning that the conversation doesn’t and can’t stop after wagging one’s finger at the hivemind. Had the information Reddit generated stayed on-site, we wouldn’t be having this conversation. But it did not stay on-site; it was picked up by outlets that, frankly, should have known better. In short, the most pointed criticisms should be directed not at the amateurs, who do what they do in their free time, but rather at the professionals, who do what they do for a paycheck, and with their editors’ blessing. Without the latter, the worst offenses of the former would never see the light of day.
April 19, 2013
NBC.com’s Nidhi Subbaraman just posted an article discussing the public’s role in the Boston Bombing case. I am quoted at the end of the piece, and wanted to clarify my position, because what it says I said is not entirely what I meant (word limits!).
…Shirky and others, such as Whitney Phillips, an Internet scholar and lecturer at NYU who has studied the trolling behavior of 4chan, think caution is necessary at times like this.
“Strong moderation is just what you need to keep this kind of ridiculous gossip at bay,” said Phillips, but strong moderation is not the kind of thing usually seen on Reddit.
“I think crowdsourcing is a terrible idea during criminal investigations,” Phillips told NBC News.
I do think that crowdsourcing is a terrible idea during criminal investigations, under certain circumstances — namely when the crowdsourcers are making up the rules as they go along, without any consideration for the serious personal and legal repercussions of their actions. Vigilante crowdsourcing, in other words. Once law enforcement has actual confirmed information and is looking for public feedback, great! I’m with Shirky; if you see something, say something. But until then, stop trying to solve the crime using MS Paint. As the Salaheddin Barhoum and Sunil Tripathi cases highlight, this sort of approach has the potential to do far more harm than good. In fact, nothing good has come of Reddit’s involvement in the story. (Friday night update: see this detailed thread criticizing participating Redditors’ haphazard sleuthing)
As for the issue of moderation — my full quote explained that platform moderators have, and should exercise, the ability to quash falsely incriminating details and/or surreptitiously acquired personal information before that information gets snatched up by lazy journalists and splashed across the front page of The New York Post. This is not a violation of “free speech,” as many Redditors might argue. Rather, it is a preemptive protective measure. As Shirky explains in the NBC article, the cost of failure is very high — and above and beyond being the responsible thing to do, it is (or at least seems like it should be) in platform moderators’ best interest to prevent, for example, being sued for libel.
April 19, 2013
Alex Hern of The New Statesman just posted an excellent analysis of Reddit’s failed vigilante efforts. Silver lining: participating Redditors may suck at pretending to be the FBI, but they are extremely good at libeling random brown people! The essence of the story is: despite their best efforts to solve the case and change the world, participating Redditors failed to identify the bombing suspects, and then once the FBI released images of the actual suspects, they misidentified one of the men, thereby associating an apparently unstable missing college student with the Boston bombings. This is a good thing to read, particularly if you are a journalist on the hunt for a hot tip about the latest internet craze of being irresponsible and trigger-happy.