Something's Rotten in the State of Facebook
content warning: domestic terrorism
A friend posted the following on Wednesday:
Dear everyone posting That Video to Facebook today: Yes, there was another shooting. VA this time, caught on live news.
Yes, the murderer (no, he doesn't deserve to be named) filmed his victims' deaths and immediately posted them to his Facebook and Twitter.
No, you're not obligated to help him out by publicizing them. All that does is put attention on the wrong person, make the victims' friends and families miserable, inspire imitators, and give the murderer exactly what he wanted. So please, knock it off.
He was more timely than I, but I've still got a few short things to add on.
In the days since, we've learned that the killer wrote a manifesto calling for a race war. This is probably good grounds for rounding up everyone who helped to publicize the last killer who wrote a manifesto calling for a race war, and giving them a stern talking-to about how, yes, their carelessness has real consequences. It is certainly good grounds for not giving his political opinions any publicity, since that's quite obviously what he wanted here.
Remember, terrorists lose when you forget about them. Killers who share videos of themselves killing, and who espouse political agendas they want discussed more broadly, lose when you don't give them the satisfaction: don't watch their videos, don't pay their political agendas any mind. They're deranged and wrong; what good are you doing by enabling them?
Zeynep Tufekci, in the Atlantic:
You might not have noticed, but the mass media rarely reports on suicides, particularly teen suicides. When it does, the coverage is careful, understated, and dampened. This is no accident: Following guidelines endorsed by the Centers for Disease Control and the National Institute of Mental Health, the media carefully and voluntarily avoids sensationalizing such deaths especially among teenagers. They almost never make the news unless the person is a public figure; methods of suicide are rarely mentioned; suicide pacts are not reported upon.
This is for good reason: Suicide, especially among teens, is contagious. It's a morbidly attractive idea that offers an established path of action for a troubled youngster. And we know from research in many fields that establishing a path of action -- a complete narrative in which you can visualize your steps and their effects -- is important in enabling follow-through. (...)
And Facebook has an outsized share of this, too. Several times, I found obnoxious and harmful third-party posts in my feed solely because the aforementioned friend had commented on them, telling the posters to stop posting such things. Sure, sure, Streisand effect, but on the other hand, when you step back and think about it, there is no way to tell Facebook's algorithms to publicize a thing less broadly. None!
One of the online communities I frequent is Quora. There, I've written >600 answers in the past year or so, have >300 followers, and have over a quarter-million views on answers I've written, lifetime. By comparison, I've had fewer than a tenth that many views on this blog in that time. Heck, I've had more views on things I've written on Quora in the last 30 days than I have on my blog in the past year. There are some advantages to being part of a platform, I suppose.
However, I don't actually spend the majority of my time on Quora writing. I spend most of it reading, and I spend a plurality of my clicks upvoting or downvoting things. I upvote about a third of the answers I see, and I downvote about a third of the answers I see. I downvote probably 20% of the questions I see. I downvote things written by people I like, whenever I judge that their answer isn't, for whatever reason, a positive contribution to the abstract ideal of a Quora feed. It's not personal -- it's just curating community-produced content.
And I think that Quora has better content because people like me do this. We can re-tag homework questions from the "Mathematics" topic to the "Mathematics Homework Questions" topic, so that they don't get shown to people who are interested in the former but not the latter. We can upvote things that we believe deserve to be read, and "Re-Ask" questions that we'd like to see answered. But most importantly, we can downvote answers -- and whole questions -- and we do. Heck, it's not uncommon for me to answer a question and then downvote the question itself, because I thought my answer would be useful to the person who asked, but not worth spreading more broadly.
So inappropriate, duplicate, content-less, or otherwise useless answers get collapsed and never read. Bad questions get shown to people less, and are answered less -- and so are seen even more rarely. The community isn't perfect, but at least having the lever to push goes a long way toward weeding out content that (by polled opinion) is less interesting or useful to people.
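To make the value of that lever concrete, here's a toy sketch. This is emphatically not Quora's or Facebook's actual ranking code -- the function names, numbers, and scoring rules are all made up for illustration -- but it shows the shape of the problem: when every signal a reader can send is non-negative, even outrage promotes a post.

```python
# Toy model (hypothetical; not any real platform's algorithm):
# compare a feed score that admits negative signals with one that doesn't.

def score_with_downvotes(upvotes: int, downvotes: int) -> int:
    """Symmetric feedback: the community can push a score below zero,
    so disliked content can actually be shown to fewer people."""
    return upvotes - downvotes

def score_likes_only(likes: int, comments: int) -> int:
    """Like-only feedback: every interaction -- including an angry
    'stop posting this' comment -- can only push the score up."""
    return likes + comments

# A gruesome viral post: almost nobody 'likes' it,
# but hundreds of people comment in protest.
likes, protest_comments = 5, 200

print(score_likes_only(likes, protest_comments))      # 205: gets promoted
print(score_with_downvotes(likes, protest_comments))  # -195: gets buried
```

The point of the sketch is that the choice of which signals exist, not the cleverness of the ranking formula, is what determines whether a community can suppress content at all.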
Facebook's got none of that. Now, Eric Meyer at Meyerweb is skeptical that a "dislike" button is a good answer:
Tufekci's piece perfectly describes the asymmetrical nature of Facebook's "engagement" mechanisms, commented on for years: there is no negative mirror for the "Like" button. As she says:
Of course he cannot like it. Nobody can. How could anyone like such an awful video?
What happens then to the video? Not much. It will mostly get ignored, because my social network has no way to signal to the algorithm that this is something they care about.
What I've been thinking of late is that the people in her network can comment as a way to signal their interest, caring, engagement, whatever you want to call it. When "Like" doesn't fit, comments are all that's left, and I think that's appropriate.
In a situation like Tufekci describes, or any post that deals with the difficult side of life, comments are exactly what's called for. Imagine if there were a "Dislike" button. How many would just click it without commenting? Before you answer that question, consider: how many click "Like" without commenting? How many more would use "Dislike" as a way to avoid dealing with the situation at hand?
When someone posts something difficult—about themselves, or someone they care about, or the state of the world—they are most likely seeking the support of their community. They're asking to be heard. Comments fill that need. In an era of Likes and Faves and Stars and Hearts, a comment (usually) shows at least some measure of thought and consideration. It shows that the poster has been heard. (...)
He's coming from a very particular place, and I think that he's entirely right that Facebook doesn't need a "sympathize" button. But he's wrong that Facebook doesn't need a "show me and everyone else less of this and everything like it" button -- especially if the opposite buttons are everywhere, easy to use, and, in fact, impossible to not-use.
aside: Did you know that Facebook tracks how long you spend reading a "story", and uses that to decide whether to show it to other people? And what do you think its clever, mobile-based browser is for?
Of course, if you're Facebook, why bother? Why run the risk that the existence of an anti-like button -- even one like Quora's, which is anonymous, doesn't display a count, and doesn't notify the poster -- might look bad the first time it comes up in a bullying case? What incentive do you have to break the cycle of glorified, nigh-impossible-not-to-publicize violence begetting violence across the nation?
Yeah, I don't know, either.