Internet abuse culture is a tech industry problem
After Jesse Frazelle blogged about the online abuse she receives, a common reaction in various forums[1] was "This isn't a tech industry problem - this is what being on the internet is like"[2]. And yes, they're right. Abuse of women on the internet isn't limited to people in the tech industry. But the severity of a problem is a product of two separate factors: its prevalence and what impact it has on people.
Much of the modern tech industry relies on our ability to work with people outside our company. It relies on us interacting with a broader community of contributors, people from a range of backgrounds, people who may be upstream on a project we use, people who may be employed by competitors, people who may be spending their spare time on this. It means listening to your users, hearing their concerns, responding to their feedback. And, distressingly, there's significant overlap between that wider community and the people engaging in the abuse. This abuse is often partly technical in nature. It demonstrates understanding of the subject matter. Sometimes it can be directly tied back to people actively involved in related fields. It's from people who might be at conferences you attend. It's from people who are participating in your mailing lists. It's from people who are reading your blog and using the advice you give in their daily jobs. The abuse is coming from inside the industry.
Cutting yourself off from that community impairs your ability to do work. It restricts meeting people who can help you fix problems that you might not be able to fix yourself. It results in you missing career opportunities. Much of the work being done to combat online abuse relies on protecting the victim, giving them the tools to cut themselves off from the flow of abuse. But that risks restricting their ability to engage in the way they need to in order to do their job. It means missing meaningful feedback. It means passing up speaking opportunities. It means losing out on the community building that goes on at in-person events, the career progression that arises as a result. People are forced to choose between putting up with abuse or compromising their career.
The abuse that women receive on the internet is unacceptable in every case, but we can't ignore the effects of it on our industry simply because it happens elsewhere. The development model we've created over the past couple of decades is just too vulnerable to this kind of disruption, and if we do nothing about it we'll allow a large number of valuable members to be driven away. We owe it to them to make things better.
[1] Including Hacker News, which then decided to flag the story off the front page because masculinity is fragile
[2] Another common reaction was "But men get abused as well", which I'm not even going to dignify with a response
no subject
(Anonymous) 2015-07-06 06:12 pm (UTC)
Regarding HN: As much as people talk about "filter bubbles" (which is not that much, really, but DuckDuckGo made it a thing with a convenient name), people like Google for giving them what they want and hiding from them what they don't want. Is it broken that HN hides things that an apparently significant subset of its members aren't interested in, or is the brokenness that there are people on HN who aren't interested in it? I'd tend to believe the latter (and in particular that the subset of people is the same as the subset who leave awful comments); the former is the result of showing people what they want to see and hiding what they don't.
At least when articles like this hit HN, it seems like there's a substantial number of people downvoting and flagging the awful comments they attract. The fact that such comments don't tend to get upvoted gives me some hope that, while HN contains many awful people, the good ones outnumber the bad.
(To clarify: HN clearly has a serious issue they need to eliminate. I'm not questioning that they have a problem; the evidence doesn't lie. I'm asking what the right way to eliminate it is, and I'm wondering if the right solution involves changing people rather than algorithms. Because in the end, it's always about people, the communities they construct, and the other people they put up with in those communities. I've seen enough good things come out of HN that I'd like to avoid throwing the baby out with the bathwater^Wraw sewage.)
no subject
FAQs filter out questions. People name and shame logical fallacies to keep arguments from being derailed. Geek Feminism has a whole wiki of common responses to reports of abuse which have been proven unhelpful. I personally have clinical PTSD, and rely on a number of "filter bubbles" to protect my sanity and ability to function.
I feel that the filtering going on at HN is slightly different though, because it comes from a place of willful ignorance, of easily verifiable knowledge which would undermine their whole way of life. They aren't choosing not to rehash stuff they already know (and/or know to be unhelpful), they are choosing to put their convenience and privilege ahead of other people's basic needs.
It's important to acknowledge the difference, I think, if you want to try to fix the brokenness.
no subject
(Anonymous) 2015-07-08 06:06 pm (UTC)
As mentioned in my parenthetical at the end, I think HN has multiple serious problems in its community. I'm interested in how to fix it.
There are a pile of commenters demonstrating both active and institutional sexism. While they do typically get downvoted and occasionally flagged (though rarely enough to disappear completely), they're not actually removed from the community, and thus they retain their ability to upvote/downvote/flag. (And the institutional sexism sometimes passes unnoticed when it manages to honey its words enough to be subtle and sound plausible.) It doesn't take many flags to bury a story, so all it would take is a few of those people systematically flagging stories to cause the effect that Matthew has repeatedly called out. And thus HN's filter bubble filters out stories about how bad online abuse is, how broken the tech community is, or how to fix it. (The stories that get through the bubble tend to be the positive success stories, which compounds the problem by making the situation look better than it is.)
It's one thing to systematically filter out all discussion of social/community issues as off-topic. That would be understandable and defensible; not all forums need to support discussion of all topics. However, it's pretty clear from the comments that that's not the whole story at HN; there's clearly a set of people who specifically target any stories about sexism and gender-specific issues. And while HN has progressed enough to call out and squash outright "brogrammer" garbage, they're years behind in terms of understanding and dealing with the institutional versions, as well as in having an explicit understanding of even the 101-level issues (e.g. no, the police will not track down and arrest people for online abuse; no, attempting to create a meritocracy doesn't actually result in one; no, a code of conduct does not mean you can't tell someone how their patch is broken).
no subject
(Anonymous) 2015-07-09 06:44 am (UTC)
While conversations about sexual equality and oppression *are* *important*, they are not appropriate for *every* forum. There *need* to be *some* forums that are largely free from such topics. Why?
I've been on Metafilter for a loooooooong time. I've been on HN for -Jesus- nearly five years. I've seen how the free proliferation of rabid Social Justice Warriors changed most discourse on Metafilter from a civil -and sometimes enlightening- meeting of folks with often wildly differing viewpoints to a shaming, browbeating, pile-on of anyone who happens to represent (or tacitly support) the Monster of the Week.
The few sexual equality/oppression articles that *do* make it to the front page of HN don't create any better conversation than happens on Metafilter. I'm fairly convinced that discussion of such topics in soft-moderated forums comprised entirely of random pseudonymous people is bound to be troubling, frustrating, and profoundly unproductive. (Much like the comments section of most any article on most any newspaper's website.)
Sexist Internet Blowhards are pretty much never going to be converted by being told how and why they're wrong by Random Internet Strangers. They require -like anyone else with unpleasant deep-seated habits- gentle, constant, corrective pressure over long periods of time.
This sort of correction is (to my knowledge) *never* found in the comments section attached to sexual equality articles. What *is* typically found in such comments are re-hashes of the same old arguments, with proponents and opponents fighting past each other for the nth time.
For those of us who try to spend a large amount of our down-time learning, and those of us who are drained by the injustices of the world, running into the same old fights and arguments when we were instead seeking new knowledge saps us of our will to learn and create. This is deeply unfair to those of us who *already* understand what we must do to help create a more fair and equitable society.
Perhaps some might think it fair that a generation of the Socially Privileged Classes has their psyche perpetually worn at by the plight of the less fortunate, and their time tithed in support of those same unfortunates. However, this doesn't seem to comport with the message of social justice and equality that I see the more moderate of the Social Justice folks advancing.
It *is* right and proper to -over time- correct Sexism, Racism, and all the other -isms. But, even the most stalwart crusaders for a cause need a place to rest, recuperate, and pursue their hobbies in peace.
This is not a social justice problem, this is a communication/education problem
(Anonymous) 2015-08-08 04:56 am (UTC)
This is not a problem unique to social justice; this is a problem of large groups with information asymmetry more generally. September is still not over: every year millions of people get added to the internet, get old enough to enter academia, and in general -- become members of the global conversation and system of thinking beings on earth. Every year those millions have to pick up the beginning markers of discussions that have been going on for a *long time*. And unfortunately, some of us have to participate in that.
Threaded conversation is miraculously better than flat conversation, which is better than the combination of paper and your local community, in terms of allowing viewpoints on complex issues to be present and allowing large groups to converse coherently. But the problem comes when complex threads have been gone over, as you point out, for the nth time, and we statistically speaking speak past each other more often than not.
There is a tradeoff between allowing for diverse viewpoints in a global conversation and filtering out those who don't know that what they are trying to say has been said a thousand times before -- along with the learning that comes from those very same rehashed arguments in practice. A tradeoff between having the ability to learn, as a group, and the ability to communicate, as a group, on the order of millions at least.
One thing's for sure: subreddits fail after the first million users or two. Splitting back into subcultures that interact haphazardly doesn't work: we end up talking mostly about pictures of cats, and technical forums become a clusterfsck of drama.
I think part of the problem is who owns the commanding heights right now -- reddit has tools for detecting high-level patterns in conversations, i.e. meme detection. Instead of using these tools to resolve these complex, multi-million-person arguments, they are currently used for the benefit of advertisers.
There are little glimpses of what the next step could be, but no whole picture. In the bitcoin subreddit, someone wrote a bot for identifying sockpuppets, which failed miserably at its task but actually succeeded at pairing together users who had different opinions on the same topic but who thought most similarly to each other. Think of it as a kind of meiosis of ideas: after the community found itself in a position where it was too big to make progress on a complex issue, and the issue wasn't going away, what *could* have happened would be a splitting of the issue into a bipartite graph, with the two sides of the graph being the two sides of the issue, and the links connecting those who had the best chance of being on the same wavelength of understanding. This kind of tool has never been used at scale to solve social justice problems.
We won't know whether it could work unless it could be credibly tried.
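The bipartite pairing idea described above can be sketched in code. This is purely illustrative -- the users, their sides, and their feature vectors are all invented, and the greedy cosine-similarity matching is one plausible approach, not what the actual sockpuppet bot did:

```python
# Hypothetical sketch of the "meiosis of ideas" pairing: match users who
# disagree on an issue (side +1 vs -1) but whose feature vectors (imagined
# here as topic/style embeddings) are most similar. All data is made up.
from itertools import product

def cosine(a, b):
    """Cosine similarity between two equal-length numeric vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def pair_across_sides(users):
    """users: dict name -> (side, vector). Greedily take the most similar
    (pro, con) pairs first; each user appears in at most one pair."""
    pro = [u for u, (s, _) in users.items() if s > 0]
    con = [u for u, (s, _) in users.items() if s < 0]
    candidates = sorted(
        ((cosine(users[p][1], users[c][1]), p, c) for p, c in product(pro, con)),
        reverse=True,
    )
    matched, pairs = set(), []
    for sim, p, c in candidates:
        if p not in matched and c not in matched:
            matched.update((p, c))
            pairs.append((p, c))
    return pairs
```

Greedy matching is the simplest choice; a real system at scale would likely want an optimal assignment algorithm instead, but the principle -- link each person to their nearest counterpart on the other side -- is the same.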