No quick fix for threats to women on Twitter
The Women, Action and the Media (WAM) activist group announced on Friday a collaboration with Twitter to address online harassment of women, which it claims has "reached crisis levels".
The group, concerned by the vicious targeting of women online, started a pro-bono project to support users experiencing gendered harassment or abuse. Their website now hosts a reporting form so they can validate and escalate claims of abuse to Twitter.
While this seems a positive step, Twitter has historically handled harassment on its platform poorly. Will this new collaboration change anything?
A growing issue
A quarter of 18- to 24-year-old women surveyed by the Pew Research Center reported having been stalked or sexually harassed online, and a study from Demos found terms such as "rape", "slut" and "whore" were often used in tweets to convey casually misogynistic sentiments, or to threaten and abuse other users.
Extreme cases of harassment are often reported in the media:
- a Guardian journalist was threatened after asking Twitter users about tampons
- a Fox News panellist was threatened after rejecting the notion that firearms were a solution to sexual violence
- a former DC Comics editor received a barrage of abuse after criticising a comic book cover that clumsily sexualised teenaged characters
- a feminist critic was threatened after criticising the depiction of women in video games
- a grieving daughter was taunted with faked photographs of her father mere hours after his death.
Countless other women deal with online threats and abuse daily.
The reactions to these women were extreme and irrational. There is no justification for threatening violence against anyone, or for the continued harassment and abuse of women online and on social media.
In many cases the threats are not only contrary to Twitter's rules, but also illegal under conventional offline laws.
How does Twitter respond?
In response to bomb and rape threats against a number of women in the UK, Twitter was notoriously slow to act. Twitter's eventual solution was a "report abuse" button for the Twitter app that allowed users to report abusive tweets to Twitter's moderation team.
As with any technical solution to a social problem, the button (and Twitter's ensuing moderation process) has significant limitations.
Some of those limitations were apparent during recent harassment campaigns, where users who tried to help victims by reporting harassing accounts claimed that Twitter refused to moderate tweets that were reported by people other than the victim.
As one American technology journalist observed:
If you see abuse on twitter, and you report it, twitter emails you to tell you they will ignore it because it didn't happen to you. In the real world, when you see abuse and report it, your observation/testimony is part of the societal feedback loop for correction. Twitter's approach to abuse reporting is to minimise false reporting rather than solve abuse.
Too-hard basket
It's possible that Twitter doesn't yet understand the extent of online abuse on its platform.
Harassers can use anonymising tools to obscure their identity, and they can use "disposable email" services to set up new accounts on social media to minimise the consequences of being blocked.
In some cases, they harass socially, loosely organising across multiple message-boards and social platforms to create what American law professor Danielle Keats Citron terms "cyber mobs". The behaviour of harassers can be complex, and some of the tools and tactics deployed are intentionally used to make their behaviour difficult to police.
Of course, misogynistic behaviour does not exist in a vacuum. The abusive, harassing, threatening behaviour I've highlighted is not unique to Twitter.
Similarly, Twitter's indolence is compounded by law enforcement organisations that are ill-equipped to combat online issues, or that consider online abuse too difficult to police.
It would be cynical to dismiss Twitter's efforts towards the victims of harassment, threats and online bullying out of hand, but sadly, the most visible advances Twitter has made towards combating online abuse have almost always come in response to popular outcry and media attention.
The newly announced collaboration with WAM is arguably Twitter's biggest advance yet, and even so it is only a short pilot project, organised by a non-profit organisation.
To quote WAM's executive director:
I don't think we should have to do this work. I think it's a scandal that a tiny, under-resourced nonprofit with two staff members is having to do free labour for them.
In an online space where conventional law enforcement is often unable or unwilling to participate, Twitter's intervention in abusive behaviour is critically important. So when will Twitter step up and take responsibility for the discourses it fosters online?
For now, let's hope that its collaboration with WAM encourages Twitter to do better.
Source: The Conversation
This story is published courtesy of The Conversation (under Creative Commons-Attribution/No derivatives).