In her article "Better Online Living through Content Moderation," Melissa King explains how, in today's age of technology, abuse and harassment are prevalent on the internet. King offers a solution to this problem through content control features such as block and ignore functions, content/trigger warnings, blocklists, and privacy options. By employing this variety of features, some users can simply make their online experience less irritating; for others, who may suffer from PTSD, these content control features are vital, allowing them to avoid topics and people that might trigger their anxiety. Yet these tools face constant cultural opposition: users of optional moderation tools are often criticized as weak or too sensitive. Disparaging and discouraging the use of content control features creates a culture that pressures people to expose themselves to content that can be detrimental to a victim with anxiety issues. Though content control features do not guarantee an end to the effects of online abuse, they do help limit attacks that, if severe or long-term enough, can cause PTSD.
One of the major misconceptions about content control is the argument that victims of online harassment blow it out of proportion and should simply be less sensitive and ignore it. Melissa King points out that this argument draws a misleading parallel to exposure therapy, a type of therapy designed to combat severe anxiety through gradual, controlled exposure to its source, inuring individuals to their triggers and lessening the disruptions those triggers can cause. Applied to online harassment, exposure therapy is clearly a misapplication: having random internet strangers hurl insults and threats at someone, in the hope that they somehow come out more mentally durable, is neither gradual nor controlled. Victims suffering from PTSD will likely have their trauma amplified rather than reduced. Rooted in the same ignorance that underlies this misunderstanding of exposure therapy is the popular assumption that PTSD is something only military veterans suffer from. Arguments against content control likewise hold that online harassment is merely mean words said on the internet and poses no real threat to anyone's safety or family. In reality, long-term exposure to threats of violence online is one of the major causes of PTSD.
Blocklists are among the more recent content control tools available. Arguments against blocklists claim that people added to mass blocklists are being defamed for statements and opinions they never made. However, blocklist creators like Randi Harper, author of the Good Game Auto Blocker, respond that the tool makes its filtering methodology and appeal process clear, so claims of defamation do not hold; being flagged simply means that the majority of one's interactions with Gamergate on Twitter are blocked. These objections to blocklists also fail to recognize how ruthless and pervasive online harassment can be. Gamergate, for example, is notorious for doing everything in its power to threaten people into silence, from calling and threatening family members to posting pictures of targets' homes and addresses online. These intimidation tactics are designed to silence people, and they are clearly illegal. Women are especially subject to this type of abuse, particularly women in fields considered male-dominated, such as the tech industry and video game culture. Asking victims not to use content control tools essentially forces the abused to spend more time with their abusers.
Content control tools like trigger and content warnings, blocklists, and muting features can block unwanted harassment and abuse on the internet, especially for people who suffer from psychological trauma. Not everyone is able to simply ignore online abusers, particularly those with PTSD. Ultimately, content control tools should not be discouraged: giving people the power to personally moderate the worst of the internet in no way violates anyone else's rights, and it is often the best option victims have. Those who argue against content control are supporting misinformed opinions that do nothing to help others mentally and only reinforce harmful patterns of online abuse.
King, Melissa. "Better Online Living through Content Moderation." Model View Culture, 14 Oct. 2015. Web. 8 Mar. 2016.