Content control features help people limit or filter what they see online. These features are especially valuable to people coping with PTSD, violent threats, and other harmful online experiences. In her article, Melissa King examines the arguments of those who oppose content control and shows how their claim that its users are simply too weak or too sensitive creates a culture that puts even more pressure and self-blame on victims.
Against Content Control
The foremost argument against content control is that people who use it are overly sensitive and exaggerate the harm of their online harassment. This viewpoint compares unrestricted Internet exposure, without content control, to exposure therapy, “a type of therapy designed to combat severe anxiety” (King). The problem with this comparison is that it is not accurate. Exposure therapy requires carefully controlled exposure to the source of a patient’s anxiety; subjecting a victim to an unfiltered stream of threats has the opposite effect, amplifying the trauma. Using content control is therefore closer to genuine exposure therapy than unrestricted Internet use is.
Evidence suggests that younger generations are on the side of the argument pushing for openness. They are more willing than previous generations to discuss and confront trauma, and the Internet, which now allows not only information retrieval but also participation, gives them a platform for that discussion.
Another argument against content control holds that the online experience stays in the virtual world and that words on a screen cannot threaten someone’s safety or cause PTSD. “This misunderstanding is predicated on the same ignorance that yields metaphors to exposure therapy: the fact is, threats of violence online can be a cause of PTSD in and of itself” (King). Threatening situations, including cyberbullying, can cause PTSD over time.
Blocklists, one of the more recent forms of content control, have also been met with opposition. Arguments that blocklists lead to defamation, or that they are equivalent to harassment itself, fail to recognize the level of viciousness that online harassment can reach and are therefore invalid.
King uses the example of the harassment campaign waged by Gamergate, whose participants have gone so far as to threaten targets and their families and to post targets’ addresses and personal information. Blocklists free victims from the harassment, threats, stalking, and intimidation they face when no content control is available.
The most frequent victims of this kind of abuse are women who openly oppose sexism or who enter male-dominated environments, such as the technology and video game industries. The persistent harassment that women in these fields endure can easily lead to PTSD. Given such clear evidence of abuse, it is hard to understand how anyone can oppose content control.
Defending Content Control
Blocklists, muting features, and other content control tools give people the option of protecting themselves from trauma and harassment without limiting their own participation on the Internet. King points out that controlling what you see does not silence anyone else; it is simply a refusal to listen to their intimidation. Nor does it constitute defamation.
Every person’s psychological makeup is different; therefore, no one but the person being targeted should decide whether content control is necessary for that person’s safety. No one should be forced to shoulder harassment because of the dismissive assertions of others. Put simply, “People should be allowed to set their own personal boundaries” (King).
Cyberbullying. Digital Image. End Cyber Bullying. End To Cyber Bullying Inc, 2012. Web. 6
Intimidation Game. Digital Image. PC Mag. Ziff Davis, 2015. Web. 6 March 2016.
King, Melissa. “Better Online Living Through Content Moderation.” Model View Culture 28 (October 14, 2015). Web. https://modelviewculture.com/pieces/better-online-living-through-content-moderation. 5 March 2016.