
I Don’t Need to Tell You Censorship Laws are Bad, but Do You Realize How Bad They Really Are?

This was a trending conversation on BookTok for a bit, and outside that I still occasionally see people wanting a book rating system similar to the MPAA's or the ESRB's. I've said it in a TikTok before, and I'll say it again: "no, you don't."

Especially not in the current climate, when so many books are under threat of being banned for queer content.

For one, the basic structure already exists. The middle grade, young adult, and adult categories already operate on the same basic principle the MPAA is built around: what's appropriate for which age group. We're also in a day and age when we've been able to crowd-source content information for the vast majority of even semi-modern media. And even if a formal book rating system existed, you'd still have to monitor your kid's reading, because broad systems can't cover every eventuality and nuance. The parameters for grading aren't uniform, either. We treat gore, violence, sex, and profanity completely differently, and shifting sensibilities over time dramatically affect how we rate movies. The Ring is PG-13. Titanic is PG-13.

So if your goal is to "protect the children," a broad rating system alone doesn't accomplish that and never has, and we already have an integrated broad age rating system for books. You already have all the tools you need.

More importantly, you don't want the government touching a book rating system with a ten-foot pole. That's how we got the Hays Code. It's how we got the Comics Code Authority. It's how some small libraries are already in a place where they can't allow minors in without complex caveats, because of the way their state has decided to legislate kids' even potential access to "adult material."

Because the thing to remember about this new wave of proposed content-management bills coming down the pike is that the people who propose them don't actually give a shit about protecting children or making everyone safer. Instead, it's about, in a broad sense, attempting to create a media environment that matches their specific ideals, and they're leaning on a "but think of the children!" sentiment to get there.

Consider the Interstate Obscenity Definition Act and its attempt to clarify, at the federal level, the legal parameters of what qualifies as "obscenity." Currently, the Miller Test is applied on a case-by-case basis to determine whether something is obscene, and it's a deeply subjective standard. So in a semantic sense, there is a benefit to clarification.

Why do you need to legally define obscenity (which is not protected by the First Amendment) in the first place, though? The stated purpose is to maintain a sense of community morality and, on the surface, to keep obscene material out of the reach of children.

Practically speaking, there are already laws on the books that broadly prohibit the manufacture, transportation, and sale of obscene material. But if everyone involved is a consenting adult, in both the creation and the viewing of this material, why the hell is that the government's business? At a fundamental level, why should the government give a crap if a grown adult partakes in ethically sourced prurient interests?

More often than not, laws around media and content management are about playing morality police, and they very rarely fix the things they claim they will. The FOSTA-SESTA combo (which critics described as a censorship bill in hiding) was purportedly intended to stop sex trafficking. In practice, though, it just made everything worse for sex workers who used online services to advertise. They didn't go away; they just went back to the objectively more dangerous practice of street-level solicitation.

This is because legislators are so focused on eradicating sex work for moral reasons that they don't take into account the people on the ground saying, "this is never going to actually happen." It's a lack of practicality in how we culturally approach sex and sexual content.

Age verification on pornography sites doesn't actually keep teens off them; it just introduces data security issues. We know that a more complex mousetrap only makes for smarter mice. An uplifting, ethically created media environment comes from lateral social change and from providing social services to those who've been victims of actual exploitation. Not from a huddle of old men who, dollars to donuts, are purveyors of exploitation themselves.

So you'll see a lot of people talking about the slippery slope of censorship, how we get there, and how you don't want to be the person helping the ball roll downhill. As we continue to examine bills like these, remember that, at best, the government as a body doesn't really know what it's doing, and, at worst, it wants to create legal frameworks for taking down "deviants."

And you’ll be the “deviant,” eventually.
