It seems like all of these "serious" shows that are met with such critical acclaim are really about how much shocking violence and human suffering they can pack into a 50-minute episode. These shows are filled with rapes, graphic murders ranging from dismemberment to shoving girls in front of trains, torture of animals and people, and any other means of making the characters miserable and the viewers uncomfortable. At one point in this season of House of Cards I had to leave the room so I would not have to hear the squealing of a guinea pig about to be crushed underfoot. It has gotten so bad that Matt and I have begun to identify which characters--human and animal--are introduced solely so they can be brutally tortured or murdered on camera later in the season.
Now, I am not necessarily prudish about violence. I understand that rape is a reality and that people lie, cheat, murder, and rape every day. My issue is that these shows have reached the point where that is all they are doing. In Slate's defense of the misogyny in True Detective (misogyny that is blatant and atrocious), the author concludes that it is okay because the message of the show is that men do bad things. No shit. Men do bad things? "Men do bad things" also sums up the messages of Sons of Anarchy, The Walking Dead, and House of Cards. Imagine that. AHS:Coven would like to add that women do bad things, too. Yes, I get it. It's true. People suck. Nihilism is cool. But I am looking for something with more substance.
I am also suspicious of this message. We have a million shows telling us that people suck, we are all depraved, the world is shitty, and there is nothing we can do about it. That message seems to me to be a cop-out. It invites us to ignore social realities. It lets privileged viewers get desensitized to violence and human suffering, and it ignores the fact that there are real problems in this world that can be helped by human action. It glosses over the fact that much of the suffering in this world is the result of racism, sexism, and wealth disparity. And the fact that most of these shows are about white men and their problems only strengthens my suspicions.
So, I am sick of this trend in television. I want something with character development, something in which female characters exist as more than plot devices, something with a real message, something that doesn't leave me feeling both violated and complicit when I finish an episode. Sadly, some of these shows began with promise. I think American Horror Story's first two seasons are great, and I much prefer them to this last season, in which each new episode depicts women finding a new way to torture each other. I am probably most upset by the demise of Sons of Anarchy, which I thought was going somewhere in the first few seasons. The last season made it clear that the show is only about violence and suffering, and that the female characters I became so invested in are simply there to facilitate more of it.
That being said, I am looking to season two of The Americans to restore my faith in television and give me the smart, substantive show I have been waiting for. Until then, I will continue to rewatch Gilmore Girls, Buffy, and 30 Rock.
I owe this post to Netflix; without it I never would have been able to binge-watch all these shows and reach the dark, frustrating place that led me to write this.