From a longer post on the worldviews found in “The Walking Dead,” “Breaking Bad,” and “Game of Thrones”:
Here’s the problem: none of them offer hope. Sure, particular situations end well – occasionally. But the pot at the end of the rainbow is not filled with gold. The characters are forced to create their own meaning in a meaningless world; to find their own kind of hope when there is none; to rage against the dying of the light while not actually believing there will be any end other than darkness.
These movies and shows may all be good and noble. But without hope, the truth about reality is incomplete, so…it’s really not true. And a partial truth is a lie.
I am starting to think “Breaking Bad” and “The Walking Dead” move me because I bring to them a hope and a belief in redemption that is not actually in the story. If I step outside of my Christian presupposition about the world – through Christ, all can be made well – the shows strike me as mesmerizing nihilism.
If that is true, what does the popularity of these shows say about our culture? Do they make us feel better? I don’t think so. Do they remind us of the really important things in life? Well, yes, but to what purpose? These shows have a lot of brilliantly packaged sound and fury – without hope, do they also signify nothing? Walter White is Macbeth, but there is nobody to clean up the kingdom when he’s gone. Rick Grimes is as nobly doomed as Eddard Stark, except humanity’s winter has now arrived, and there will be no summer.