Monday, 3 November, 2014 - 16:07

What's the deal with dystopias?

by CharlotteD

Right now, lots of very successful young adult books have one thing in common – they're all dystopias; that is, they all feature a community or society that might seem content and happy on the surface, but something dark and wrong is lurking beneath.

The Hunger Games trilogy is a good example of this – although for the people in the Districts society is unpleasant, the people in the Capitol don't seem to notice. The Giver, by Lois Lowry, is another one. At first Jonas, the main character, believes life is perfect, before discovering that terrible things are happening.

However, though dystopian books are doing particularly well now, they have been around for a long time. Two of the most famous examples are Nineteen Eighty-Four by George Orwell, where the characters live in a state where every move is controlled by the government, and Fahrenheit 451 by Ray Bradbury, where books are burned by "firemen" because they are outlawed.

Perhaps what makes books based around dystopian societies so compelling is that those societies are so different from our own. We know, along with the main characters, that something is wrong – it connects us to them. We want them to fight against it, and their fights, in these books, are always larger than life. They always end up fighting for the fate of humanity, even if it is just their little piece of it.

It is also interesting to read about a version of the future that an author has imagined. When we read historical books, we know that some things are made up, but most things are fact, because we have a good idea of how people lived and worked. In a dystopia, everything is so different that even the smallest things, like the contrasts between the five factions in Divergent, become more intriguing.

Maybe what is most important about books set in dystopias, though, is that they make us think, if only briefly, about our society and the way it might shape up in the future. Will we ever end up with a perfect world, where people don't get sick and everyone lives well and is happy? Or will we end up living in a place similar to one of these books – somewhere that seems lovely on the outside, but is rotten within?

Have you read any books set in a dystopia? What did you think? Do you think in the future, our world will end up like a dystopia, or not?