I can’t be the only one who’s read a book or watched a movie/series/anime or whatever, where there’s some huge calamity in some region or country or something to that effect, and there’s no mention of what’s going on in the rest of the world. You know what I mean, right?

Take the Hunger Games. Panem is a post-apocalyptic North America where every year 24 teenagers, two from each of its 12 Districts, are taken to a massive arena to fight for their lives and, by extension, for the survival of the people of their District. Alright, great … and the rest of the world is… where?

So you’re telling me that these people, who barely have any human rights, are being totally ignored by the world? England across the sea sipping tea
Canada up north like
And the rest of the world is probably just breathing a sigh of relief because someone finally took the giant guns and missiles away from America.

But in all seriousness, I can’t be the only one who sees a dramatic world- or country-changing event happening in a book/show and wonders, “Is the rest of the world experiencing this or are they just dead?” Or am I just really weird?