Ebony Wrote:A friend is of the opinion that Japan is the first post-apocalyptic culture on Earth, and I can see the evidence for that claim. Seventy years ago, the Empire of Japan, which had never experienced true defeat before, was swatted aside in two definitive shows of force by the United States that left two metropolitan areas destroyed and changed the paradigm of how war was waged. Shortly after that, core tenets of Japanese culture and religion were systematically dismantled, but they were not completely rewritten. As a result, the people of Japan have had to fill in the blank spots left behind, as well as figure out how to survive in the post-war world. They didn't have a slow decline, like the British Empire, but rather a short, sharp removal. They've had to recover ever since, and I think they're still recovering.
Wouldn't post-WWII Germany count as well? It too was ravaged pretty badly, with the added "bonus" of spending much of the latter half of the 20th century split into two countries (and East Germany apparently took a lot longer to recover than West Germany, due to a variety of factors).