Ever since the end of WWII, Japan has been an American colony.
Though imagine being a nip politician and your legacy is "i doomed my people's future by pretending i had a stomach ache to avoid voting on a crucial moment because i let myself be bribed or blackmailed"
Though this could also be what happens when a country is completely in debt to the bankers.
Post Imperial Japan has always been a disgrace, I can't understand that people are surprised that this happened, it was inevitable. The moment Hideki was proclaimed a war criminal and the rest of the japs were guilt tripped into thinking they're the spawn of Satan, is when Japan started going downhill.
It used to control numerous islands around it, it used to be a fearsome nation, now it's small shit island full of social retards who cope with shitty cartoons, and now it's also apparently setup to be a breeding grounds for faggots, congratulations Japan, welcome to the ''great'' western world.
It's always a case of them being better than other first world alternatives. For example, their post-war self-flagellation was nowhere near as bad as Germany's, despite being constantly attacked over it. They're still getting flak for "anti-Korean" material in textbooks.
I do remember hearing an old Japanese guy in the 90s complaining about how weak and pathetic the modern generation is, how they blindly follow rules without even knowing why. I had a feeling he was right and that it was only going to get worse.
Whilst their self-flagellation was not as bad as Germany's, their demonization of their own warrior culture is really bad. Japan was a military country for most of its history, yet now the military is so demonized.