Seeing as I've been diving into history and events outside the United States as of late (as in, the past six months), I decided to put the two together, and voilà, you've got this list. As we all know, World War II is arguably one of the major events that have shaped the present-day relationship between the United States, the United Kingdom, Germany, and Japan. Without World War II, those four nations wouldn't be allies, or at the very least it would have taken them much longer to get there. But are there a few things that may have been forced into the history books to create a slightly altered narrative? Although it's nowhere near as bad as the fake news we get these days, yes, there are.