Originally posted by bluemirage5
reply to post by Eurisko2012
The Americans won WW2?
ahhhhhhhh NO !!!
You took out Japan but Europe's war was won by the Russians and the Brits.
Did you fail your history classes, or just skip them? Let me guess: you're from the UK, where the history lessons have been "revised" to let the UK save a little face.
When the UK was "on its own," the results were the Dunkirk and Norway debacles.
Then, when the UK could no longer pay for supplies, the U.S. started the Lend-Lease program to keep it in the war.
Finally, the U.S. turned the whole of the UK into a U.S. military base for the invasion of France. The Russians were certainly a key factor in winning the war, but the UK could not have done much to help without the U.S. and what was then called "the arsenal of democracy." What passes for an education these days ...