It had to have been some time after WW2, since Europe loved us as liberators back then.
So, was it because we dropped the big one? The world started hating us because we were so powerful?
Or did it start during the Vietnam War?
When, exactly, were the seeds of hatred for America planted?
Was there a certain time in our foreign policy when the tide of world opinion turned?
Was it a certain President? A certain something we did?
Does anyone know? Does anyone have any links they can post to define this for me? I'm not being facetious or posting this just to have someone say the world doesn't really hate us, just liberals, etc.
I really would like to know. Dates and times. Links and sources. Etc.