buritobr
YaBB God
Posts: 3,659
« on: July 04, 2015, 10:32:56 PM »
How has the American entertainment industry presented World War Two since 1945?
Do you think there has been a shift in Hollywood's portrayal of WWII?
I have the impression that in the immediate postwar years there were many movies about the Pacific front, while in more recent years many movies have been about the Western European front.
If this shift really happened, do you think geopolitics played a role?
Although there were important battles between the Americans and the Germans, the largest share of the Wehrmacht was destroyed by the Red Army, and the Americans' most important enemy was Japan. Perhaps, in the worst years of the Cold War, there was little willingness to portray the former enemy of the present enemy as the biggest villain. And perhaps, once the USA started to support Israel in the Middle East conflicts, interest in the horrors committed by the Nazis increased.
And maybe movie directors simply think that German military uniforms look striking on the big screen.