Topic: When did Hollywood become really liberal?
ingemann
« on: May 29, 2012, 11:28:04 AM »

I'm with Gully Foyle on this. When we look at Hollywood from outside the USA, we don't see this liberal monolith. Yes, sometimes Hollywood produces films critical of the existing order, but just as often it produces films where the dialogue could be replaced with people yelling "U.S.A! U.S.A! U.S.A! U.S.A!"
Of course Hollywood is likely more libertine than the average American, but that hasn't kept it from producing movies and series that celebrate small-town America and conservative values.
