Author Topic: When did Hollywood become really liberal?  (Read 7091 times)
freepcrusher
YaBB God
Posts: 3,832
United States

« on: May 21, 2012, 08:14:28 PM »

Hollywood was founded largely by Jewish people who had left New York in the 1920s, and it was always very cosmopolitan and liberal. Remember that in the '40s there were celebrities who wanted to abolish HUAC, and many of them were blacklisted for refusing to sign anti-communist loyalty oaths. In the '60s, a lot of Hollywood celebrities supported left-wing causes. Gregory Peck and Paul Newman were on Nixon's enemies list. Gene Wilder might have been too.