Author Topic: When did Hollywood become really liberal?  (Read 7095 times)
Torie
« on: May 27, 2012, 08:24:54 PM »

Quote
You guys are making this more complicated than it needs to be. Hollywood is liberal because artists are liberal and Hollywood is full of artists. Sure, some of the "money" isn't liberal (but many are), but the "money" cares less about politics than it does about making more dough. The people making the stuff we enjoy (or mock)...the actors, the writers, the directors, many/most of the producers...these people are liberals (with a few exceptions, obviously...but we could probably name them on two hands, and most of those won't be "regular" conservatives).

Yes, but Hollywood is more ideological than it used to be, in the sense that politics generates more psychological energy in their little minds. I guess that is due to the culture wars.