Author Topic: When did Hollywood become really liberal?  (Read 7092 times)
hopper
Sr. Member
Posts: 3,414
United States


« on: May 23, 2012, 05:24:21 PM »

I was thinking maybe after George W. Bush went to war in Iraq and started his spending spree. Hollywood hated W. Remember how stars like Pink, Faith Hill, and Madonna made comments about Bush? I remember Faith Hill and Madonna complaining about Bush's spending spree, but they never said anything about the debt going up $5 trillion in Obama's four and a half years as president. I never heard anything about Hollywood stars commenting on politics in the '90s, but of course we didn't have Lindsay Lohan, Paris Hilton (what happened to her?), and the Kardashian sisters back then, in terms of the big media exposure.