Author Topic: When did Hollywood become really liberal?  (Read 7096 times)
Beet
Atlas Star
*****
Posts: 28,916


« on: May 29, 2012, 01:19:00 AM »

dead0man is correct. 'Art' as it is understood in the West (i.e., a creative endeavor in which originality and imagination are rewarded, and whose ends stop at aesthetic pleasure) is inherently liberal, even capitalistic. Illiberal states have a different conception of art, or else strictly subordinate art to something else; artistic creation, when it does happen, is a means to some other end.