Talk Elections

General Politics => U.S. General Discussion => Topic started by: Vermin Supreme on May 21, 2012, 04:43:21 PM



Title: When did Hollywood become really liberal?
Post by: Vermin Supreme on May 21, 2012, 04:43:21 PM
It had to have been around the late 1960s, since Hollywood before the New Hollywood era still had conservative actors such as John Wayne and Jimmy Stewart.


Title: Re: When did Hollywood become really liberal?
Post by: opebo on May 21, 2012, 07:15:32 PM
It still has loads of those types, VS, and still mostly peddles right-wing movies and TV shows.


Title: Re: When did Hollywood become really liberal?
Post by: freepcrusher on May 21, 2012, 08:14:28 PM
Hollywood was founded by a largely Jewish group of people who had left New York in the 1920s. Hollywood was always very cosmopolitan and liberal. Remember that in the '40s there were celebrities who wanted to abolish HUAC, and many of them were blacklisted for refusing to sign anti-communist loyalty oaths. In the '60s, there were a lot of Hollywood celebrities who supported left-wing causes. Gregory Peck and Paul Newman were on Nixon's enemies list. Gene Wilder might have been too.


Title: Re: When did Hollywood become really liberal?
Post by: Phony Moderate on May 21, 2012, 08:15:40 PM
When the Republican Party became really conservative.


Title: Re: When did Hollywood become really liberal?
Post by: BaldEagle1991 on May 22, 2012, 02:02:27 AM
It was always liberal. Keep in mind it was George Clooney who said that the Oscars gave an award to a black man when blacks still had to sit in balconies.


Title: Re: When did Hollywood become really liberal?
Post by: 🐒Gods of Prosperity🔱🐲💸 on May 22, 2012, 11:42:06 PM
You mean as opposed to the 40s when Frank Capra was making a pro-Soviet documentary?


Title: Re: When did Hollywood become really liberal?
Post by: k-onmmunist on May 23, 2012, 02:58:16 PM
Hollywood is liberal?


Title: Re: When did Hollywood become really liberal?
Post by: hopper on May 23, 2012, 05:24:21 PM
I was thinking maybe after George W. Bush went to war with Iraq and started his spending spree. Hollywood hated Bush. Remember how stars like Pink, Faith Hill, and Madonna made comments about Bush? I remember Faith Hill and Madonna complaining about Bush's spending spree, but they never say anything about the debt going up $5 trillion in Obama's three and a half years as president. I never heard anything about Hollywood stars commenting on politics in the '90s, but of course we didn't have Lindsay Lohan, Paris Hilton (what happened to her?), and the Kardashian sisters back then in terms of big media exposure.


Title: Re: When did Hollywood become really liberal?
Post by: ○∙◄☻¥tπ[╪AV┼cVê└ on May 24, 2012, 01:27:02 AM
Ugh, Reagan, Arnold, and SOPA aren't liberal.


Title: Re: When did Hollywood become really liberal?
Post by: opebo on May 24, 2012, 04:49:17 PM

No, it isn't at all.  It perhaps leans Democratic in voting/donating, but as a business it churns out mostly very right-wing movies - insidious stuff really, like all the culture the corporations produce.


Title: Re: When did Hollywood become really liberal?
Post by: Free Palestine on May 27, 2012, 04:05:08 PM
It's not really liberal, like, San Francisco or Seattle liberal.  Hollywood liberalism strikes me as being centrist, with a hint of libertarianism.  But with environmentalism, and support for idiotic causes like Kony 2012, PETA, etc.  Certainly culturally conservative, with the whole housewife culture and teen mom bull[Inks].


Title: Re: When did Hollywood become really liberal?
Post by: BaldEagle1991 on May 27, 2012, 05:41:31 PM
It's not really liberal, like, San Francisco or Seattle liberal.  Hollywood liberalism strikes me as being centrist, with a hint of libertarianism.  But with environmentalism, and support for idiotic causes like Kony 2012, PETA, etc.  Certainly culturally conservative, with the whole housewife culture and teen mom bull[Inks].

Well, they have to limit their liberalism; they can't afford to alienate fan bases and face repeated boycotts.


Title: Re: When did Hollywood become really liberal?
Post by: Person Man on May 27, 2012, 06:09:48 PM
Basically, it's a place filled with overeducated lottery winners... well, that is a major oversimplification, but you can see how it fits to a point... and when they do things that reek of neoliberalism or anti-abortion messages, it comes off to me as a desperate ploy to reach out to the rest of a society that they are out of touch with and unconnected to.


Title: Re: When did Hollywood become really liberal?
Post by: 🐒Gods of Prosperity🔱🐲💸 on May 27, 2012, 07:29:01 PM
Certainly culturally conservative, with the whole housewife culture and teen mom bull[Inks].

Basically, it's a place filled with overeducated lottery winners... well, that is a major oversimplification, but you can see how it fits to a point... and when they do things that reek of neoliberalism or anti-abortion messages, it comes off to me as a desperate ploy to reach out to the rest of a society that they are out of touch with and unconnected to.

Wait, are we talking about Hollywood, Florida now ???


Title: Re: When did Hollywood become really liberal?
Post by: dead0man on May 27, 2012, 08:02:20 PM
You guys are making this more complicated than it needs to be.  Hollywood is liberal because artists are liberal and Hollywood is full of artists.  Sure, some of the "money" isn't liberal (but many are), but the "money" cares less about politics than it does about making more dough.  The people making the stuff we enjoy (or mock)...the actors, the writers, the directors, many/most of the producers..these people are liberals (with a few exceptions, obviously...but we can name them on two hands probably...and most of those won't be "regular" conservatives). 


Title: Re: When did Hollywood become really liberal?
Post by: Torie on May 27, 2012, 08:24:54 PM
You guys are making this more complicated than it needs to be.  Hollywood is liberal because artists are liberal and Hollywood is full of artists.  Sure, some of the "money" isn't liberal (but many are), but the "money" cares less about politics than it does about making more dough.  The people making the stuff we enjoy (or mock)...the actors, the writers, the directors, many/most of the producers..these people are liberals (with a few exceptions, obviously...but we can name them on two hands probably...and most of those won't be "regular" conservatives). 

Yes, but Hollywood is more ideological than it used to be in the sense that it generates more psychological energy in their little minds. I guess that is due to the culture wars.


Title: Re: When did Hollywood become really liberal?
Post by: BaldEagle1991 on May 27, 2012, 11:10:41 PM
I also think a significant number of gays and lesbians are attracted to an industry that promotes art and performance. I am not saying that every gay/lesbian is like that, but I think it could be a slight factor in this.


Title: Re: When did Hollywood become really liberal?
Post by: ○∙◄☻¥tπ[╪AV┼cVê└ on May 29, 2012, 12:12:08 AM
You guys are making this more complicated than it needs to be.  Hollywood is liberal because artists are liberal and Hollywood is full of artists.  Sure, some of the "money" isn't liberal (but many are), but the "money" cares less about politics than it does about making more dough.  The people making the stuff we enjoy (or mock)...the actors, the writers, the directors, many/most of the producers..these people are liberals (with a few exceptions, obviously...but we can name them on two hands probably...and most of those won't be "regular" conservatives). 

Not every artist is a liberal.


Title: Re: When did Hollywood become really liberal?
Post by: Beet on May 29, 2012, 01:19:00 AM
dead0man is correct. 'Art' as it is interpreted in the West (i.e., a creative endeavor in which originality and imagination are rewarded, and whose ends stop at aesthetic pleasure) is inherently liberal. Even capitalistic. Illiberal states have a different conception of art, or else strictly subordinate art to something else, with artistic creation, when it does happen, being a means to some other end.


Title: Re: When did Hollywood become really liberal?
Post by: Tetro Kornbluth on May 29, 2012, 10:18:48 AM
It isn't.


Title: Re: When did Hollywood become really liberal?
Post by: BaldEagle1991 on May 29, 2012, 10:39:00 AM

No, it is.


Title: Re: When did Hollywood become really liberal?
Post by: Tetro Kornbluth on May 29, 2012, 10:46:01 AM

Explain.

As far as I can see, if Hollywood (talking in general; individual directors may be different) has any political lean, it is a fuzzy "we-are-the-world-let's-all-be-nice-to-each-other" one, which admittedly may not appeal to those American conservatives who do not seem to be fond of niceness and prefer coarse shouting and overt nastiness (this is apparently a large element of the conservative demographic), but I don't think that would classify it as liberal.

After all, this philosophy does not prevent Michael Bay from making movies about heroic American soldiers saving the day from evil foreigners.


Title: Re: When did Hollywood become really liberal?
Post by: tpfkaw on May 29, 2012, 10:53:52 AM
Look at the villains in movies.  Without fail they are either "evil businessmen" or "evil corporations."  In the rare event one has a public sector villain, it is revealed in the "shocking twist" that the government is in fact doing the bidding of some evil corporation.  In the incredibly rare event that the villain is the government alone, the filmmakers are sure to show that it is controlled by straw conservatives.  When the villain is "terrorists" etc. they tend to be neo-Nazis or the like (they are sure to be shown making some racist or sexist comment to prove their Evilness), or if they appear to be fighting for a leftist cause it is unfailingly revealed in a Shocking Twist that they are in fact being controlled by an Evil Corporation.


Title: Re: When did Hollywood become really liberal?
Post by: Tetro Kornbluth on May 29, 2012, 11:00:52 AM
Look at the villains in movies.  Without fail they are either "evil businessmen" or "evil corporations."

Not always. Aliens? Foreign Terrorists? Team America: World Police, while imo a failure as a film, did try to satirize something very real.

And besides, big faceless things make excellent villains. You might as well say that all dystopian novels are against free bus passes for senior citizens.

  
Quote
In the rare event one has a public sector villain, it is revealed in the "shocking twist" that the government is in fact doing the bidding of some evil corporation.

Petty corruption is rarely a source of hackneyed drama à la Hollywood. Bureaucrats may be stupid and incompetent and occasionally evil, but they are never interesting.

Quote
In the incredibly rare event that the villain is the government alone, the filmmakers are sure to show that it is controlled by straw conservatives.  When the villain is "terrorists" etc. they tend to be neo-Nazis or the like (they are sure to be shown making some racist or sexist comment to prove their Evilness), or if they appear to be fighting for a leftist cause it is unfailingly revealed in a Shocking Twist that they are in fact being controlled by an Evil Corporation.

I can't think of an example of the last one, but then again I hardly watch big Hollywood movies anymore (though I certainly did in the past). Otherwise, I don't see how your comment is relevant to my argument. Many people work at big corporations (obviously), and most people dislike their boss... they are perfect villains. Politics never enters into it (if it did, then America must be much, much more left-wing than most people think).


Title: Re: When did Hollywood become really liberal?
Post by: ingemann on May 29, 2012, 11:28:04 AM
I'm with Gully Foyle on this. When we look at Hollywood from outside the USA, we don't see this liberal monolith; yes, sometimes Hollywood produces films critical of the existing order, but just as often it produces films where the dialogue could be replaced with people yelling "U.S.A! U.S.A! U.S.A! U.S.A!"
Of course Hollywood is likely more libertine than the average American, but that hasn't kept it from producing movies and series which celebrate small-town America and conservative values.



Title: Re: When did Hollywood become really liberal?
Post by: tpfkaw on May 29, 2012, 11:32:04 AM
If you do polls of the American public, there's a lot more hostility to government than to business.  The former is also a much more plausible source of a murderous conspiracy in the first place - there are dozens of examples of governments conducting secret illegal plots that would make exciting movies, but not too many of "evil corporations" (which have limited resources, have to pay attention to profitability, have to report their income to tax agencies, and are held accountable for their actions).  Popular conspiracy theories almost always involve the government and very rarely involve "evil corporations" (if they do, it's typically banks, which are rarely depicted as film villains).

Even if we say that, regardless of realism, a centrist Hollywood would have an equal number of "liberal" villains (evil corporations, evil businessmen, neo-Nazis, etc.) and "conservative" villains (the government, Islamic terrorists, leftist terrorists), the claim doesn't hold up.  The former occur more than the latter by a ratio of about 10 to 1.


Title: Re: When did Hollywood become really liberal?
Post by: Phony Moderate on May 29, 2012, 02:11:56 PM
The government itself is essentially an evil corporation.


Title: Re: When did Hollywood become really liberal?
Post by: All Along The Watchtower on May 30, 2012, 04:36:34 PM
If you do polls of the American public, there's a lot more hostility to government than to business.  The former is also a much more plausible source of a murderous conspiracy in the first place - there are dozens of examples of governments conducting secret illegal plots that would make exciting movies, but not too many of "evil corporations" (which have limited resources, have to pay attention to profitability, have to report their income to tax agencies, and are held accountable for their actions).  Popular conspiracy theories almost always involve the government and very rarely involve "evil corporations" (if they do, it's typically banks, which are rarely depicted as film villains).

Even if we say that, regardless of realism, a centrist Hollywood would have an equal number of "liberal" villains (evil corporations, evil businessmen, neo-Nazis, etc.) and "conservative" villains (the government, Islamic terrorists, leftist terrorists), the claim doesn't hold up.  The former occur more than the latter by a ratio of about 10 to 1.

LOL at the part I bolded.