Yes. But as a liberal, I couldn't care less.
Although I've always wondered why this has been the case. Were liberals the first ones to establish media companies? Does journalism require a high level of education, and does that education tend to come with liberal views? I'm not sure.
Journalism as a profession attracts left-leaning types in all countries. But generally that effect is countered by media companies being controlled by rich conservatives.
Shouldn't that provide some sense of balance, then? My guess is that liberals established the companies, but the fact that Fox News/News Corp. exists seems to counter that.