The radical right wing pegs Hollywood as a leftist town, which is completely wrong. There are a lot of actors, writers, and directors who talk a liberal agenda... but all the studio bosses, for as long as there have been studios, have been as far right-wing as you can possibly imagine.