Hollywood has always been political. They consider it their right and duty to tell us what is politically good and right.