You Mean There Are Republicans in Hollywood?

It’s no secret that America’s Hollywood-driven entertainment industry is dominated largely by liberal Democrats. While there are various theories as to why that is, it’s worth recognizing that Hollywood also has its fair share of Republicans who are steadfast in their political convictions. Let’s take a look at some Hollywood entertainers you may not have expected to be Republicans.
