Hollywood Liberal
A Los Angeles-based individual who uses a superficial far-left political stance to promote their own vanity and self-righteousness rather than to actually advance liberal political ideologies or social change.

"I didn't watch the Oscars this year; sorry, but I didn't feel like listening to a bunch of Hollywood liberals lecture me."
By Veradis