Film Industry
(noun) The enterprise of producing films.

(noun) A form of virulent putrescence found on toxic urban wastescapes, characterised by egocentric nihilism and posturing futility; a rich source of sustenance for assorted lower orders and parasitic detritus, including the morally odious, the sycophantic and the genetically impolite.

The film industry is full of arrogant egotists, sycophants and poseurs.
By Mireille