English dictionary
Hollywood: meaning and definition
HOLLYWOOD noun
Definition of Hollywood (noun)
- the film industry of the United States
- a flashy, vulgar tone or atmosphere believed to be characteristic of the American film industry
- "some people in publishing think of theirs as a glamorous medium so they copy the glitter of Hollywood"
- a district of Los Angeles long associated with the American film industry
HOLLYWOOD adjective
Definition of Hollywood (adjective)
- of or relating to the film industry in the United States
- "a Hollywood actor"
- flashy and vulgar
- "young white women dressed Hollywood style"; "Hollywood philandering"
Source: Princeton University WordNet