Meaning of “the West Coast” in the English Dictionary


(Definition of “the West Coast” from the Cambridge Advanced Learner’s Dictionary & Thesaurus © Cambridge University Press)

"West Coast" in American English

West Coast noun [ U ]

us /ˈwest ˈkoʊst/

(in the US) the part of the country near the Pacific Ocean

(Definition of “West Coast” from the Cambridge Academic Content Dictionary © Cambridge University Press)
