Cambridge Dictionaries online

The most popular online dictionary and thesaurus for learners of English


English definition of “west”

west

noun [U] /west/ (abbreviation W.)
the direction where the sun goes down in the evening, opposite to east, or the part of an area or country that lies in this direction: The points of the compass are north, south, east, and west. The sun sets in the west.
(Definition of west noun from the Cambridge Academic Content Dictionary © Cambridge University Press)