Cambridge Dictionaries online

The most popular online dictionary and thesaurus for learners of English

  

English definition of “the Wild West”

the Wild West

noun [S]    
the name given to the western part of the US during the time when Europeans were first beginning to live there and when there was fighting between them and the Native Americans

Categories: Renaissance: 1501 to 1899; Named regions of countries
(Definition of the Wild West noun from the Cambridge Advanced Learner's Dictionary & Thesaurus © Cambridge University Press)

