Meaning of “Dixie” in the English Dictionary

"Dixie" in American English

Dixie [U]

US /ˈdɪk·si/

the southern states of the US that fought against the northern states during the Civil War

(Definition of “Dixie” from the Cambridge Academic Content Dictionary © Cambridge University Press)