Meaning of “Germany” - Learner’s Dictionary

Germany

noun /ˈdʒɜːməni/

a country in Europe

German /ˈdʒɜːmən/ adjective

coming from or relating to Germany

German noun

someone from Germany

(Definition of “Germany” from the Cambridge Learner’s Dictionary © Cambridge University Press)