Meaning Of The Word West German
West German Meaning
West German
adjective: West German means belonging or relating to the part of Germany that was known as the Federal Republic of Germany before the two parts of Germany were united in 1990. West German also means belonging or relating to the people or culture of this part of Germany.

West German in British English
adjective: of or relating to the former republic of West Germany (now part of Germany) or its inhabitants
noun: a native or inhabitant of the former West Germany