Meaning Of The Word West German
What's the definition of West German? Find West German meanings, definitions and more at the Real Dictionary online.
West German Meaning
What's The Definition Of West German?

West German

adjective: West German means belonging or relating to the part of Germany that was known as the Federal Republic of Germany before the two parts of Germany were united in 1990. West German also means belonging or relating to the people or culture of this part of Germany.

West German in British English

adjective: of or relating to the former republic of West Germany (now part of Germany) or its inhabitants

noun: a native or inhabitant of the former West Germany