Meaning Of The Word West German
What's the definition of West German? Find West German meanings, definitions, and more at Real Dictionary online.
West German Meaning
What's The Definition Of West German?

West German
adjective: West German means belonging or relating to the part of Germany that was known as the Federal Republic of Germany before the two parts of Germany were united in 1990. West German also means belonging or relating to the people or culture of this part of Germany.

West German in British English
adjective: of or relating to the former republic of West Germany (now part of Germany) or its inhabitants
noun: a native or inhabitant of the former West Germany
Definitions Of The Day
- In residence ‐ actually resident; appointed to work at, and usually…
- Overfast ‐ adjective: too fast
- Deathwatch beetle ‐ a beetle, Xestobium rufovillosum, whose woodboring…
- Back emission ‐ noun: the secondary emission of electrons from…
- Reinsurance ‐ noun: Reinsurance is insurance protection taken…
- Lubber line ‐ noun: a mark on a ship's compass that designates…
- Bats-wing coral-tree ‐ noun: a small tree, Erythrina vespertilio, of…
- Back-patting ‐ noun: an act or instance of offering praise or…
- Charter school ‐ an alternative school that is founded on a charter…
- Irish water spaniel ‐ any of a breed of spaniel with a liver-colored…