What was the Treaty of Versailles? How did it change life in Europe?
The Treaty of Versailles was one of the peace treaties that ended the First World War. Signed in 1919, it formally ended the state of war between Germany and the Allied Powers. Along with the other postwar treaties, it brought the fighting in Europe to an official close and set the terms for rebuilding peace, requiring Germany to disarm, give up territory, and pay reparations so that it could not easily go to war again.