Answers

2015-07-21T22:51:53+05:30
No, it didn't. Instead, it made Germany more aggressive and eager to fight back against the humiliation inflicted on it. The harsh treatment under the Treaty of Versailles drove the people of Germany to rally behind a strong leader, as the treaty had ruined Germany.
0