The question is where you're getting your history knowledge from. Germany had not been trying to dominate Europe ever since the fall of Napoleon, or crap like that. Bismarck created the European system of alliances precisely because he believed Germany was a 'satiated state' (which was also why he opposed annexing some French territories in the preceding war) and wanted to ensure peace.
Granted, the new and last Kaiser sang a more imperialistic tune, but he mostly wanted more colonies, like the other great European powers. World War 1 was an immense diplomatic failure on many sides, not a war of aggression started by any one country. Heck, the Kaiser and the Tsar were both trying to prevent it right up until the last minute, but a lack of trust ultimately doomed those attempts.
Edit:
In reality? No.
In the minds of people who think Germans are evil personified? Yes.