Sometimes It’s Important to Kill Each Other! The Most Important Wars in American History

Image Credit: NatUlrich
Let’s be honest here: it’s completely impossible to be an accurate judge of history. Although it may feel like the United States was founded a very long time ago, less than 250 years barely gives us enough time to develop decent hindsight. But the even bigger problems are that there’s no single right way to define “important” and that we can never know how things would have turned out if war had been avoided or if someone else had won. But, now that we’ve argued that it’s impossible, we give you the top five most important wars in America’s history.

5 Vietnam


The Vietnam War didn’t really have any geopolitical or military impact at all, which is a big part of what contributed to the huge social impact that it did have. Never in the history of the United States have the American people been so against a war their military was fighting. It was seen as pointless violence and fed the development of a culture of protest. This combined with other influences of the era to bring the country together in certain ways while dividing it in others. Political protest meshed with sex, drugs and rock and roll to create environmental awareness, change America’s sexual culture and bring us some of the greatest music of the century. Of course, it still wasn’t worth the hundreds of thousands of deaths.

4 World War II

Image Credit: Wikipedia

Here’s where we learned that minding our own business isn’t always a good idea. Of course, we only gave in to that idea after it was too late to save the millions upon millions of people who would be killed by the Nazis or die in brutal massacres all across Europe and Asia. But for America, the legacy of WWII was very different. It widened women’s opportunities in the workforce and helped the country to a full economic recovery from the Great Depression of the 30s. In fact, between the economic improvements we’d made and the fact that Europe and Asia had been ravaged by the war, we were now the undisputed financial superpower. Plus, we had just dropped the atom bomb. Between the atrocities of the war and the new existence of nuclear weapons, the world had become a much scarier place for Americans and everybody else.

3 World War I

Image Credit: Wikipedia

“The War to End All Wars” turned out to be a bit of an overstatement and even the title “The Great War” looked a little silly when we joined WWII just 23 years later. And I can admit that the impact on Europe was a lot stronger than it was on the U.S. and most of us still can’t quite understand what the war was about and what we were doing there. But nonetheless, it was an important milestone in our history. It was the first time that we got involved in a war in Europe and the first time that it wasn’t about us directly. Although there was a strong return to isolationism after this war ended, it was the beginning of the end for the U.S. minding its own business.

2 The Civil War

Image Credit: Wikipedia

Ok, this one’s kind of obvious. Not only did it kill more Americans than any other war (more than all the other wars put together, minus Vietnam) but it ended slavery in the United States. I think that’s really enough to prove its immediate, indisputable impact but I’ll keep going anyway. The end of slavery meant that the country, especially the South, had to rethink its entire economic structure — but it would have to do it as one country. Ever since the Articles of Confederation were abandoned in favor of the Constitution, the country had been trying to figure out if it was a conglomeration of individual sovereign states, or a single country made up of a group of regions that just happened to make some laws for themselves. The Civil War answered that question once and for all.

1 The American Revolution

Image Credit: Wikipedia

Did you really think I was going to skip the obvious? You can argue that eventually the United States would have gotten independence from Britain anyway or that the colonists were just overreacting to taxes that weren’t actually that high. But it’s the way the Americans reacted and the way we got independence that impacted the U.S. and world history the way it did. The war was a lot more than just a collection of battles with a political result. It was a strong philosophical statement. The British government had long since held that representation is required in government, but the American Revolution said that representation is a right, not a privilege, and that the denial of that right is a reason to fight, kill and die. I won’t credit the war with the model of democracy that was then set up, because the Constitution actually did that, and it wouldn’t be written until four years after the war ended. Also, as tempted as I may be to try to convince you that the War of 1812 was important too, I’ll just include it here as an aftershock of the Revolutionary War. But an important aftershock, trust me.

Honorable Mention: The “War on Terror”

We’re combining the most recent wars in Afghanistan and Iraq into one here, since we’re still much too close to say which was more influential. It’s clear that these wars have impacted at least one decade of foreign relations and foreign policy, and there’s no knowing how they could influence history. So just in case, we hereby cover our asses with this honorable mention.


Spanish-American War? Korean War? Mexican-American War? Anyone want to argue for those? Now’s your chance.
