In college, the most interesting class I ever took was US History since 1865. The professor seemed to know what he was talking about. He believed we're bringing ourselves down and that any war we get involved in will turn out worse than we expect. I've never lived anywhere other than the US, and sadly I can see it bringing itself down from the inside. Anyone, educated or not, can see we're our own biggest problem. I don't believe America will disappear like civilizations of the past, but we simply can't stay on top forever.