HAS AMERICA SIMPLY BEEN A LIE ALL ALONG?
Do you suppose that America is simply an enormous lie that we all bought into? The Home of the Brave was carefully presented to us, when we were all too young to know the difference, by the brush of Norman Rockwell and the lens of Frank Capra. Were we hoodwinked?

The twentieth century saw America emerge as an imperialist, expansionist, unstoppable juggernaut, swallowing little countries one after another. Building the canal and annexing Panama. Faking the explosion of the Maine in Havana harbor to start a phony war with Spain, out of which America gobbled up Cuba, Puerto Rico and the Philippines. America, in spite of popular belief, did not win World War Two; the Russians did. After that war, the newly formed CIA went to work deposing elected heads of state in Iran and throughout Central and South America, replacing them with dictatorial strongmen friendly to ever-expanding America and willing to cash in on the CIA’s seemingly limitless payroll. Then America replaced France in Southeast Asia, defoliating tiny countries that wound up kicking the crap out of us. The American demonizing of Communism became a worldwide joke. The USSR would die of its own faulty reality, not of anything America and its CIA could conjure. And George Bush decided to invade Iraq in a felonious war that would forever change the balance of power in the Middle East and create the largest refugee crisis since World War Two.

We Americans are an arrogant bunch, sweeping our many sins under the carpet while marching the path of self-righteousness. We never forgave African Americans their freedom, espousing our fair-mindedness in between lynchings, both public and private. And now America has Donald Trump, an openly misogynistic, racist, white supremacist buffoon, moving into the White House. Maybe we’ve finally gotten exactly what we deserve.

These feelings have been festering inside me since Black Tuesday. They simply won’t go away. Has it all been a lie all along? I’m just asking.