The Deist Roots of the United States of America
To suit their political agenda of merging church and state, right-wing religious fundamentalists misrepresent our nation’s founding fathers as “Christians” when, in truth, they were Deists.
This article clarifies our national origins by describing the actual beliefs of our founders. ~Chip :)
Excerpt: ...Many sincere people believe that America was founded on Judeo-Christian principles. Even powerful US Senator and presidential candidate John McCain said, "The Constitution of the United States established the United States of America as a Christian nation." He says this even though the Constitution establishes no such thing! In fact, nowhere in the Constitution is the word "God" even mentioned!
The Declaration of Independence mentions God, but ONLY in Deistic terms! Nowhere in the Declaration is Jesus, Moses, or the Bible ever mentioned. If America had been founded as a Christian nation, this would not be the case...