16 Truths About the Rise of the Religious Right in America

Trista - December 18, 2018

A painting at the Capitol shows that Washington was not a Christian. williamhenry.net.

8. America’s History Was Reimagined

Remember the secularization thesis? Max Weber theorized that, after the Enlightenment, society would grow steadily more secular as people gave up religious values in favor of science and reason. What was beginning to happen in America, though, was a reversal of that trend. Consider that the founding of America was primarily a product of the Enlightenment. The Founding Fathers had been schooled in Enlightenment ideas and wrote them into documents like the Declaration of Independence, the Constitution, and the Federalist Papers. Political life, especially the political life of the new state they were creating, had no room for religion.

Around the time of the bicentennial in 1976, just as the issue of abortion was beginning to galvanize Christian voters, there was a shift in how Americans, particularly fundamentalist and evangelical Christians, told the history of the country. The Founding Fathers, though known to be Deists with little concern for Christianity, were recast as devoted Christians who had wanted to establish a new state on religious ideals. The Christian public came to view them as champions of religion who had purposely written Christian values into Enlightenment-inspired documents such as the Constitution. Not coincidentally, political action came to be seen as a religious value in the present age, just as it had (supposedly) been at the time of the founding.