American liberalism is a descendant of the ideas that emerged in the Age of Enlightenment, which refuted the divine right of kings and the legitimacy of monarchical government. The idea of the people ruling as their own sovereign, with government as their servant rather than the other way around, was liberal thinking to the point of radicalism. The Constitution of the United States put the idea on paper, establishing a republic on a scale unseen since the ancient world and prohibiting in writing the creation of an aristocracy. Yet at the same time it curtailed individual liberty, disenfranchising many and acknowledging and accepting slavery.
It was liberal thinking that led the abolitionists to cry for the end of slavery in the United States. It was liberal thinking that called for publicly funded education for children of both genders. Throughout American history, the call to change government and the programs it sponsors, for the benefit of the public it exists to serve, has created an ever-evolving relationship between the government and the governed. Here are just a few of the changes to American life and history wrought by the advance of liberal thought and policy in the United States and in its relationship with the world.
1. Public education predated the American Revolution in some areas
The legislatures of the New England colonies all encouraged the towns within their jurisdictions to establish schools paid for by tax revenues, and they continued to lead the way in public education after independence. Horace Mann established one of the first systematically designed public education programs in Massachusetts, a model copied by several states during the early 19th century. Reading, writing, arithmetic, geography, and history were all included, while religion was not, raising the ire of religious conservatives, including Puritans and Calvinists.
Beyond what would today be called elementary school lay privately funded academies and schools, which prepared students for entry into colleges and universities. Publicly funded equivalents of the modern high school did not appear until after the American Civil War, again emerging largely in the Northeast. They too were denounced as liberal and as government intrusion when state laws began to make attendance mandatory, in part to end the practice of child labor. By the 1820s, schools dedicated to training teachers for the public schools had been established, to the great annoyance of conservatives who believed education should be conducted at home and in church.