A History of Liberalism
Liberalism became the dominant ideology of the West when it was adopted by Britain and the United States in the 19th century. But its origins lie elsewhere.