Collections: The Philosophy of Liberty – On Liberalism


It is once again the week of July 4th and so, as is customary here, I am going to use this week’s post to talk about the United States or, more correctly this week, about the political philosophy the United States was founded on: liberalism. Now an immediate clarification is necessary, because in the United States especially, the word ‘liberalism’ has come to mean more broadly the left half of the political spectrum and (with no small amount of irony) ‘big government’ solutions to problems. That is not what I mean here.

Instead, what I mean when I say liberalism is its original (and broadly international) meaning: the political philosophy which first emerged fully in the early modern period and which places individual freedoms – liberty – as its central, defining value. This is the ideology of the Declaration of Independence and the political theory upon which – however imperfectly – the United States was predicated. You may fairly ask why I am using a term that is going to be confusing to some folks and the answer is: there really isn’t a better one (‘libertarian,’ as we’ll see, means something related but different). Though at the same time, American political vocabulary seems to be undergoing a shift where the Left is more comfortable claiming ‘progressive’ or ‘socialist’ as a label, while at least some factions on the right are more comfortable openly aligning as anti-liberal (or illiberal), which is leading the term ‘liberal’ to shift back to its original meaning. I don’t want to get too sidetracked by political terminology for the American political system, so I’ll put my own preferred terminology in a footnote.1

So let’s talk about it: what is liberalism, this political philosophy upon which the United States was founded? And perhaps equally to the point, why do I think that liberal principles remain crucial to organizing human affairs? Now I have kept trying to reorganize my thoughts here in a way that I like and I have not yet succeeded, so this may be a bit more of a ramble than usual – we’re going to have to walk through a bit of pre-modern societies, a bit of Greece, a bit of Rome, and also a bit of early modern Europe before we get to what I think the core of this idea is, which is the value of liberalism today, particularly with reference to the United States.

But I promise we are getting to that part!

Defining Liberalism

Liberalism often gets defined as a collection of commitments to various principles: free speech, freedom of religion, freedom of movement, free markets and so on, while liberal societies are those which institute those principles as policies. And certainly, liberals – again, in the old, international sense – do tend to support all of those things, but this sort of policy-bricolage often obscures the key first principles from which all of the policy positions derive (further obscured by the fact that ‘freedom’ is a word with a fairly plastic meaning). So I think we should start by drilling down to the root idea of liberalism.

Etymologically, of course, liberalism is the philosophy of liberty, of a certain form of freedom – but of course that just raises the question of what sort of freedom. The root here is the Latin word liber, which was a free person (technically a free man, a free woman was libera; cf. libertus/liberta, a freedman or woman), from which derived both libertas, ‘liberty’ and liberalis, ‘of or relating to a free person.’ It is that latter word, liberalis which gives us ‘liberalism’ and of course libertas that gives us the abstract idea of ‘liberty’ so in a real sense, liberalism is ‘liberty-ism.’ Liberalis implies more than just a freedom from slavery, I should note: the word in Latin includes a certain sort of dignity and indeed, generosity. Liberty is, as we’ll see, more than just ‘not being a slave,’ though it is also certainly that. Crucially, libertas is something possessed individually as much as, if not more than, by the state. For the res publica to be liberalis, the people in it must possess libertas, not merely the state (we may contrast Greek eleutheria; put a pin in that, we’ll come back to it).

And it is that distinction that brings us usefully to the particular kind of freedom: liberalism is a political philosophy which recognizes, indeed which chiefly values, individual freedom from communal constraints.

We are often so used to liberal societies – or illiberal ones that use liberalism’s language as a mask – that we miss the radicalism of that vision. As Patricia Crone notes, traditional pre-modern societies, by and large, have little space for the individual:

If society was a body and the functional orders its limbs, individuals were simply cells…like cells, they were programmed for performance of pre-determined roles and derived their value from performing as expected, not from refusing to conform. Individual interests were subordinated to and defined by collective ones…the individual existed for the benefit of the overall group, not the other way around.2

Of course it isn’t particularly hard to see how this sort of ideology might be quite attractive to the ruling elite and indeed elites of various kinds, who chose what the “overall group” valued and did. But inasmuch as we can see the thinking of the common folk of these societies, they tend to share the same assumptions, with the individual existing to fill certain roles in a society and being valued insofar as they did so in accordance with traditional notions.

By contrast, liberalism as an ideology recognizes limits on the social or communal claim on the individual.3 The individual is not merely a cog in the machine designed to produce communal ends, but rather has certain rights they may press against society, certain zones in which they have a freedom on which their neighbors and the broader community cannot intrude.

That in turn leads to all of the other principles. Free speech is the rejection of a communal claim on individual speech and expression – you can’t tell me what to say. Free markets are a rejection of a communal claim to resources and labor – you can’t tell me what to make, buy or sell. Freedom of association and an associated freedom of movement4 is a rejection both of a communal claim on where one can live but also on how one sets up one’s social life – you can’t tell me where I live, who my friends are, who I can marry or what clubs, parties and associations I can create and join. All of these key decisions under liberal thinking are reserved to the individual, who may well have certain moral imperatives to act in the interest of the community, but is not bound to do so by force or law.

In that vision, the purpose of government becomes protecting those rights, those spheres of freedom, from outside interference (for instance by other individuals or other governments), which is itself a radical reformulation of what the state does and how it is constituted. Whereas the earliest states almost invariably represented themselves as possessing authority from a divine mandate and deep tradition, here the authority of the state arises from its function: protecting individual liberties. It is a short leap to democratic forms of government – if the legitimacy of the state arises from individuals, why shouldn’t its policy do so as well? The idea of using written law to constrain the actions of magistrates and judges is an ancient one – absent a traditional governing framework, the jump to (written) constitutionalism is not a long one.5 The focus on the rule of law as a key component in protecting liberties from the exercise of arbitrary power is thus an outgrowth of the initial focus on liberties.

At the same time, democracy – ‘liberal democracy’ – is a fit for liberalism because no other form of government could be trusted to protect those key liberties. As we’ll see, the track record of ‘enlightened despotism’ at actually delivering freedom, stability and prosperity for the people is far worse than the record of liberal democracy in doing so. And it should be little surprise: the despot or oligarchy has every incentive to infringe on the liberties of any people not represented in the political system. Only a democracy, in which everyone has a political voice, can be trusted to protect the liberties of everyone and even then only when the will of the majority is constrained by the rights of the minority.

But there is an important thing to note in this: liberalism does not begin with the governing side of this equation (the democracy, constitutionalism and rule of law). It begins with individual rights rooted in natural law and reasons back to find a governing order best equipped to protect those rights. That’s important for two reasons: first, because democracies need not necessarily be liberal but more importantly because liberalism was designed to solve problems.

The Problems Liberalism Was Made to Solve

Liberalism emerged as a political philosophy at a particular historical moment. The temptation here is to conflate liberalism with democracy, rolling both ideas together as some sort of generic ‘freedom’ and attribute them to a ‘western tradition’ that stretches back to antiquity.6 Instead, while 17th and 18th century liberals – most importantly John Locke – reach back to classical ideas and language to frame their ideas (which also owe something to North European political customs stretching back into the Middle Ages), liberalism was very much a product of their early modern context.

In general, I think the key context here is actually the emergence of the modern administrative state in Europe combined with the religious fragmentation brought on by the Protestant Reformation. In the broader Mediterranean world (including Europe), prior to this period, you might have intensively governed states where the community (acting through the state) intervened a lot, but these tended, due to difficulties in communication and coordination, to be small. The Greek polis actually makes a good example: individual citizens (politai) had few if any rights the community as a whole was bound to respect. The polis was free, in the sense that it was self-governing, and the citizens were free in the sense they were not slaves, but they didn’t have liberties in the way we understand them. The Athenian demos – the people – could, famously, temporarily exile a politician for being unpopular (or too powerful) despite breaking no law, execute generals for losing battles (or even winning them) and philosophers for asking the wrong questions and being generally irritating. Likewise, property rights could be curtailed, as Greek poleis often compelled individual citizens (typically the rich) to provide state services at personal expense (called a liturgy), rather than relying on a uniform system of taxation (such as Rome’s land tax, the tributum). Athens was democratic, but not liberal.

Alternately, you might have very large states, which tended to delegate quite a lot of daily governance to local authorities and so rules and laws might differ quite a bit from one place to another. The Achaemenid Empire functioned this way, as did the early Roman Empire. It is, of course, striking that when later Roman Emperors tried to enforce a greater degree of unity on the empire, the result was actually fragmentation. But mass literacy, the printing press and bureaucracy made a new sort of government for this part of the world possible: a government that intensively governed a lot of people over a wide area. The result was the exposure of a much wider range of people to a lot more state power from a state that might differ quite a lot more from them. Absent local autonomy, the question of who ruled suddenly became extremely high stakes as rulers were in a position to try – and in the end, fail violently – to enforce their own uniformity.

On the continent, this produced the Wars of Religion (1522-c. 1700), reaching their bloody climax with the Thirty Years War (1618-1648). It is a struggle to communicate just how devastating these wars were. For many places in Europe, the Wars of Religion were, on a per-capita basis, more destructive than the World Wars (the Thirty Years War, for instance, killed about a third of the population of the Holy Roman Empire). And yet no one won. By 1650, it was clear no one had or could have the military power necessary to actually enforce religious unity in any one country, let alone all of them.

Meanwhile, England was going through the English Civil War (1642-1651, in three phases), the causes of which I think may be fairly described as complex, but which included the effort to enforce uniformity on the church in Britain (which provoked a revolt in Scotland) as well as the balance of power between the king and parliament. One thing which could not escape anyone at the time was how the winner-take-all nature of the struggle intensified its lurch to violence as both royalists and parliamentarians regarded disagreement as criminal (e.g. John Eliot’s imprisonment in 1629, the Bill of Attainder against the Earl of Strafford in 1641 and then the effort to arrest the ‘Five Members’ in 1642). The Glorious Revolution (1688) was in turn explicitly about religious tensions and the supremacy of parliament, though of course it was far less violent than the Civil War that had preceded it. I am, of course, greatly simplifying both of these; the point here is that questions of how to resolve the tension between increasingly strong states and the religious and political differences of the people they governed were very prominent in the 17th century.

Liberalism is designed to solve these problems by taking some of the key questions out of the realm of politics and instead placing them in what we may call the ‘realm of liberty’ – that is, of individual freedom. It is not an accident that some of John Locke’s (1632-1704) first major works are on religious toleration and one of his key arguments is that trying to enforce religious uniformity causes more problems than it solves, a point that he could have observed in action through much of his early life. Locke goes further in the Second Treatise, taking it as a component of natural law that people have an individual right to life, liberty and property.

Via Wikipedia, John Locke in 1697, the father of modern liberalism, though hardly the only liberal political thinker.

Now, Locke explains this notion by supposing that in a state of nature – that is, absent all of the complex social structures we’ve developed – this natural law (a concept we’ll get back to in just a second) held sway and that states were only formed as a social contract by the ruled with the ruler, such that the state existed to protect those natural rights, with the subjects ceding a small portion of liberty (taxation, conscription, basic laws, etc.) in exchange for the protection of the lion’s share of their natural rights. Now, as we’ve already noted, Locke is quite wrong about what humans are like in a state of nature and how early states formed. In early states and indeed even in complex non-state societies, the commons were generally reduced to fairly extreme subjugation, with a sense, so far as we can tell, not that this was in exchange for some protection of rights, but that the aristocracy had some inherent right to rule on their own. Early states formed not to protect rights, but as ‘violence machines’ designed to buttress internal repression and external conquest.

That said, Locke is reaching for something ancient and that is the notion of natural law itself, a concept that in antiquity reached perhaps its fullest expression among the Romans (particularly Cicero), although it had antecedents in Greek philosophy. Natural law is the idea that there is a fundamental set of rules and rights which apply to all humans, in all places, at all times – a code against which human action may be judged with universal application. This is often justified on religious grounds as a code of conduct set down by God (or the gods), but it need not be: Cicero, while fully admitting the existence of divinities, argues that natural law instead is a product of the existence of reason and not a creation of the gods. Indeed, Cicero goes so far as to argue that natural law binds the gods too.

This is actually quite important because of its universalizing nature. Many ancient religions represented laws as handed down by the gods, but for polytheistic faiths where the gods were local and particular to a people or a place, those laws were particular to peoples and places too. A king, too, might make laws, but these bound only his subjects. In both cases, a people over the hill might well have different laws and crucially need not merit the protection of your laws. By contrast, Cicero imagines the ius naturale as applying to everyone; this even had expression in Roman law in the form of the ius gentium, a baseline law code that applied to all free persons. That said, as I discussed in the post on Cicero and natural law, his conception of it has limits and failings. On the limits, he imagines natural law primarily as a code that binds, rather than a set of rights that protect. And of course Cicero himself never fully absorbs the implications of his philosophy: a wealthy Roman slave-holder, it never occurs to Cicero that perhaps he daily violates the natural law by keeping people in bondage.

The other influence here that has to be noted is a tradition of independence and rights among the English and Scottish elite – mostly the nobility – going back to Magna Carta and even further to older ideas of the inherent dignity and independence of Big Men in non-state or early-state societies in North Europe. This also has a classical parallel, the Roman concept of libertas, which has two conjoined meanings: libertas is the absence of being enslaved, the state of being politically free (or freed), but also individual freedom from interference, held by right rather than by the indulgence of some well-meaning master.7 Libertas in this sense is invoked against the use of arbitrary state power (typically actions by magistrates) against individual citizens. In short, then, there were things that the agents of the state could not do to a Roman citizen. The checks on state action – most of them associated with the tribunes (provocatio, auxilium and so on) – were in turn the institutions which created and guarded this sort of libertas. In that sense it matched up with the older North European sense of a right of the Big Men to be free from certain sorts of interference from a king.

I should note that both this North European ‘freedom for Big Men’ concept as well as Roman libertas are aristocratic rights, albeit the former more than the latter. Regular Roman citizens certainly could invoke libertas and sometimes did so, but the word was more often a rallying cry for the elite and the tyranny they invoked it against was the arbitrary rule of an individual over the Senate (a body of aristocrats) rather than over the Roman popular assemblies. In both cases, by tying the question to natural law, Locke is taking something that was a privilege of a few – of aristocrats or citizens – and broadening it out to everyone (though, of course, ‘everyone’ here will be read more narrowly than we do today).

That universalizing nature brought in through natural law brings in a value in liberalism that may have been, until now, conspicuous by its absence: equality. If everyone is equally subject to natural law (and that’s what it means for something to be natural law) and that establishes rules and rights, then it becomes difficult to justify rigid classes and orders of people, hereditary nobles or established clergy with special privileges that mark them as better. Instead, liberals will insist, “all men are created equal,” in the very particular sense that they share equally in the rights granted by natural law (and for the religious, thus created and sanctioned by “nature’s God”).

But notice how equality is a secondary value of this system, a derived value – a corollary, rather than a first principle. Liberalism, as an ideology, insists first on rights and second on the fundamental, universal natural state of those rights and only then as an unavoidable consequence, the fundamental equality of the people who have those rights. Which is to say, liberalism values equality in so much as it is necessary for liberty, but not as a value in and of itself in all of its forms, which is why liberalism is relatively tolerant of economic inequality.

Indeed, at some extreme point, the ends of liberty and equality must conflict: to produce perfect equality, a society would need to trample a lot of liberalism’s core rights, not merely to property, but to speech, religion and expression too. We should not overstretch this idea: the point of outright conflict between those values is only at the extremes and just as no liberal democracy practices truly “unfettered capitalism” in today’s world, so too no state is even seriously attempting to enforce true “equality of outcomes,” so to speak. Equality and liberty are not opposites (indeed, most who hold to one value also value the other quite highly), but they exist in a sort of tension. In a quite real sense, social democracy, as practiced (for instance, by some European countries) is an effort to resolve this tension or at least to find a balancing point, as it turns out that a society can ‘buy’ quite a lot of equality before infringing seriously on liberalism’s many liberties; pretty much all modern liberal democracies, including the United States, aim to strike some kind of balance in this regard.

In all of this we should understand Locke as looking to solve a problem created by modernity and reaching back to cultural values and ancient ideas (natural law and liberty) to do so, and then positing (incorrectly) an idealized past in which this principle held sway for all, imagining that he was rediscovering a principle that he and other liberal thinkers like him were, in fact, inventing. Now of course England and Scotland, shortly to become Great Britain (and then the United Kingdom), have their own journey along the path of liberalism – in a few important ways, a swifter one than the United States – but this is a post about the United States for the Fourth of July, so we will now leave our British friends (but take their ideas) and bring them to North America.

Liberalism and the United States

All of this background – the classical precursors and influence, the early modern context, the 17th century thinkers – matters for the United States a great deal because this was the political philosophy ‘on the shelf’ when it came time to create a new country with a new form of government.

It is, I think, all too easy once again to miss the radicalism of this moment. In 1776 there were no governments founded on liberal ideas. There were a handful of European republics (the Old Swiss Confederacy and the Dutch Republic, most notably) which were pre-liberal in their structure and ideology; both continued to have a nobility in this period, for instance. But liberalism was still a new idea, slowly transforming Britain – a process very incomplete in 1776. The idea of founding a new country on the liberal notions that, “all men are created equal” and “endowed by their Creator with certain unalienable Rights” was radical in 1776 – indeed, so radical the men who wrote those words, signed their names, and pledged their “Lives […] Fortunes and […] sacred Honor” fell far, far short of putting their explosive ideas into full practice. Like most ideals, liberalism was only attained in halting, half-steps.

It was, among other things, a radical enough document to have its publication suppressed by various European monarchies for decades; the text of the thing was banned in Russia for eight decades and in Spain for nine. In asserting the fundamental equality of mankind, in denying the divine right of kings – who only, in the document, derive their just authority from the consent of the governed – the Declaration presented an explosive set of ideas. Indeed, a set of ideas that would explode in France not too many years later.

But these ideas were also going to be an important part of holding the new country together. The Thirteen Colonies preparing to fight for their newly independent lives against the British Empire were hardly homogeneous. After all, the territory of the colonies had been subject to not just British, but also French and Dutch colonial settlement, along with waves of immigrants from the German states and other parts of Europe. The southern states were mostly Anglican, in many cases with an established Church of England, but other protestant and reformed denominations had strong presences: Dutch Reformed (particularly around New York), Lutheranism in areas with German settlement, Quakers and Mennonites in Pennsylvania, and growing numbers of Baptists and Methodists. There were smaller but still significant numbers of Jewish and Catholic Americans too. Attempting to enforce cultural, religious or linguistic uniformity8 would have only led to the same kind of shattering wars Europe had experienced a century prior.

Liberalism was the answer: take all of those explosive issues and move them from the realm of politics to the realm of liberty to avoid ripping the new country apart in one sectarian war after another. This became an even more pressing issue, of course, when it came time to contemplate moving from the weak and loose Articles of Confederation to a much stronger government under the Constitution.

Now it is common to point out that after its issuance, the Declaration of Independence and its opening statement of liberal values didn’t really assume its current status as a nearly co-equal founding document with the Constitution until the Civil War, which is, so far as I can tell, true enough but misses the point. While the Constitution lacks overt statements of liberal principles, the Bill of Rights in particular stands as an effort to implement core liberal principles into the bedrock of American governance. They establish a series of what we might call ‘realms of liberty’ over certain aspects of life, removing certain things from either the exercise of arbitrary power or in some cases from politics (initially just at the Federal level, later from all politics) entirely.

I’ve commented before that the First Amendment’s first two clauses are effectively the ‘don’t have a Thirty Years War’ proviso of the Constitution, by assigning religion completely to the realm of liberty. The next four clauses neatly remove some of the key flashpoints in the lead up to the English Civil Wars: no arresting people for political opinions, speech, or efforts to bring up grievances. Some key property rights – which remember, are at their foundation a limitation of communal claims to resources and labor – get protected in the Second (arms), Third (quartering), Fourth (searches and seizures) and Fifth (takings clause) Amendments. The Fifth Amendment effectively blocks something like Athenian liturgies or the expropriation of the property of political enemies. Indeed, so concerned is the Bill of Rights to create that large ‘zone of liberty’ that it explicitly clarifies that anything the Constitution doesn’t explicitly say the federal government can do, it cannot do (Ninth and Tenth Amendments). Liberalism’s system of liberties was designed to solve problems, and solve problems it does!

Property rights are understandably a sticking point for many left-leaning folks, but they are core to the success of liberalism, because sharply limiting the communal claim on private property lowers the stakes of politics: losing an election cannot directly cost you your home and everything you own, for instance. Yes, policy can have indirect effects that make it easier or harder to, say, own a house (we should make it easier), but that’s a far cry from a Bill of Attainder or a law restricting land ownership to a certain class. And absolutely, politics has not always been so ‘low stakes’ for all Americans – but that is, to be frank, because they were excluded from the benefits of liberalism, discriminated against by laws that were insufficiently liberal. If I want everyone to be secure in their person and possessions, the solution is not to abolish property rights, but to make sure they extend to everyone equally. As we’ll see, the alternative to private ownership is typically not ‘no one owns things’ but rather the state owns things and you do not.9

Of course both the Constitution and the Declaration, as documents of liberalism, contain in them serious defects, the most obvious being the institution of slavery and the exclusion of women. Both of these were clear breaks with the ideals of liberalism and recognized as such at the time (the former more than the latter, but note, for instance, Abigail Adams suggesting in 1776 that women too ought not to be subject to laws they have no voice in). This is not shocking: few political ideologies are realized in their completeness at the outset and the journey to freer and more just social structures has been a long one. Notably, slavery and misogyny were not defects particular to the United States in 1789: most countries had some form of slavery and no state admitted women to full political participation at that point. What was particular was that these represented betrayals of the principles those very documents proclaimed: the crime was common, the hypocrisy was special. That is not an exoneration, but it is an important observation.

The American Civil War thus became a fundamentally liberal war, particularly by 1863, against a fundamentally illiberal movement. Indeed, liberal thinkers abroad (e.g. John Stuart Mill) recognized this fact. One thing that comes out quite strikingly in the letters, diaries and memoirs of the Civil War is the steadily growing commitment of the United States’ cause not merely to the Union itself but to the fundamentally liberal goal of abolition – what begins as a whisper ends the war as a battle cry.10 One thing, I will note, that is very striking is how marked the impact of African American soldiers was on that attitude. Howell Cobb, one of the founders of the Confederacy, famously remarked that, “If slaves will make good soldiers, our whole theory of slavery is wrong.” Cobb couldn’t conceive of slavery being wrong, but United States soldiers watched African Americans make very good soldiers and it very clearly accelerated their conclusion that, indeed, the whole theory of slavery was wrong.11 Lincoln added what was, in effect, a liberal war goal in September, 1862, but it was his address on the blood-soaked fields of Gettysburg which transformed the war into a liberal crusade, one that, like the nation it fought to preserve was, “conceived in Liberty, and dedicated to the proposition that all men are created equal.”

However, if it took the United States time to realize it was in a war for liberalism, it took the confederates no time at all to commit to a war to reject liberalism. Here, quite famously, is Alexander H. Stephens, Confederate Vice President, describing the constitution of his new country:

Those ideas [of the United States Constitution], however, were fundamentally wrong. They rested upon the assumption of the equality of races. This was an error. It was a sandy foundation, and the government built upon it fell when the “storm came and the wind blew.”

Our new government is founded upon exactly the opposite idea; its foundations are laid, its corner-stone rests, upon the great truth that the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.

The Confederacy was thus not just a revolt against the United States but a revolt against liberalism. Indeed, it is hard in the ideology expressed here not to see a grim precursor of Nazi ideology.

Fortunately – and this will shortly become a theme – the economic dynamism of a liberal society gave the United States quite a few important advantages against the illiberal Confederacy, both in economic production and also in manpower. At the same time, the very “self-evident” nature of the liberal cause of the United States made it difficult – impossible, in the event – for the Confederacy to secure meaningful foreign support.

That was, of course, hardly the end of the United States’ struggle to more fully realize the liberal values of its founding documents. The end of Reconstruction and the implementation of Jim Crow was a setback for liberalism; the success of the Civil Rights movement was a success for it. The campaign for Women’s Suffrage was, too, a fundamentally liberal one. I would argue likewise the campaign to extend legal protections and marriage rights to LGBT folks was a fundamentally liberal one: you may hold whatever moral views you like and say whatever you may about them (because those too, are essential liberties, particularly religious liberties), but the question of who can marry whom (among consenting adults) was shifted from the realm of politics to the realm of liberty.

And this is the great success of liberalism as a governing ideology: by taking so much of life and moving it from the realm of politics to the realm of liberty, it makes it possible for a big, pluralistic, diverse, dynamic country like the United States (or indeed, almost any large democracy) to function effectively. The fact is we will all never see eye to eye on everything, but we don’t have to so long as we respect each other’s liberties. Which isn’t the same as saying one has to give up on their morals: persuasion and evangelism remain open options in a liberal society, indeed more open than in any other. And in a sense, the good morals and character of the people, their values, matter in a liberal society more than any other as well; liberalism need not be libertine unless we make it so. So often illiberal movements spring from a sort of moral cowardice, a failure of confidence that one’s ideas could win the contest of persuasion on a fair playing field, so one must resort to force or violence.

Which brings us neatly to:

Liberalism At War

The First World War marks a key turning point in this narrative. While liberalism had advanced (in fits and starts) in Europe and the Americas prior to 1914, most of the great powers remained decidedly traditional regimes. They had survived the great liberal surge of 1848 and continued to predicate their legitimacy on the old ruling authorities of tradition, church and crown.12 The First World War not only broke all of the true ruling monarchies of the European Great Powers, it also discredited them, demonstrating that these traditional forms of government were not up to the task of modern war, while the liberal governments of Europe were. That is a sharper blow than you might at first guess, because the legitimacy of kings and aristocrats was predicated, often explicitly, on the ability to prevail in war and provide political stability. They had proven able to do neither, while the supposedly weak and decadent liberal powers stood triumphant. The blow was fatal.

Via Wikipedia, a War Bonds poster from the First World War, “Weapons for Liberty,” featuring Columbia as the personification of the United States (rather than Uncle Sam, who comes to predominate a bit later).

Of course as a result, the 20th century was in many ways defined by a (messy, confused and morally compromised) struggle between liberalism – itself once a radical idea but now associated with the status quo – and new radical alternatives to both liberal and traditional government, most notably fascism and communism.

And I want to push back, for a moment, on some of the Left-mythology about the early stages of this conflict, where one often hears it asserted as self-evident truth that the liberals failed to resist the fascists, leaving the conflict to the Soviet Union. Such a position was ideologically necessary for the USSR because the state ideology insisted that fascism was the logical end-state of capitalism (and thus liberalism), but ideological necessity does not make truth. Part of the issue with this line in the United States is a conflation of the traditional American center-right (free-market liberals) with the Weimar right-wing – figures like Franz von Papen – who were, in fact, monarchist anti-liberals, so when someone says “Hitler used an alliance with the conservatives to seize power” they are leaving out that Weimar’s conservatives were ideologically very different from the traditional GOP or Tories. Weimar Germany simply lacked a strong enough liberal political movement to do much either way, while Moscow famously directed the Communist Party in Germany to focus on destroying the Social Democrats, rather than the Nazis, in the run-up to Hitler’s seizure of power.

In practice, of course, it was the liberal states of Europe which went to war to attempt to stop Hitler’s conquest of Poland, while the USSR joined WWII as an ally of Nazi Germany. And, indeed, from 1939 right up to the day before Operation Barbarossa, the Soviet Union supplied Hitler’s war machine with the raw materials it desperately needed to prosecute its war against democratic Europe.13 That does not, of course, change the fact that once Hitler invaded the Soviet Union, it was Soviet troops that did most of the fighting and dying in Europe; this is certainly true. But depicting the liberal democracies as haplessly standing by is a distortion. Before it was directly attacked, the United States embarked on an increasingly ambitious program of supplying and eventually effectively funding the effort of just about anyone who would fight the Axis, at the very time the Soviet Union was Nazi Germany’s most important trade partner. Britain was still fighting its lonely war against Nazism when Stalin was Hitler’s best supplier. When the time came to fight, the liberals fought.

Via Wikipedia, a U.S. war poster from the Second World War, extolling the English (and British more broadly) as partners in a fight for freedom.

For the United States, World War II became itself a war for liberal principles. Frank Capra’s Why We Fight (1942-1945), America’s answer to the Nazi Triumph des Willens (1935), explicitly adopts this framing, presenting the war as an effort to defend the ‘free world’ from a ‘slave world’ defined by Nazi, fascist and imperial Japanese tyranny. Eisenhower’s D-Day speech echoes the same ideas, that “The hope and prayers of liberty-loving people everywhere march with you…you will bring about the destruction of the German war machine, the elimination of Nazi tyranny over the oppressed peoples of Europe, and security for ourselves in a free world.” Of course there was hypocrisy in this: Britain was a colonial empire, the United States certainly had some colonies of its own and Jim Crow and other forms of racial discrimination still cruelly restricted the liberties of many Americans. But the United States had nothing like the extermination and slave labor camps of Nazi Germany (or, for that matter, the slave labor camps of the Soviet Union). Perhaps even more importantly, the western allies at least committed to the ideals of liberalism – however imperfectly realized – whereas the Axis actively opposed them (as did the USSR).

It is an easy and careless mistake to assume that the incomplete realization of the principles of liberal societies makes them the same as anti-liberal societies rejecting those principles entirely. Like the American Civil War, World War II was a liberal war, a war for liberalism against anti-liberal regimes and ideologies.

Of course the Second World War’s ideological contest was followed by the Cold War’s ideological contest, between governments founded on liberal ideals (even though they often fell short of them) and governments founded on Marxist ideals (which, I think it is worth noting, they fell short of too).14 In the end, the triumph of liberalism in that contest was unmistakable. Even still-putatively-communist regimes like the People’s Republic of China tried to reform their economies along more liberal lines (while not reforming their political systems), a point we’ll come back to in a moment, but also an obvious admission of defeat.

Liberalism and Its Discontents

This makes a good moment to assess the track record of liberalism from the emergence of the first liberal state (arguably in 1789) to the present. Liberalism, after all, was supposed to solve problems. Did it?

And compared to other political systems liberalism has been fantastically successful. Gauging national success is always a bit tricky, because there are many metrics across which it can be measured. In terms of the overall quality of life, something like the Human Development Index provides a cross-section of life expectancy, education and access to economic resources. Here the performance of liberal governments is astounding. Keep in mind, liberal democracies have never made up even half of the world’s countries, whether by number of states or by population. Yet of the top 30 states and territories by HDI, 26 of them are clearly liberal democracies – the exceptions being a tiny city-state (Singapore), a city-territory (Hong Kong), a small petro-state (the UAE) and then the giant snarl of a question that is Israel, which we needn’t and won’t get into here (or in the comments).

One may, of course, argue causation – that it is rich countries which are liberal, rather than liberalism that makes a country rich (or alternately that rich liberal countries are only so because they held large illiberal empires), but here we have some interesting natural experiments. Germany and Japan were both shorn of their empires and bombed into ruin before having liberal governments effectively imposed upon them; they are now the third and fourth largest economies. And of course the German example is even more instructive because after the ruin, the country was split in half, with one half getting a liberal government and the other half an anti-liberal communist one and the difference in economic performance was so vast that decades of narrowing under a united, liberal and democratic Germany have still left East Germany somewhat behind. Likewise, while the roots of rapid economic growth in South Korea and Taiwan date before they adopted truly liberal and democratic political systems, it is after that political shift (in c. 1987 for both) that their economic growth takes off.

Liberalism has proven a better system for providing a high standard of living to the people than any other thus far, particularly given that in nearly all liberal democracies, the free markets of liberalism are paired with social programs for the poorest, funded out of the vast economic productivity of the dynamic sort of economies liberalism creates.

Liberal governments are also generally extremely stable, though the messiness of their day-to-day politics often obscures this. The United States has been ticking along under a single constitution for 235 years, slowly moving closer and closer to realizing the liberal ideals of its foundation. The United Kingdom has been stable even longer. Indeed, liberal democracies are so stable it remains unclear if a ‘consolidated’ liberal democracy has ever deconsolidated absent external conquest, as every clear example of internal ‘democratic backsliding‘ has occurred in young and incomplete liberal democracies.15 One of the great lies of authoritarian states is that the strongman-ruler is necessary to deliver stability as compared to the messiness of democracy, but as demonstrated in WWI and ever since, the liberal democracies are actually more stable under pressure.

Liberal governments are also generally quite good at providing security for their people. Their highly productive economies tend to mean that even relatively small liberal states have the potential to be militarily formidable, while the confluence of interests tends to naturally push liberal democracies into a powerful coalition of rich-and-free countries that provides tremendous mutual security. Liberal democracies tend to win their wars, especially defensive ones; the wars they lose tend to be smaller wars of distant foreign intervention, for which their voters have less tolerance precisely because the security risk in such wars is much lower. There is an irony that precisely because liberal democracies tend to have a lot of internal strength, they don’t feel the need to engage in a lot of the legitimacy-building showing off that autocratic regimes do.

At the same time, over the last two centuries and change, one autocratic regime after another has assumed that liberal ‘nations of shopkeepers’ with their diverse ‘mongrel’ populations would prove weak in war because they did not conform to the autocratic vision of false strength. Go look for them now.

In short: liberalism works.

Which doesn’t mean that liberalism doesn’t have its challenges and discontents. Indeed, the tide of liberal expansion that began in the 1980s as the USSR shuddered and then collapsed has begun to flow out again. Most liberal democracies now have openly anti-liberal parties or political movements, often with distressing strength.

I don’t think this should necessarily surprise us. After all, contrary to John Locke, liberalism is not some innate impulse that we have – our natural rights may be inherent in our being, but the ideology of extending those rights to all is not. For most of our existence, humans lived in relatively small, close-knit communities that were effectively totalitarian; the lives they lived may have been brightened by the community, but they were also poor, short and full of misery. By contrast, the dynamic and pluralistic nature of liberal societies can be alienating (even as it enables us to live rich, full lives), as it is, in a sense, alien to us. But of course education, reading, medicine, and technology are all equally alien to our primitive natures, things we must learn rather than grasping inherently. Alien here does not mean bad.

In our current world, we also see illiberal regimes attempting to present an ideological challenge to liberalism and some people from liberal societies are wont to fall prey to these visions, usually because these new authoritarians profess to hate the people they hate. But the actual performance of these new authoritarian regimes is unimpressive and their promises are lies. After years of repression and military investment, Russia’s personalist authoritarian model cannot move faster than a snail’s pace across Ukraine, a country with a fourth of its population and a tenth of its GDP. Hungary’s self-proclaimed ‘illiberal democracy’ is one of the worst-performing economies in Europe.

For a time, the economic growth of the People’s Republic of China, bolstered by its massive population and substantial resources, was the counter to this narrative, leading to talk of a ‘Chinese model’ of authoritarian economic success. However the Chinese economy has begun to struggle precisely because the lack of liberalism in the political sector has created a government unwilling or incapable of adapting to changes in the global economy and China’s new place in them. Instead, China appears in danger of stalling out as a middle-income country, with a purchasing-power adjusted per-capita GDP today of just $25,000, a middle-of-the-pack figure that would be embarrassing for almost any major liberal democracy, plus a looming demographic crisis caused in part by China’s illiberal policies (and of course, which China cannot even partially ameliorate with immigration because of, wait for it, illiberal policies). Instead, much of China’s vast talent remains trapped behind a stifling hukou household registration system and an increasingly closed oligarchy walling itself off behind state-controlled schools tied to those registrations.

So while there are many people in today’s liberal democracies who are tempted by this or that form of illiberalism, liberalism today doesn’t so much face challenges as it faces cautionary tales, sometimes shrouded in just enough lies and propaganda to seem appealing at first glance. Instead, the greatest political strain on most liberal democracies is the same one: vast hosts of hopeful people clamoring desperately to get out of their illiberal home countries into liberal societies where they know they can be free, prosperous and secure.

Of course all of this comes back to the United States. As imperfect as it was, the United States was founded as the world’s first liberal country and – with some notable exceptions – for most of its history, American politics has consisted of debates within liberalism (so much so that the definition of the word could drift precisely because defining an American politician as ‘liberal’ in the old meaning didn’t much matter). Indeed, in this sense we’re using here, nearly every major presidential candidate in my lifetime was fundamentally committed to liberalism, however imperfectly implemented. Ronald Reagan could speak of the beauty of liberty and immigration, Barack Obama on the value of free markets and free trade, George W. Bush on the importance of religious pluralism. Liberal values were the given foundation from which other arguments proceeded.

But of course you all saw the word ‘nearly’ there and have guessed where I am going. But before some of you get angry, let me ask – when I wrote ‘nearly,’ why did you already know who I meant?

Because he promises to be dictator (for a day, he says – because how many dictators have left after one day?). Because he amplifies calls for military tribunals to prosecute political opponents for speech. Because he suggests that he could round up large numbers of immigrants using the military in violation of the laws because “these aren’t civilians.” Because he used a grab bag of federal law enforcement agencies to violently clear a peaceful protest so he could have a photo-op. Because the president of the Heritage Foundation, the leading pro-Trump think tank, is promising a “second American Revolution” (though the Trump campaign, perhaps realizing this rhetoric was a bit too incendiary, backed away from it). Because he openly aligns with anti-liberal political movements abroad. Because he riled up his supporters to use force to prevent the peaceful transfer of power.

Whatever you may think of this fellow or of his opponent, do not tell me that this fellow speaks from within the tradition of liberalism. He does not. Perhaps some of his supporters do, or imagine he does, but he does not.16

And what I hope to have shown here, as a historian, is both that liberalism was created to solve very real problems and that liberalism works. Perhaps tomorrow we will devise another, better system of government than liberal democracy, but we haven’t done so yet. A liberal political order, buttressed by democratic governing institutions, is the only system that reliably delivers the opportunity for the ‘good life’ – life, liberty and the pursuit of happiness. Consequently, my opinions on everything else, on marginal tax rates, trade policy, defense spending, environmental regulation, and so on – and I have strong opinions about at least some of those things! – come second to ensuring the United States remains within the tradition of liberalism.

Because outside of that tradition is a wasteland of oppressive, authoritarian regimes, endless sectarian wars and economic stagnation. And perhaps even more importantly, a liberal society which leaves choice to the individual, provides the greatest opportunity of any for dignity, virtue and honor, because without choice, without liberty, our actions cannot express any of these.

I do not think that the United States is one election away from collapse; indeed that sort of idiotic hyperbolic panic is far too often used to justify the very illiberalism that concerns me. The United States has survived lawless populists before. At the same time, elections matter, especially in the long term as one public choice mounts on another. Freedom can go backwards, liberties can be curtailed, as they were at the end of Reconstruction – liberties delayed by nearly a century.

Liberalism is the American tradition: it has graced our best virtues, motivated our most righteous victories, illuminated our deepest flaws, animated our greatest rhetoric, elevated our finest documents; it has defined our country. It is a tradition we now share, of course, with a great many good friends abroad, because liberty is a thing which grows stronger when shared and weaker when hoarded. And whatever tribulations, economic hardships, political headwinds, frustrations and sincere differences we have, we should not surrender it to any man who claims that ‘he alone can fix it,’ if only we tender some of our liberties (or some of our neighbor’s liberties) first.

It is rather, as Lincoln reminds us, for us to be here dedicated to the preservation of that government of the people, by the people and for the people, the one conceived in liberty. It is for us to assert again and again, the self-evidence of these truths we hold, to guard the unalienable Rights of our neighbors and to preserve the liberal form of government which in turn ensures the Life, Liberty and Happiness of us and our posterity.


Human history in the very long run


As we often do on Federal holidays, we’re unlocking an older paid post (today’s is from December 2021) that many of our current subscribers haven’t yet had a chance to read. We’ll be back with a fresh mailbag tomorrow. In the meantime, have a great 4th of July!

Like a lot of people who write about politics, I’ve been interested in history for a long time.

That’s mostly meant the history of relatively recent events in the west — things like Eric Foner’s book on the ideology of the early Republican Party or nationalism and revolution in the 19th century Habsburg Empire. But more recently I’ve been getting interested in some topics in the distant past like the origins of the Indo-European languages, the origin of states, and ancient DNA research into prehistory.

These are interesting topics in their own right. But they also cast more recent history in a different perspective, especially the history of economics and living standards. Or I guess, to put it another way, thinking about the distant past underscores that recorded history proper is a relatively small portion of the total story of humans, and the modern era of technological dynamism and rising living standards is itself a small portion of recorded history.

And when you start to see it that way (or at least when I do), your level of certainty about the meaning of relatively recent, relatively small-scale trends starts to go down. We mostly take for granted a tendency toward steady progress and rising living standards that is, in reality, not at all a constant feature of the human story.

Of course technological progress is something that’s been happening this whole time. Indeed, human technology long predates our species Homo sapiens, with the Oldowan stone tool technology invented by Homo habilis and later inherited by Homo erectus. Erectus eventually advanced to the Acheulian stone technology, but the move from Oldowan technology to Acheulian took about a million years.

Left: Oldowan stone chopper; Right: Acheulian handaxe

What’s so striking about Oldowan technology’s million-year run is not just that it’s a long time — it’s that this is longer than Homo sapiens have been on the planet (roughly 300,000 years). And Acheulian technology, again, lasts longer than the entire duration of our species. Homo sapiens is just incredibly new, not just relative to geological time or the history of the universe, but relative to the basic practice of upright-walking, tool-using apes roaming the earth in hunter-gatherer bands.

We’re evidently much better at inventing stuff than our Homo predecessors were because in our time on Earth, we’ve done a lot better than one or two upgrades of our basic stone tools. But even within the sapiens era, there is an incredible telescoping of technological progress.


For most of time, not much happened

Our species is about 300,000 years old, and farming and towns started about 12,000 years ago. The vast majority of the history of human technology unfolded before the existence of our species, and the vast majority of our species’ existence was prior to permanent settlements.

And per this Luke Muehlhauser chart, it was really only starting 200 or 300 years ago that we see any kind of sustained upward momentum in living standards.

It’s not really that living standards never changed before the industrial revolution. For the sake of legibility, Muehlhauser’s chart covers about 3,000 years rather than 300,000.

If you did go all the way back, you’d find that the neolithic revolution — when people turned to agriculture — was also a big deal. The problem is that, as Jared Diamond and now many other scholars seem to agree, agriculture made living standards lower rather than higher. Or if you want to be fussy about it, you can agree with James Scott that the problem was not agriculture per se but grain.

What’s nuts to me, though, is that this whole agricultural era is just so damn short in the context of human history. Holden Karnofsky made a nice chart of life getting worse and then better, but to make it look nice he puts aggregate lives lived to date on the x-axis rather than something more conventional like time.

It’s a fun move and certainly a much nicer chart than if you’d tried to do it to scale by time. That said, I think the extreme paucity of meaningful technical progress during the majority of human history is an important piece of context. As is the fact that one of our key breakthroughs took us backward.

The pernicious logic of agriculture

The world of anti-farming takes has gotten mixed up with paleo bros and weird crank diets, so it’s worth stepping back to take a broader view of things.

In neoclassical economics, we have two factors of production — capital goods and labor. The original classical economists were wiser and had a third factor — land. Hunter-gatherers didn’t have a lot of capital, so for them, land and labor were the key factors. If you were someplace where there’s fruit, you could get yourself some fruit by deploying a little gathering labor. Similarly, if there’s meat, you could get some meat by deploying some hunting labor.

But because you get the non-metaphorical low-hanging fruit first, your gathering labor has sharply diminishing marginal returns. If you hunt, you kill some animals and scare the others off. The land itself is just not very productive, which is why you’re living in nomadic bands. But the non-productivity of the land also means that you only need to work so hard because there’s really no point in putting in more hours. For analog-era journalists, the number of column-inches of copy the editors would print was constrained by the number of ads that were sold — there just wasn’t that much demand for content. On the internet, there’s no objective limit on the number of articles you can afford to run, so there is a lot of pressure to publish at high volume. Nomads work hard — up to a point — but then they stop.
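The diminishing-returns logic above can be sketched with a toy production function. This is purely my illustration – the Cobb-Douglas form and all the numbers are assumptions for the sake of the example, not anything from the post:

```python
# Toy Cobb-Douglas production function: output from a fixed stock of
# foraging territory and a variable amount of gathering labor.
def forage_output(land: float, labor: float, alpha: float = 0.5) -> float:
    return (land ** alpha) * (labor ** (1 - alpha))

land = 100.0  # fixed territory; hunter-gatherers can't improve it

# Marginal return of each additional hour of gathering labor:
marginals = [forage_output(land, hours + 1) - forage_output(land, hours)
             for hours in range(1, 6)]

# Each extra hour yields less than the one before -- so past some point
# there's no reason to keep working, which is why nomads stop.
print(all(later < earlier for earlier, later in zip(marginals, marginals[1:])))
```

With land fixed, each successive unit of labor adds less output than the last, which is the formal version of "there's really no point in putting in more hours."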

The point of farming is that you can generate radically more calories per acre than by roaming through the wilderness.

But in general, this just results from the factors being complementary — you can make the land much more productive by working really hard. I don’t think this is true on a strictly uniform basis. I once met a family that owns an olive farm, and they described it as a pretty chill situation outside of peak harvest time. But even though olives are delicious, an olive grove is not generating a lot of calories per acre. A rice paddy, by contrast, has incredibly productive land. Today we’re not limited by our local farming capacity, but the long arm of the past is still with us, and population density is incredibly high where the highly productive rice agriculture was.

But there are two problems. One is you need to work insanely hard to maximize the productivity of that field.

The other is precisely in that population density. If some new farmland opens up, you probably start out on easy street — plenty of places for some orchards, pastureland where the animals can just chill until you kill them, plus a little agriculture. You’re doing great, and you’re having more surviving children than your hunter-gatherer ancestors. But that just means the population grows, so you need to replace the orchards with more productive land use that also requires more work. And then you’re scaling back the pasture too, because a field of wheat can produce a lot more calories if it’s directly consumed by humans than if it’s a field of grass that’s consumed by cows who are in turn consumed by humans. But now you’re working crazy hard and eating worse food than a hunter-gatherer.

The good news is that by having a settled lifestyle, you can accumulate possessions. The bad news is all your stuff can be stolen.

The road to serfdom

This is Scott’s point about grain specifically: because it’s harvested at a specific time and then dried and stored, it’s ideal for stealing. You could show up at somebody’s radish field and dig up their radishes, but then you’re actually doing some of the work of running a radish farm.

But you can cart off a ton of rice or barley or millet or corn and make a nice living as a bandit. Or you could become one of Mancur Olson’s “stationary bandits” who decides to start calling the stealing “taxes.” Or maybe the village organizes a squad of trained and equipped fighters to stop bandits from stealing your stuff, but then it turns out you’re paying protection money to the fighter class. Either way, in addition to people working harder than before, there’s this new class of people who don’t grow the food and just take stuff.

To the extent that there’s a virtue to the agricultural revolution, this is it:

  • By making the land much more productive, you allow the aggregate size of the human population to become much larger.

  • By turning most people into sedentary sitting ducks, you allow the creation of a surplus-extracting exploiter class who are much richer than the richest hunter-gatherer.

When I first read Derek Parfit’s “Reasons and Persons,” I viewed the “repugnant conclusion” thought-experiment about shifting society to one with a much larger population at considerably lower standards of living to be interesting but also a kind of weird hypothetical. But one of the signature developments of our history really did have that character — pushing the median living standard down considerably but raising the number of people dramatically.

But the exploiter class is really important to the legacy of this period. When you first hear the idea that agriculturalists had lower living standards than hunter-gatherers, it seems absurd. You think about Classical Greece, Ancient Rome, the Umayyad Caliphate, Henry VIII, and everyone who comes to mind seems to be living a lot higher on the hog than a hunter-gatherer. In dramatic renderings of Medieval or quasi-Medieval settings, whether we’re talking “Game of Thrones” or the recent “The Last Duel” (which is really good and I wish more people had seen; check it out when it comes to streaming), the focus is almost always on members of the elite minority. They’ll show you class conflict between relatively more- and relatively less-elite members of the elite, but there’s little dwelling on the large majority of the population who were serfs or peasants of some kind.

The industrial era

Experts disagree somewhat as to when to date the start of the takeoff, but even though we had steady technological improvements during the agricultural era, it’s not until the mid-18th century (at best) that we see sustained improvements in living standards. Before that, whether we’re talking about inventing ironworking or windmills or whatever else, progress is real. But, it is slow enough that population growth catches up and median living standards collapse back to their sub-hunter-gatherer level. All the gains accrue to the extractive, exploitative elites who are able to confiscate more surplus as the bulk of the population is pushed back to subsistence levels.

The sustained increase in agricultural yields and industrial productivity of the past 250 years hasn’t been like that.

The typical English person in 2021 has higher living standards than the typical English person of 1921, who in turn was better off than the typical English person of 1821, who was probably better off than the typical English person of 1721 (though this last one is a little less clear). The 2021 > 1921 fact is even true for the typical resident of the world. But note that on a global scale, there’s no real sign of sustained progress until the second half of the twentieth century.

The dark portrait that Marx and Engels painted of the industrial revolution as immiserating the working class was completely wrong. But it’s easier to understand why you might have made that call given all prior technological improvements had, at best, led to the growth and enrichment of an extractive elite. There’s even an account from Robert Allen holding that British working-class wages didn’t start rising until around 1840, so the immiseration story was even true as a direct observation of the early industrial revolution.

But stepping back, it turns out that the industrial revolution in the North Atlantic world and then the spread of prosperity due to decolonization and globalization after 1960 or so are basically the best things that ever happened.


What does this all mean?

My main takeaway from looking at the really big picture is that we should be less certain about a few things.

I used to be totally dismissive of the idea that automation could lead to bad economic outcomes because my view was people “always” had this fear about new technological developments, and it was “always” wrong. But it wasn’t actually always wrong. The Neolithic Revolution set average living standards on a downward trajectory for a few thousand years, and they stayed at a low level for a long time. Is it likely that we happen to be living on the precipice of an event like that? Probably not, but it’s not impossible. You shouldn’t make categorical statements about human history based on observations from a 150-year period.

Some other thoughts:

  • It’s easy to understand why people have a lot of deeply entrenched, albeit wrong, Malthusian intuitions because that’s how life was forever and ever.

  • The social media experiment in “connecting people” is in some ways weirder and more contrary to history than I think we sometimes appreciate; until very recently, almost everyone was living in small towns.

  • It’s commonplace to refer to the slower productivity growth since 1970 as a “stagnation” relative to the 1870-1970 pace, but the 1970-2020 period still features more per capita growth in a 50-year span than was typical in human history. Much more growth. So what’s really the anomaly here?

  • The whole idea of trying to invent new ways of doing things seems to be perhaps more novel than you’d think. People were flaking stones the same old, same old way for unimaginably long spans of time.

  • Human history is kind of bleak. There’s a lot of talk these days about the “dark parts of our country’s history” and how to think about them. But I’m not really sure we’ve had a conversation about the dark trajectory of this history in general, which seems broadly lacking in uplifting themes about progress until suddenly it’s not.
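That third point about growth rates holds up to back-of-the-envelope arithmetic. The sketch below uses rough, illustrative figures — world GDP per capita of roughly $4,500 in 1970 and $15,000 in 2018 in Maddison-style constant dollars, and a hypothetical premodern scenario where living standards merely double over 5,000 years — not exact data:

```python
def cagr(start: float, end: float, years: float) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Modern era: rough world GDP per capita, $4,500 (1970) to $15,000 (2018).
modern = cagr(4_500, 15_000, 48)

# Premodern baseline: even a generous doubling of living standards,
# spread over 5,000 years, rounds to zero annual growth.
premodern = cagr(1, 2, 5_000)

print(f"1970-2018: {modern:.2%} per year")    # → 2.54% per year
print(f"premodern: {premodern:.4%} per year")  # → 0.0139% per year
```

Even with these crude inputs, the modern rate is nearly two hundred times the premodern one, which is why the post-1970 “stagnation” still looks like an explosion against the full sweep of history.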

It’s also really hard to tell what’s going on while you’re living through it. Marx, as we saw, grasped the world-historical significance of the Industrial Revolution but also called it totally wrong. It’s conventional to counterpose him to Adam Smith as the advocate of market capitalism, but if you read Smith, he’s remarkably uninterested in technological progress and isn’t actually cheerleading for the Industrial Revolution beginning to unfold around him. What Smith is saying is that a market economy leads to a more efficient allocation of a quasi-fixed pool of resources, which is true but also not what led to the explosion of prosperity over the next 250 years.

Mostly, though, my upshot from all of this is to be more attentive to downside risks.

There is a sense in which our modern, prosperous technological era represents the steady accretion of knowledge over a long period of time as part of an upward ascent of humankind. But there is also a pretty profound sense in which this is not true. The hundreds of thousands of years of human bands wandering the earth and not changing very much dramatically outlast the story of human progress. Then for the majority of recorded history, most people were living worse than the paleolithic hunter-gatherers. If everything collapsed tomorrow and we entered a 50,000-year span of global impoverishment, that wouldn’t necessarily be outside the observed norm. But it’s also conceivable that today’s human children will be the creators of a cohort of self-replicating and self-improving AI robots that colonize the galaxy.

Which will happen? I don’t know. Which is more likely? I don’t know. What I do think I’ve learned from the truly longue durée is just how live all the different possibilities are. A lot of the things we “know” — a steady pace of technological progress that is rapid enough to generate consistently rising living standards — actually come from a surprisingly small and local sample of the broad sweep of human existence.



The strange history of osteopathic medicine


Donald Trump and Joe Biden will happily tell voters, the media, and really anyone who will listen about their differences.

I’d like to share something they have in common: Both of their primary care doctors hail from a style of medical practice that was once derided by the mainstream medical community as a cult.

They're called Doctors of Osteopathy (DOs), and they offer a holistic treatment approach that emphasizes preventive care, the musculoskeletal system, and the overall “mind, body, and spirit” of the patient. Despite the approach’s ostracized 19th-century origins, DOs are now commonplace in hospitals across the country. They receive equal training and have the same practice rights as medical doctors (MDs).1

Our last two presidents aren’t alone in turning to DOs for treatment.

Buffalo Bill was an early supporter of the practice. Nelson Rockefeller saw an osteopath twice a week. Pope John Paul II, Michael J. Fox, and Muhammad Ali have all been treated by DOs.

Even if you don’t know it, there’s a possibility that you’ve either been treated by a DO or will be in the near future. DOs currently represent 11% of all physicians in the United States and 25% of current medical students.

The story of osteopathic medicine’s shift from the medical fringe to an integral part of modern medicine is not widely known. But it's a fascinating tale with implications for our contemporary public healthcare system.

The father of osteopathic medicine

The field of osteopathic medicine was born out of tragedy. Dr. Andrew Taylor Still, a former Kansas state legislator and Union soldier, lost three of his children to spinal meningitis. In his profound grief, he went looking for something to blame.

What he found has surprising resonance with medical debates today.

Dr. Still believed that allopathic medicine (the treatment practiced by mainstream medical doctors) was far too focused on treating symptoms, usually with drugs, rather than the underlying conditions that led to illness.

This was a bit more rational than swapping Advil for turmeric. Dr. Still was practicing medicine in Kansas during the 1870s, a time when sore throats were treated with mercury and chapped lips with lead nitrate. He wanted to deemphasize these harsh and, at times, deadly forms of medicine, and treat patients more holistically.

Did Dr. Still wander into strange medical experiments of his own? Absolutely. The father of osteopathy tried treating pain with magnets and disease with herbs. He was, by our modern definition, a bit of a kook. And perhaps a bit of a megalomaniac, once claiming that he could “shake a child and stop scarlet fever, croup, diphtheria, and cure whooping cough in three days by a wring of its neck.”

However, much of his work remains foundational to the practice. Most notably, he invented Osteopathic Manipulative Treatment (OMT), a hands-on technique used to treat and prevent everything from knee pain to chronic migraines. This practice, along with his philosophical approach to treating the patients’ “mind, body, and spirit” are taught as core elements of the osteopathic educational curriculum today.


The slow road to acceptance

When Dr. Still first began preaching the virtues of osteopathic medicine, his local church denounced him as sacrilegious. His brothers openly called him crazy. Baker University, the educational institution his family had helped establish a few decades prior, flatly refused to let him present his ideas.

By all accounts, it was a stunning fall from grace for a once-respected community leader. And it took moving to several towns and a decade of travel before he eventually found a receptive community in Kirksville, Missouri, where he founded the first functioning osteopathic school. He soon transformed the town into a Midwest medical destination; by the late 1890s, around 400 people journeyed to see Dr. Still or one of his newly credentialed students each day.

That local recognition failed to resonate with the mainstream medical community. In 1910, the famed medical education reformist Abraham Flexner issued his eponymous Flexner Report. In it, he recommended a litany of reforms for the medical community, the most significant of which was the outright end of osteopathic medicine. Ten years later, the American Medical Association furthered the ostracism, releasing a resolution that essentially called all osteopathic practitioners cultists.

While the name-calling might’ve been a bit heavy-handed, the substance of these critiques had some merit. Early 20th-century osteopaths eschewed mainstream pharmacological treatments and invasive surgery. The schools had comparatively lower admissions standards, and certified practitioners were able to practice on patients without significant clinical training.

If the osteopathic field hadn’t internalized those critiques, it might’ve gone the way of phrenology — just another quackery footnote in medical history. But the field began to incorporate mainstream medical practices, including many of the breakthroughs that occurred from the mid-19th through the 20th century. This meant expanding treatments that involved pharmacology and invasive surgery, practices that the late Dr. Still would have abhorred, but that were essential to the modernization of the field.

Eventually, DOs began learning the same curriculum as MDs, along with their holistic approach and the requirement that all practitioners learn OMT treatments. The AMA soon recognized the modernization of the DO field, and started certifying DOs as equivalent to medical practitioners in the early 1960s.

Today, in addition to the standard osteopathic exam, DOs can take the same licensing exam as MDs. And as recently as 2020, the osteopathic field integrated into the single accreditation system for graduate medical education, meaning DOs and MDs now complete their residency training in the same programs.

Modern osteopathic medicine is effective

Outside of its applications for bar trivia and stimulating dinner party conversation, this history matters because osteopathic medicine has evolved to become an important practice in today’s healthcare system.

Consider some of contemporary America’s biggest public health crises.

The Opioid Epidemic has already killed 600,000 Americans and is expected to claim 1.22 million lives this decade. Two of the leading causes of death are hypertension and heart disease, both of which are linked to obesity. A staggering one in five Americans is currently living with chronic pain.

There is, obviously, no silver bullet to treat any of these healthcare issues. But it’s worth considering the philosophical and practical differences that separate DOs from MDs, and subsequently, their capacity to treat these public health problems.

A study by the PRECISION Pain Research Registry found that doctors of osteopathy are less likely than medical doctors to prescribe opioids and non-steroidal anti-inflammatory medicines. And “patients of osteopathic physicians reported lower levels of pain catastrophizing and were more resilient and better able to cope with their pain.”

Why were DOs predisposed to prescribe fewer opioids, and how were they still able to leave their patients with a greater capacity to deal with pain? One possible reason dates back to the same philosophical practices that grounded osteopathic medicine 150 years ago. DOs are trained to treat the patient holistically, and while they still absolutely prescribe prescription medicine, they do so at a comparatively lower rate than medical doctors.

OMT, the manipulative bone therapy technique that is exclusively practiced by osteopaths, has been shown in multiple comprehensive studies to be effective against some of the most prevalent forms of chronic pain. Most notably, it’s been proven to reduce the number of workdays missed due to lower back pain. It was demonstrated to be more effective than medication in treating chronic migraines, and even reduced the length of hospital stay and the chances of both respiratory failure and death for elderly patients suffering from pneumonia.2

A clinical trial has also demonstrated that osteopathic treatment is associated with an improvement in high blood pressure, as well as improved intima-media thickness (essentially a proxy for artery health).

This doesn't suggest you should forsake your MD for a DO, nor does it imply that a medical doctor is incapable of treating these conditions effectively. Notably, the most extensive study available, conducted by UCLA Health on nearly 330,000 senior Americans, found virtually identical outcomes in patient mortality, hospital readmission, and Medicare spending between DOs and MDs.

It is fascinating that doctors of osteopathy transitioned from their cult-like reputation to perhaps outperforming medical doctors in treating some of the most pressing public health issues in contemporary America.

But that history matters for another more troubling reason: Despite its increasing prevalence and demonstrated effectiveness, osteopathic medicine is still plagued by the remnants of historical bias.

The lingering bias against DOs

Last year, a bipartisan group of Congress members introduced the Fair Access in Residency Act. The legislation’s purpose was twofold:

  1. It would require all residency programs receiving federal funding to report the number of osteopathic students they admit each year, with the hope that the added transparency would reduce bias against prospective osteopathic students in the residency application process.

  2. It would also require each program to affirm that it accepts the respective licensing exams for both practices — the USMLE for allopathic licensure and the COMLEX for osteopathic licensure.

The problem they were trying to solve is a very real one.

According to the National Residency Match Program, 32% of residency program directors report never or seldom interviewing an osteopathic medical student. And of the programs that do consider DOs, 56% require the students to take the MD licensing exam. In effect, this forces DO students to take two tests, while MD students take just one.

The good news for DO students is that the match rate into residency programs is basically the same as MD students, meaning they’re equally likely to continue their education and ultimately become practicing doctors. But these facts still reveal a lingering bias against DOs that excludes them from certain residency programs and adds additional obstacles to continuing their medical education.

Now, it’s natural to ask whether MD students simply perform better academically, and are thus more qualified to gain access to certain elite residencies.

And the answer to that is complicated.

DO programs have a historical reputation, dating back to the infamous Flexner report, for having lower standards for admitted students. Today, that narrative is still somewhat accurate, but less so. Admitted MD students have an average GPA of 3.77, while the DO GPA sits at 3.61. The average MD MCAT score is also slightly higher (511.7) compared to DO students (504.8).

But are those slight differences enough to justify the 32% of residency programs that basically ignore all qualified osteopathic doctors?

To answer this question, I think it’s worth returning to the actual performance of osteopaths in the field. Are MDs outperforming DOs in actual hospital performance? The largest study we have demonstrates no difference in performance between the two medical professions. Other research suggests that osteopathic medicine outperforms allopathic medicine in treating certain conditions, including chronic pain and hypertension.

That lingering bias should probably be buried right next to A.T. Still and his neck-wringing cure for scarlet fever.

More practitioners, more public awareness

Doctors of osteopathy now play an incredibly useful role in our healthcare system, filling gaps in fields that face significant supply shortages.

Fifty-six percent of all graduating DOs ultimately pursue a career in primary care, a specialty that could face the largest physician shortage by 2034. By contrast, two-thirds of MDs work in other medical specialties. DOs are also more likely to work in rural areas and in communities that are under-serviced by healthcare.

It’s difficult to say whether this is due to the nature of their education, the innate desires of the students, or the geographic placement of the schools (DO schools are more likely to be in rural areas), but it does underscore the valuable role practitioners of osteopathic medicine play in our modern healthcare system.

The fact that DOs now represent 25% of all current medical students is a promising development for patients seeking primary care doctors and for millions of Americans living in rural America.

I do think we would benefit from a broader awareness of osteopathic medicine, though. Everyone knows the function of a medical doctor. But according to the American Osteopathic Association, 1 in 5 Americans don’t know DOs exist. And depending on the age group, 37% to 46% of adults would not consider osteopathic treatment.

These aren’t terrible public opinion numbers for a practice that was practically considered a cult a mere century ago. But the deadly crisis of overprescription continues to decimate communities. Chronic pain afflicts 20% of Americans and is substantially more common than depression and diabetes.

That means millions of Americans who are dealing with pain might benefit from knowing more about osteopathic medical practices like OMT. The historic number of younger patients looking for alternative forms of treatment should know that osteopathic medicine is far more reputable and effective than advice they might get on social media. And people looking to avoid opioid prescriptions should know that DOs are far less likely to prescribe them.

Century-old bias should not undermine the standing of this growing medical practice.



1. My sister is currently studying to become a DO, which triggered my interest in the topic.

2. Notably, 56% of DOs report not using OMT on patients.


A Presidential Address for This Moment

In January, 2017, just before the Obama team turned power over to Donald Trump, members of the Hamilton cast performed at the White House. Here Lin-Manuel Miranda (right), as Hamilton, tries to talk Christopher Jackson, as Washington, out of his decision to voluntarily give up power when he could easily have held on.
Hamilton: “They will say you’re weak.” Washington: “No. We will show we’re strong…. We’ll teach them how to say goodbye.”
Here were some people listening in the front row that evening. Seven years later, Joe Biden could reflect on the event and the song. (Screenshots from Hamilton YouTube channel.)

Address to the Nation

President Joe Biden

July 2024

My fellow Americans:

            I’d like to talk with you tonight about the faith that connects nearly all of us who share the blessing of calling ourselves Americans.

            That is a faith in the country’s past and a belief in its future. And a willingness, in the here and now, to do what we can—to fulfill our duty—to make our country stronger, prouder, fairer, greater.

            More open to opportunity. More equal under the law. More faithful to the values to which so many generations of Americans have pledged “our lives, our fortunes, and our sacred honor,” as our founders put it nearly 250 years ago in the Declaration of Independence.

            Through my long life I’ve been conscious of my own good fortune, in having ancestors who came here from Ireland to make a new start. Like so many of us I’ve worked toward a world that can be brighter for our children, and their children, and the generations to come.

            The three great commitments of my life have been to family, faith, and country. Every day, in every moment of my public life—through the half-century since I first was elected to the US Senate, through the eight years in which I served as vice president, and most of all in these past four years when I’ve had the honor and responsibility of service as your president—I have thought about what I owed my family and my faith, but always and above all what I owed my country.

            I am immensely proud of what we have achieved together in these past four years. We, together, as Americans: The millions who voted for me. The millions who voted for my opponent. The millions who didn’t vote at all, or couldn’t. All of us who make up the national family, and the world community that depends on us. America at its finest has never been completed but has always been moving forward. In our economy, in our place in the world, in our attention to long-neglected problems, we have a long way to go but have been moving ahead.

            This progress must continue. The risks of moving backward are too great. And—to be blunt—the dangers at the moment are too grave, if control of America’s public institutions and its immense power, if its reputation abroad and its wellbeing at home, should fall back into the hands of someone whose loyalty extends only to himself.

            Knowing these stakes, I have thought carefully and clearly about the duty history asks of me at this crucial time—this ‘inflection point,’ as I often say. The duty that surmounts all others is making sure that leadership of the world’s greatest democracy remains with those who believe in democracy itself. We must guarantee that America is led by people who believe in America. Our nation has never had an election-denier and convicted felon in charge of its government. Nor one who disparages our military and courts and the other institutions that keep us strong. Who preaches division and promises retribution. It cannot risk doing so now.

            In recent weeks I have listened hard to critics, and supporters. I have talked with my family and staff and tried to look honestly at myself. I believe the record shows that I and my team were the right people, at the right time, for the challenges of the past four years. We did our duty, and I believe historians will say that we met the moment well.

            But I have come to realize that I can now best fulfill my duty in the fight for American values by passing the torch. I have always done my best, in my time. Now it is time for outstanding figures from our next generations—talented, idealistic, already highly experienced—to take their leading roles.

            We need the strongest candidates through the all-important next four months until the election. We need the most-qualified prospects for continued progress in the four years after that. We need to ensure that the next leaders of our country will be ones who appeal to the best in our national spirit, not pander to the worst.

            In this moment, my duty to the country and to history is to do everything I can to help such leaders prevail. Therefore I am tonight sharing with you my conclusion that I should no longer be a candidate in the coming election. I will remain on duty through every moment of my first term as your president. But I do not seek re-election to a second.

            This is a difficult and personally painful decision, for someone who has spent so much of his life in public office. But my family, my faith, and my belief in my country make me sure it is the right one. My commitment to this new course is total. I hope that all who have been so generous in their faith and support for me, especially my friends and allies in my own party, will understand. I hope they will wholeheartedly follow my lead.

            It is beyond question that my opponent should have made a similar decision long ago—or responsible members of his party should have made it for him. His ethical and temperamental failings are obvious. His contempt for our nation’s ideals is even more so. The threat he represents to our nation’s future and the free world’s values is enormous.

            But—despite the Supreme Court’s latest reckless ruling on presidential power—there is nothing I can do directly, or ethically, to stop him. All I can do is use every fiber of my being to see that a free electorate chooses a different path. 


           If the decision were solely up to me, I would naturally start with Vice President Harris, who has entirely fulfilled my belief that she was the right one to stand at my side, and next in line, on major decisions for our nation. I know something about the challenges of being a vice president. In these four years she has earned my absolute trust, gratitude, respect, and support.

            But I know that this next decision cannot be solely up to me. A democratic system requires democratic decisions, above all from the Democratic party. I am prepared to do all in my power to help Americans of my political party, and all parties, to come together in enthusiastic support of its next candidate.

            I owe this great country everything. I will continue to give it my very best. I do so this evening in committing to join you, my fellow Americans, next year in what the great Justice Louis Brandeis once called “the most important political office, that of private citizen.” And to using every moment between now and then to ensure that our next leaders are ones truest to our nation’s ideals.

            May God bless you all. And may God protect our troops and continue to guide our nation toward the light.



Indonesia's big EV push


The News

Indonesia’s first battery cell factory has opened. Hyundai and LG built the $1.1 billion plant in Karawang as Jakarta pushes to step up its high-tech manufacturing. The Chinese company CATL, the world’s biggest electric vehicle battery maker, will also begin building a facility later this year. Indonesia produces nearly half of the world’s nickel, a vital raw material for EV battery manufacturing, and in 2020 it banned ore exports, forcing foreign companies to invest in factories in the country.

The government also offers tax breaks to EV companies to incentivize investment. EV sales have slowed since the battery plant was proposed, but analysts told the Financial Times that Indonesia is set to gain from the energy transition.
