The Narrowing Circle

Modern ethics excludes as many beings as it includes.

The “expanding circle” historical thesis ignores the many instances in which modern ethics has narrowed the set of beings to be morally regarded, often justifying the exclusion by denying that the excluded beings exist at all, and thus assumes its conclusion: where the circle expands, this is highlighted as moral ‘progress’, and where it narrows, whatever falls outside is simply defined away. When one compares modern with ancient society, the religious differences are striking: almost every supernatural entity (place, personage, or force) has been excluded from the circle of moral concern, where such entities used to be huge parts of the circle, indeed almost the entire circle. Further examples include estates, houses, fetuses, prisoners, and graves.

One sometimes sees arguments for ethical vegetarianism which play on the Whiggish idea of moral progress following a predictable trend of valuing ever more creatures, which leads to not eating animals among other ethical positions; if one wishes not to incur the opprobrium of posterity, one ought to ‘skate where the puck will be’ and beat the mainstream in becoming vegetarian.

This seems plausible: Thomas Jefferson comes to mind as someone who surely saw that slavery was on the way out—for which we congratulate him—but also lacked the courage of his convictions, keeping and wenching his slaves—for which we condemn him. But you can do better! All you have to do is abandon eating meat and animal products…

The standard statement of this thesis is Peter Singer’s The Expanding Circle: Ethics, Evolution, and Moral Progress, which opens with the epigraph:

The moral unity to be expected in different ages is not an unity of standard, or of acts, but an unity of tendency…At one time the benevolent affections embrace merely the family, soon the circle expanding includes first a class, then a nation, then a coalition of nations, then all humanity, and finally, its influence is felt in the dealings of man with the animal world.

W.E.H. Lecky, The History of European Morals

Or consider Martin Luther King Jr’s “Why I Am Opposed to the War in Vietnam” (1967):

I haven’t lost faith, because the arc of the moral universe is long, but it bends toward justice.

Problems

In a way, it’s odd that one can predict future moral progress, as opposed to something like future population growth. Presumably we are doing the best we can with our morals, and at any moment might change our position on an issue (or not change at all). If one knew in advance what progress would be made, why has it not already been made? (A little like economics’s efficient-market hypothesis.) If one knew that one was going to be wrong in guessing that a coin would come up heads, why not immediately update and simply guess tails?1

But then again, perhaps ethics is not “efficient”: one can beat the trends by being especially intelligent, or especially attentive to empirical trends, or perhaps one has the benefit of being young & uncommitted while the rest of the populace is ossified & set in their evil ways.

Pareidolia

Of course, progress could be an illusion. Random data can look patterned, and especially patterned if one edits the data just a little bit. Biological evolution looks like an impressive multi-billion-year cascade of progress towards ever more complexity, but how can we prove Stephen Jay Gould wrong if he tells us that it is due solely to evolution being a drunkard’s walk with an intrinsic lower bound (no complexity = no life)? And if we were to find that the appearance of progress were due to omissions in the presented data, that would certainly shake our belief that there’s a clear overall trend of progress (as opposed to some sort of random walk or periodic cycle or more complicated relationship).
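Gould’s point can be made concrete with a toy simulation (a minimal sketch of my own, not Gould’s actual model): an unbiased random walk with a floor at zero still produces an ever-rising maximum, which looks like directed progress if one only tracks the record-holders.

    # Toy model: unbiased random walk with a lower bound at 0 (no complexity = no life).
    # The *running maximum* climbs steadily even though the walk has zero drift,
    # mimicking apparent 'progress' in maximum complexity.
    import random

    random.seed(0)
    position = 1
    record = position
    records = []
    for step in range(1, 100_001):
        position = max(0, position + random.choice([-1, 1]))
        record = max(record, position)
        if step % 20_000 == 0:
            records.append(record)
    print(records)  # the record rises over time despite no upward bias in the walk

The running maximum grows roughly with the square root of time, so a plot of “the most complex organism so far” would trend upward across evolutionary history even with no bias toward complexity at all.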

And there may indeed be points omitted.

Observer Bias

An acquaintance, trying to come up with ways in which the moral circle might have narrowed, failed to find any. But who among us has lived through all those centuries, has believed as they believed, and could really give an accurate account of changing ethics & mores? We face the same incommensurability problem that the history & philosophy of science faces: not only are our ideas of what ethics is changing over time, so are our standards of evidence, and even what entities we believe to exist (even before we get to questions about which entities have moral standing!).

Matters of Fact

The ontology question is a very serious one. As C.S. Lewis remarked, if people today are against burning witches, it is probably because they do not believe witches exist—but if witches existed and acted as described, they would howl for the witches’ blood:

I have met people who exaggerate the differences [between the morality of different cultures], because they have not distinguished between differences of morality and differences of belief about facts. For example, one man said to me, ‘Three hundred years ago people in England were putting witches to death. Was that what you call the Rule of Human Nature or Right Conduct?’ But surely the reason we do not execute witches is that we do not believe there are such things. If we did—if we really thought that there were people going about who had sold themselves to the devil and received supernatural powers from him in return and were using these powers to kill their neighbors or drive them mad or bring bad weather, surely we would all agree that if anyone deserved the death penalty, then these filthy quislings did. There is no difference of moral principle here: the difference is simply about matter of fact. It may be a great advance in knowledge not to believe in witches: there is no moral advance in not executing them when you do not think they are there. You would not call a man humane for ceasing to set mousetraps if he did so because he believed there were no mice in the house.2

Or consider this example: the USA has not executed anyone for treason in decades, and rarely convicts anyone, despite plenty of people committing treasonous acts like spying for Israel or China or Russia. Does this reflect an expanding circle now including the belief that one should not kill people for any reason?

Or does it instead reflect the understanding that, with the largest and most sophisticated military in the world, a huge population, 2 very friendly bordering states, oceans on either side and a navy to match, there is not the slightest possibility of any other country invading or occupying the USA? We worry about overseas allies like South Korea—but how many other countries are so incredibly stable and secure that an ally half-way around the world is their chief headache? Terrorism is the chief security concern—and an utterly ridiculous one given the actual facts. The one real military threat to the USA is nuclear war with Russia, which relied on a balance of terror no traitor could single-handedly affect (the key information, the location of nuclear weapons, still wouldn’t enable a pain-free first strike); Julius and Ethel Rosenberg were, incidentally, executed for nuclear espionage—nuclear war being the only genuine existential threat to America in the 20th century. Benedict Arnold could hand over to the British the key fort of West Point, enabling Britain to invade & occupy the commercial heart of the colonies, New York State; but what could a modern Arnold do? (Britain no longer rules the waves.) Is the US really more moral for no longer executing for treason? Or is it just not really threatened? If the US were genuinely under threat, I suspect this ethical record would disappear like the proverbial snowball in hell; certainly the US has been pleased to assassinate or torture anyone it wants to in the course of the War on Terror.

Narrowings

I believe we can identify multiple large areas in which the circle has shrunk.

Religion

This achievement is often praised as a sign of the great superiority of modern civilization over the many faded and lost civilizations of the ancients. While our great skill lies in finding patterns of repetition under the apparent play of accident and chance, less successful civilizations dealt by appealing to supernatural powers for protection. But the voices of the gods proved ignorant and false; they have been silenced by the truth.

James P. Carse, Finite and Infinite Games (1986)

My acquaintance’s failure was understandable because he was an atheist. When one doesn’t believe religion deals with real things at all, it’s hard to take religion seriously—much less recall any instances of its sway in the West increasing or decreasing.

Nevertheless, when one compares modern with ancient society, the religious differences are striking: almost every single supernatural entity (place, personage, or force) has been excluded from the circle of moral concern, where they used to be huge parts of the circle and one could almost say the entire circle.

One really has to read source texts to understand how vivid and important the gods were. In the Bible or Herodotus’s Histories, one seems to read of divine intervention on almost every page: the oracles were not consulted by the superstitious but were key parts of statecraft (and so was bribing them either directly by sacrifices or indirectly by bribing the clergy); a messenger could meet the god Pan along the road (who berates him for his city’s neglect of Pan’s sacrifices) and relate their conversation in detail to the legislature, who listen respectfully and rectify the wrongs; the gods would guide their favorites in daily matters with useful little omens, and would routinely protect their temples and sacred places by such efficacious means as insanity and destroying entire families. (If one quoted Tacitus today that Deorum iniuriae Diis curae/“offenses to the gods are the concern of the gods”, it comes out as ironic and mocking—physician, cure thyself!—but I suspect the Romans would mean it literally.) Indeed, the gods were immanent and not transcendent. Their expressed wishes were respected and honored, as were their avatars, possessions (Herodotus’s pages are as crowded with artwork given to Delphi as the temple precincts must have been), slaves, and food.

The Greeks did not believe in belief and the ‘retreat to commitment’ would have been sheer heresy; the oracles were taken very seriously by Greco-Roman culture and were not ‘compartmentalized’ aspects of their religion to be humored and ignored. Even the most skeptical educated elite often believed visions reported by people if they were awake (see “Kooks and Quacks of the Roman Empire: A Look into the World of the Gospels”); as has been observed by anthropologists, modern Western societies are extraordinarily atheistic in practice. Discussing the historical context of early Christian missionaries in his 2009 Not the Impossible Faith (pg 283–284), Richard Carrier writes:

In the other case, cultural presuppositions subconsciously guide the prophet’s mind to experience exactly what he needs to in order to achieve his goals. Such “experiences are found among 90% of the world’s population today, where they are considered normal and natural, even if not available to all individuals,” whereas “modern Euro-American cultures offer strong cultural resistance” to such “experiences, considering them pathological or infantile while considering their mode of consciousness as normal and ordinary.” So moderns like Holding stubbornly reject such a possibility only by ignoring the difference between modern and ancient cultures—for contrary to modern hostility to the idea, “to meet and converse with a god or some other celestial being is a phenomenon which was simply not very surprising or unheard of in the Greco-Roman period,” and the biology and sociology of altered states of consciousness is sufficient to explain this. [Malina & Pilch 2000, Social Science Commentary on the Book of Revelation pg 5, 43]

…As it happens, schizotypal personalities (who experience a relatively common form of non-debilitating schizophrenia) would be the most prone to hallucinations guided by such a subconscious mechanism, and therefore the most likely to gravitate into the role of “prophet” in their society (as Malina himself argues). Paul, for example, so often refers to hearing voices in his letters (often quoting God’s voice verbatim) that it’s quite possible he was just such a person, and so might many of the original Christian leaders have been (like Peter). Indeed, the “Angel of Satan” that Paul calls a “thorn in his flesh” (2 Corinthians 12:6–10) could have been an evil voice he often heard and had to suppress (though Holding is right to point out that other interpretations are possible). But there are many opportunities even for normal people to enter the same kind of hallucinatory state, especially in religious and vision-oriented cultures: from fasting, fatigue, sleep deprivation, and other ascetic behaviors (such as extended periods of mantric prayer), to ordinary dreaming and hypnagogic or hypnopompic events (a common hallucinatory state experienced by normal people between waking and sleep). [On all of this see references in note 14 in Chapter 8 (and note 25 above).]

The gradual failure of the oracles was a spiritual crisis, memorialized by Plutarch in his dialogue De Defectu Oraculorum. (An overview and connection to modern Christian concerns is Benno Zuiddam’s “Plutarch and ‘god-eclipse’ in Christian theology: when the gods ceased to speak”.) The dialogue is interesting on many levels (I am struck that the speakers offer multiple refutations of the suggestion that an eternal flame burning less fuel proves the year is shrinking, yet all uncritically believe in the gods & oracles; or see Elijah’s theological experiment); the speakers do not consult the remaining oracle but attempt to explain the decline of the oracles as a divine response to the decline of Greece itself (fewer people need fewer oracles), divine will (or whim?), corruption among humans, or deaths among the lesser supernatural entities (daemons) who might handle the oracles for the major gods like Apollo or Zeus:

Let this statement be ventured by us, following the lead of many others before us, that co-incidentally with the total defection of the guardian spirits assigned to the oracles and prophetic shrines, occurs the defection of the oracles themselves; and when the spirits flee or go to another place, the oracles themselves lose their power….but when the spirits return many years later, the oracles, like musical instruments, become articulate, since those who can put them to use are present and in charge of them.

Hopeful, but Plutarch concludes with a more depressing message:

The power of the spirit does not affect all persons nor the same persons always in the same way, but it only supplies an enkindling and an inception, as has been said, for them that are in a proper state to be affected and to undergo the change. The power comes from the gods and demigods, but, for all that, it is not unfailing nor imperishable nor ageless, lasting into that infinite time by which all things between earth and moon become wearied out, according to our reasoning. And there are some who assert that the things above the moon also do not abide, but give out as they confront the everlasting and infinite, and undergo continual transmutations and rebirths.

This idea that the gods might die, and the general silence and reduction in miracles, were fortunate for the upstart mystery cult Christianity, as the silence could be, and was, interpreted as a victory of the Christian god over the Olympians: the Christian Eusebius of Caesarea writes in his Praeparatio Evangelica (313 AD) of Plutarch’s dialogue:

Hear therefore how Greeks themselves confess that their oracles have failed, and never so failed from the beginning until after the times when the doctrine of salvation in the Gospel caused the knowledge of the one God, the Sovereign and Creator of the universe, to dawn like light upon all mankind. We shall show then almost immediately that very soon after His manifestation there came stories of the deaths of daemons, and that the wonderful oracles so celebrated of old have ceased.

(The oracles would occasionally be restored and supported by various emperors but the efforts never took and they were finally outlawed as pagan remnants.) Eusebius goes further, saying the death of Pan (related in the dialogue by Cleombrotus) was due directly to Jesus:

…it is important to observe the time at which he says that the death of the daemon [Pan] took place. For it was the time of Tiberius, in which our Savior, making His sojourn among men, is recorded to have been ridding human life from daemons of every kind, so that there were some of them now kneeling before Him and beseeching Him not to deliver them over to the Tartarus that awaited them. You have therefore the date of the overthrow of the daemons, of which there was no record at any other time; just as you had the abolition of human sacrifice among the Gentiles as not having occurred until after the preaching of the doctrine of the Gospel had reached all mankind. Let then these refutations from recent history suffice.

Religions of Convenience

To see the gods dispelled in mid-air and dissolve like clouds is one of the great human experiences. It is not as if they had gone over the horizon to disappear for a time; nor as if they had been overcome by other gods of greater power and profounder knowledge. It is simply that they came to nothing.

Wallace Stevens, Opus Posthumous (1955)

This blind spot is partially based on different ontologies—different facts.

But even more, it is based on weaker, less virulent religions, whose believers tolerate amazing things. (The analogy is to contagious diseases, which cannot afford to be too virulent if transmission becomes more difficult; syphilis is perhaps an example.) One might call this meta-atheism: most believers simply do not seem to believe—they do not act as agents might be expected to act if they took as facts about the world the beliefs they claim to believe.

Consider how seriously religion used to be taken in the West. But today? Iceland is mocked when construction is held up to expel elves—but the construction goes forward. Japan keeps its temples on sacred places—when they earn their keep and do not block housing projects, of course. Lip service is paid, at most. In the West, ordinary use of the supernatural (eg. trial by ordeal) has been receding since well before any scientific revolution, since the 1300s3. It is trivial to point out how religions “evolve” to make everyone a “cafeteria Catholic”, leading to people who take their religion seriously (perhaps because they are adult converts who were not “inoculated” with the appropriate excuses and shabby intellectual dodges ‘cafeteria’ believers inculcate in their children) being labeled insane fanatics; but early believers of anything took their particular religion quite seriously, even when it was almost unrecognizably different from the religion as actually practiced many centuries later. Christianity affords a striking example, given how completely its communalism has been abandoned: consider the case of Ananias and Sapphira in the Acts of the Apostles, supposedly detailing how the early churches ran their affairs, which tells us that

There was no needy person among them, for those who owned property or houses would sell them, bring the proceeds of the sale, and put them at the feet of the apostles, and they were distributed to each according to need…A man named Ananias, however, with his wife Sapphira, sold a piece of property. He retained for himself, with his wife’s knowledge, some of the purchase price, took the remainder, and put it at the feet of the apostles. But Peter said, “Ananias, why has Satan filled your heart so that you lied to the holy Spirit and retained part of the price of the land? While it remained unsold, did it not remain yours? And when it was sold, was it not still under your control? Why did you contrive this deed? You have lied not to human beings, but to God.” When Ananias heard these words, he fell down and breathed his last, and great fear came upon all who heard of it…After an interval of about three hours, his wife came in, unaware of what had happened. Peter said to her, “Tell me, did you sell the land for this amount?” She answered, “Yes, for that amount.”…At once, she fell down at his feet and breathed her last.

The early communist Church held up as an ideal that goods flow from each according to their ability, to each “according to need”: failing to turn over all of one’s worldly wealth (it is not suggested that they held back a large amount, just “some”) is literally Satanic, inspired by the Devil, and divinely punishable by instant death. This was not presented as a parable, or an ideal, or a teaching, or a metaphor, but as something that actually happened. There is a clear lesson here about how to run human affairs, applicable to all times and places, with no convenient qualifiers or loopholes to neuter it. Yet I note with perplexity that in both my contemporary Christian society and all previous Christian societies, not only is failing to donate all one’s wealth to the Church not punishable by summary execution, it does not seem punishable by anything at all; in fact, I’m not sure it is a crime anywhere or any time! Of course everyone knows why this incident has been minimized and no society has ever tried to implement such requirements, and I have no doubt that some Christian at some point has come up with some clever distinction to dispose of it (eg. perhaps such communism is justified only under direct apostolic control, all other humans being too fallible or too weak), but the grounds on which this example of divine justice is ignored illustrate my observations here; it is hard not to agree with Nietzsche (Beyond Good and Evil, section 104): “It is not their love of humanity but the impotence of their love of humanity that prevents today’s Christians—from burning us.”

The wishes of supernatural entities are not respected, even if—as many aboriginals might argue of their sacred places and spirits—those entities will be killed by whatever profitable human activity is proposed. And to think, atheists are a small minority in most every nation! If this is how believers treat their gods, the gods are fallen gods indeed.

Animals

The circle may have widened for the human and less-than-human, but in what way has it not narrowed for the greater-than-human? Peter Singer focuses on animals; religion gives us a perspective on them—what have animals lost by no longer being connected to divinities, and by becoming subject to modern factory farming and agriculture? If you could ask snakes, one of the most common sacred animals, what they made of the world over the last few millennia, would they regard themselves as better or worse off for becoming merely animals within the expanded circle? If India abandoned Hinduism, what would happen to the cows? We may be proud of our legal protections for endangered or threatened species, but the medievals protected & acquitted ordinary bugs & rats in trials. Let us not forget that the Nazis—who usefully replace devils & demons in the Western consciousness—were great lovers of animals. In-group/out-group favoritism may be inherently zero-sum.

Infanticide

Continuing the religious vein, many modern practices reflect a narrowing circle from some points of view: abortion and contraception come to mind. Abortion could be a good example for cyclical or random walk theses, as in many areas the moral status of abortion or infanticide has varied widely over recorded history, from normal to religiously mandated to banned to permitted again.

Consider Greece: examples like Agamemnon sacrificing Iphigenia prove only that human sacrifice was morally ambiguous in some periods, not that it never existed. (I have not read much about Greek sacrifice; Richard Hamilton’s review of Hughes’s Human Sacrifice in Ancient Greece relays Hughes’s skepticism and alternate explanations of the “twenty-five archaeological sites are discussed in detail and over a hundred literary testimonia” describing human sacrifices.)

Much better documented is the widespread infanticide: Sparta routinely tossed infants off a cliff, and exposure of the deformed existed in other cities as well—Thebes famously exposed Oedipus, and Athens discarded ‘valueless’ females. As with much about the life of the Greek city-states, the existing evidence only whets our appetite and does not give a full picture. Mark Golden in “Demography and the exposure of girls at Athens” says the female infanticide rate may have ranged as high as 20%, while Donald Engels in “The problem of female infanticide in the Greco-Roman world” deprecates this as demographically impossible and argues for an upper bound of a single percentage point. The impression I get from Mindy Nichol’s senior thesis, “Did Ancient Romans Love Their Children? Infanticide in Ancient Rome”, is that the picture is complicated but there was probably substantial infanticide—although surely not as high as the 66–75% rate (for both genders) ascribed to pre-contact Polynesians by observers in the 1800s (Gregory Clark, A Farewell to Alms, ch. 5).
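Engels’ demographic objection can be illustrated with a toy net-reproduction-rate calculation (the parameters here are my own illustrative guesses, not Engels’ figures): for a population merely to replace itself, each woman must on average raise one daughter to adulthood, and under premodern fertility and mortality there is little slack left for killing a fifth of infant girls.

    # Toy net reproduction rate: daughters per woman surviving to adulthood.
    # Parameters are illustrative guesses, not Engels' actual figures.
    def net_reproduction_rate(total_fertility, survival_to_adulthood, female_infanticide_rate):
        daughters_born = total_fertility / 2  # assume a 1:1 sex ratio at birth
        daughters_kept = daughters_born * (1 - female_infanticide_rate)
        return daughters_kept * survival_to_adulthood

    # Rough ancient-world guesses: ~6 births per woman, ~1/3 surviving to adulthood.
    print(net_reproduction_rate(6, 1/3, 0.00))  # ~1.0: bare replacement
    print(net_reproduction_rate(6, 1/3, 0.20))  # ~0.8: steady population decline

A rate near 1.0 means a stable population; a sustained 0.8 implies collapse within a few generations, which is why a 20% female infanticide rate is hard to square with Athens’ demographic persistence.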

Into recorded history, sacrifice disappears and infanticide becomes rarer, while abortion remains permitted within limits; early Rome’s Twelve Tables went so far as to mandate infanticide of the deformed:

A father shall immediately put to death a son recently born, who is a monster, or has a form different from that of members of the human race.

Imperial Rome seems to have eventually banned abortion and infanticide, but to little effect; the Christian Tertullian, writing in the mid-empire, claims in Libri duo ad Nationes, ch 15 (“The Charge of Infanticide Retorted on the Heathen”) that the infanticide laws were ignored:

Meanwhile, as I have said, the comparison between us [Christians & pagans] does not fail in another point of view. For if we are infanticides in one sense, you also can hardly be deemed such in any other sense; because, although you are forbidden by the laws to slay new-born infants, it so happens that no laws are evaded with more impunity or greater safety, with the deliberate knowledge of the public, and the suffrages of this entire age. Yet there is no great difference between us, only you do not kill your infants in the way of a sacred rite, nor (as a service) to God. But then you make away with them in a more cruel manner, because you expose them to the cold and hunger, and to wild beasts, or else you get rid of them by the slower death of drowning.

Early Christianity banned infanticide but seems to have permitted abortion; although there were many dissenters and differing positions, views have gradually hardened to the present day, where majority sects like the Catholic Church (and most Protestant sects) flatly oppose it as murder.

And then Greece came under Turkish dominion, so it would be governed by the entirely different set of changing Islamic beliefs on those matters (consistently opposed to infanticide and all human sacrifice), which may permit some abortions. Is there any consistent trend here? If one accepts the basic premise that a fetus is human, then the annual rate of millions of abortions worldwide (as pro-life activists never tire of pointing out) would negate centuries of ‘moral progress’. If one does not accept the premise, then per C.S. Lewis, we have a change in facts as to what is ‘human’, but nothing one could call an “expanding circle”.

What about people with disabilities? Are they better or worse off these days? There is discrimination against the fat because they are seen as morally inferior to thin people—‘gluttons’—while fatness used to be just a marker of wealth. Or consider scarification such as dueling scars. Is there now more discrimination against the unattractive? Is that because of what we might call increases in “beauty inequality” (to go with the more famous income inequality)? As populaces get healthier and wealthier, and ever new technologies & techniques are invented, the most beautiful are getting more beautiful. You can only be so ugly and crippled before you die or are too incapacitated, but how attractive you can be keeps increasing: with cosmetics, orthodontics, plastic surgery… If you were born in the Dark Ages, escaped goiters, and managed not to be a thin, short, starving peasant, you would still be lessened if your teeth came in crooked.

Judicial Torture

State use of torture can be cyclical—some northern European countries went from minimal torture under their indigenous governments, to extensive torture under Roman dominion, back to juries & financial punishments after Rome, to torture again with the revival of Roman law by rising centralized early-modern states, and then to torture’s abandonment when those states modernized and liberalized even further. China has gone through even more cycles of judicial torture with its dynastic cycle.

Some areas have changed far less than one might hope; arbitrary property confiscations that would make a medieval English freeman scarlet with anger are alive and well under the aegis of the War on Drugs, under the anodyne term “asset forfeiture”, as a random form of taxation. (And what are we to make of the disappearance of jury trials in favor of plea bargaining?)

Continuing the judicial vein, can we really say that our use of incarceration is that superior to our ancestors’ use of meted-out torture? I am chilled when I read, and agree with, Adam Gopnik:

Every day, at least fifty thousand men—a full house at Yankee Stadium—wake in solitary confinement, often in “supermax” prisons or prison wings, in which men are locked in small cells, where they see no one, cannot freely read and write, and are allowed out just once a day for an hour’s solo “exercise.” (Lock yourself in your bathroom and then imagine you have to stay there for the next ten years, and you will have some sense of the experience.) Prison rape is so endemic—more than seventy thousand prisoners are raped each year—that it is routinely held out as a threat, part of the punishment to be expected. The subject is standard fodder for comedy, and an uncooperative suspect being threatened with rape in prison is now represented, every night on television, as an ordinary and rather lovable bit of policing. The normalization of prison rape—like eighteenth-century japery about watching men struggle as they die on the gallows—will surely strike our descendants as chillingly sadistic, incomprehensible on the part of people who thought themselves civilized.

From another author:

For 2008, for example, the government had previously tallied 935 confirmed instances of sexual abuse. After asking around, and performing some calculations, the Justice Department came up with a new number: 216,000. That’s 216,000 victims, not instances. These victims are often assaulted multiple times over the course of the year. The Justice Department now seems to be saying that prison rape accounted for the majority of all rapes committed in the US in 2008, likely making the United States the first country in the history of the world to count more rapes for men than for women.

Those calculations are contained in 4 reports discussed in “Prison Rape and the Government” (The New York Review of Books, 2011); other interesting bits:

As the Bureau of Justice Statistics found in a recent study, “between 69% and 82% of inmates who reported sexual abuse in response to the survey stated that they had never reported an incident to correctional managers.”…According to a recent report by the Bureau of Justice Statistics (BJS), a branch of the Department of Justice, there were only 7,444 official allegations of sexual abuse in detention in 2008, and of those, only 931 were substantiated. These are absurdly low figures. But perhaps more shocking is that even when authorities confirmed that corrections staff had sexually abused inmates in their care, only 42% of those officers had their cases referred to prosecution; only 23% were arrested, and only 3% charged, indicted, or convicted. 15% were actually allowed to keep their jobs.

…The department divides sexual abuse in detention into four categories. Most straightforward, and most common, is rape by force or the threat of force. An estimated 69,800 inmates suffered this in 2008.3 The second category, “nonconsensual sexual acts involving pressure,” includes 36,100 inmates coerced by such means as blackmail, offers of protection, and demanded payment of a jailhouse “debt.” This is still rape by any reasonable standard. An estimated 65,700 inmates, including 6,800 juveniles, had sex with staff “willingly.” But it is illegal in all fifty states for corrections staff to have any sexual contact with inmates. Since staff can inflict punishments including behavioral reports that may extend the time people serve, solitary confinement, loss of even the most basic privileges such as showering, and (legally or not) violence, it is often impossible for inmates to say no.4 Finally, the department estimates that there were 45,000 victims of “abusive sexual contacts” in 2008: unwanted touching by another inmate “of the inmate’s buttocks, thigh, penis, breasts, or vagina in a sexual way.” Overall, most victims were abused not by other inmates but, like Jan, by corrections staff: agents of our government, paid with our taxes, whose job it is to keep inmates safe.

…Between half and two thirds of those who claim sexual abuse in adult facilities say it happened more than once; previous BJS studies suggest that victims endure an average of three to five attacks each per year…Of juvenile detainees reporting sexual abuse by other inmates, 81% said it happened more than once.

…The top half of all facilities have made their achievements without explicitly stated standards; there is still plenty of room for them to improve, and every reason to expect that they will once the standards are in place, though probably not as dramatically as the bottom half of facilities. In our opinion, if the department issues strong standards, it wouldn’t be unrealistic to expect that the national rates of abuse could sink to those of the best quarter or even the best tenth of all facilities. But even if the standards allowed all facilities to do only as well as half do now, they would be saving not 3% of the people sexually abused in detention, but over 53%.23 This means that had the standards been in place in 2008, instead of the 199,500 people who the department says were abused in adult prisons and jails, there would have been about 93,100. More than 100,000 adults (as well as many thousands of children) would have been saved an experience from which few recover emotionally.

(Wikipedia covers a variety of statistics putting US rape rates in the 2000s at 195,000 and under.)

At least spectators could count how many lashes were administered; but who counts the anal rapes—and gives time off for extra?

And millions of rapes per decade are only the start of the criminality of the American criminal system. In much the same way that waterboarding was—uncontroversially—a torture employed by the likes of the Spanish Inquisition and condemned as such by Americans whenever it appeared (after WWII, they hanged Japanese who employed waterboarding), yet suddenly ceased to be torture once it became useful to Americans in the War on Terror, the prison system abusively uses solitary confinement—well-established to be profound torture in its own right, leading to suicide, hallucinations, madness, and suffering (these effects were, incidentally, why the early American Quakers abandoned reform plans for prisons based on solitary confinement)—even as Americans criticize any employment of solitary confinement by other countries such as Iran. (Let’s not talk about how one is sentenced to jail in the first place; Hunter Felt: ‘“Your third arrest, you go to jail for life.” “Why the third?” “Because in a game a guy gets three times to swing a stick at a ball.”’)

Ancestors

Another possible oversight is the way in which the dead and the past are no longer taken into consideration. This is due in part to the expanding circle itself: if moral progress is indeed being made, and the weight of one’s voice is related to how moral one was, then it follows that past people (being immoral) may be ignored. We pay attention to Jefferson in part because he was partially moral, and we pay no attention to a Southern planter who was not even partially moral by our modern lights.

More dramatically, we dishonor our ancestors by neglecting their graves, by not offering any sacrifices or even performing any rituals, by forgetting their names (can you name your great-grandparents?), by selling off the family estate when we think the market has hit the peak, and so on.

Even if the dead sacrifice and save up a large estate to be used after their death for something they greatly valued, we freely ignore their will when it suits us, assuming the courts will execute the will at all (“perpetuities” being outright forbidden by law and treated skeptically by respected jurists like Richard Posner4 despite being prima facie highly desirable; similar institutions like the deed restriction have both good and bad aspects, as does the Muslim world’s practical experience with the waqf institution, covered in an appendix). Contrast this with the ability of the wealthy in far-gone eras to endow eternal flames, or masses continually said or sutras recited for their soul, or add conditions to their property like ‘no Duke in my line shall marry a Catholic’, or set up perpetual charities (as in the Muslim or Indian worlds). The dead are ill-respected, and are not even secure in their graves (what shame to hand over remains to be destroyed by alchemists in their bizarre unnatural procedures, whatever those “scientists” claim to be doing). The “dead hand of the past” was once more truly the ‘live hand’—a vital component of society and the world; from Ryszard Kapuscinski, The Shadow of the Sun (2002), pg 36–37 (quoted in James L. Kugel’s In the Valley of the Shadow (2011), pg 33):

The spiritual world of the ‘African’ (if one may use the term despite its gross simplification) is rich and complex, and his inner life is permeated by a profound religiosity. He believes in the coexistence of three different yet related worlds.

The first is the one that surrounds us, the palpable and visible reality composed of living people, animals, and plants, as well as inanimate objects: stones, water, air. The second is the world of the ancestors, those who died before us, but who died, as it were, not completely, not finally, not absolutely. Indeed, in a metaphysical sense, they continue to exist, and are even capable of participating in our life, of influencing it, shaping it. That is why maintaining good relations with one’s ancestors is a precondition of a successful life, and sometimes even of life itself. The third world is the rich kingdom of the spirits—spirits that exist independently, yet at the same time are present in every being, in every object, in everything and everywhere. At the head of these three worlds stands the Supreme Being, God. Many of the bus inscriptions speak of omnipresence and his unknown omnipotence: ‘God is everywhere’, ‘God knows what he does’, ‘God is mystery’.

Continuing Kugel:

It is not difficult to imagine our own ancestors some generations ago living in such a world. Indeed, many of the things that Kapuściński writes about Africans are easily paralleled by what we know of the ancient Near East, including the cult of the dead. Though largely forbidden by official, biblical law, consulting dead ancestors, contacting them through wizards or mediums—in fact, providing the deceased with water and sustenance on a regular basis via feeding tubes specially implanted at their burial sites (because, as Kapuściński writes, those relatives have ‘died, as it were, not completely, not finally, not absolutely’)—were practices that have been documented by archaeologists within biblical Israel and, more widely, all across the eastern Mediterranean, as well as in Mesopotamia and even in imperial Rome.6 More generally, those three overlapping worlds Kapuściński describes—one’s physical surroundings, one’s dead ancestors, and the whole world of God and the divine—have been described elsewhere by ethnographers working in such diverse locales as the Amazon rain forests, New Guinea, and Micronesia…For centuries and millennia, we were small, dwarfed by gods and ancestors and a throbbing world of animate and inanimate beings all around us, each with its personal claim to existence no less valid than our own.5

The dead have been ejected utterly from the “expanding circle” and indeed, exhumed in the thousands from the Egyptian sands to be used as paper, burned as convenient fuel, turned into folk remedies, or ground into the lovely paint colors caput mortuum & mummy brown. One might say that there has never been a worse time to be dead.

This is particularly amusing given that one of the primary purposes of property was to honor and support the dead, and to be honored by subsequent generations in turn; from Francis Fukuyama’s The Origins of Political Order (2011):

According to Fustel de Coulanges, it was in no way comparable to Christian worship of saints: “The funeral obsequies could be religiously performed only by the nearest relative … They believed that the dead ancestor accepted no offerings save from his own family; he desired no worship save from his own descendants.” Moreover, each individual has a strong interest in having male descendants (in an agnatic system), since it is only they who will be able to look after one’s soul after one’s death. As a result, there is a strong imperative to marry and have male children; celibacy in early Greece and Rome was in most circumstances illegal. The result of these beliefs is that an individual is tied both to dead ancestors and to unborn descendants, in addition to his or her living children. As Hugh Baker puts it with regard to Chinese kinship, there is a rope representing the continuum of descent that “stretches from Infinity to Infinity passing over a razor which is the Present. If the rope is cut, both ends fall away from the middle and the rope is no more. If the man alive now dies without heir, the whole continuum of ancestors and unborn descendants dies with him … His existence as an individual is necessary but insignificant beside his existence as the representative of the whole.”39

…The emergence of modern property rights was then postulated to be a matter of economic rationality, in which individuals bargained among themselves to divide up the communal property, much like Hobbes’s account of the emergence of the Leviathan out of the state of nature. There is a twofold problem with this scenario. The first is that many alternative forms of customary property existed before the emergence of modern property rights. While these forms of land tenure may not have provided the same incentives for their efficient use as do their modern counterparts, very few of them led to anything like the tragedy of the commons. The second problem is that there aren’t very many examples of modern property rights emerging spontaneously and peacefully out of a bargaining process. The way customary property rights yielded to modern ones was much more violent, and power and deceit played a large role.5

…The earliest forms of private property were held not by individuals but by lineages or other kin groups, and much of their motivation was not simply economic but religious and social as well. Forced collectivization by the Soviet Union and China in the twentieth century sought to turn back the clock to an imagined past that never existed, in which common property was held by nonkin. Greek and Roman households had two things that tied them to a particular piece of real estate: the hearth with its sacred fire, which resided in the household, and nearby ancestral tombs. Land was desired not simply for its productive potential but also because it was where dead ancestors and the family’s unmovable hearth resided. Property needed to be private: strangers or the state could not be allowed to violate the resting place of one’s ancestors. On the other hand, these early forms of private property lacked a critical characteristic of what we regard today as modern property: rights were generally usufructuary (that is, they conveyed the right to use land but not to own it), making it impossible for individuals to sell or otherwise alienate it.6 The owner is not an individual landlord, but a community of living and dead kin. Property was held as a kind of trust on behalf of the dead ancestors and the unborn descendants, a practice that has parallels in many contemporary societies. As an early twentieth-century Nigerian chief said, “I conceive that land belongs to a vast family of which many are dead, few are living and countless members are still unborn.”7 Property and kinship thus become intimately connected: property enables you to take care of not only preceding and succeeding generations of relatives, but of yourself as well through your ancestors and descendants, who can affect your well-being.

In some parts of precolonial Africa, kin groups were tied to land because their ancestors were buried there, much as for the Greeks and Romans.8 But in other long-settled parts of West Africa, religion operated differently. There, the descendants of the first settlers were designated Earth Priests, who maintained Earth Shrines and presided over various ritual activities related to land use. Newcomers acquired rights to land not through individual buying and selling of properties but through their entry into the local ritual community. The community conferred access rights to planting, hunting, and fishing not in perpetuity but as a privilege of membership in the community.9 In tribal societies, property was sometimes communally owned by the tribe. As the historical anthropologist Paul Vinogradoff explained of the Celtic tribes, “Both the free and the unfree are grouped in [agnatic] kindreds. These kindreds hold land in communal ownership, and their possessions do not as a rule coincide with the landmarks [boundaries] of the villages, but spread spider-like through different settlements.”10 Communal ownership never meant that land was worked collectively, however, as on a twentieth-century Soviet or Chinese collective farm. Individual families were often allocated their own plots. In other cases, properties were individually owned but severely entailed by the social obligations that individuals had toward their kin—living, dead, and yet to be born.11 Your strip of land lies next to your cousin’s, and you cooperate at harvest-time; it is unthinkable to sell your strip to a stranger. If you die without male heirs, your land reverts to the kin group. Tribes often had the power to reassign property rights. According to Vinogradoff, “On the borders of India, conquering tribes have been known to settle down on large tracts of land without allowing them to be converted into separate property even among clans or kindreds. Occasional or periodical redivisions testified to the effective overlordship of the tribe.”12

Customary property held by kin groups still exists in contemporary Melanesia. Upward of 95% of all land is tied up in customary property rights in Papua New Guinea and the Solomon Islands. When a mining or palm oil company wants to acquire real estate, it has to deal with entire descent groups (wantoks).13 Each individual within the descent group has a potential veto over the deal, and there is no statute of limitations. As a result, one group of relatives may decide to sell their land to the company; ten years later, another group may show up and claim title to the same property, arguing that the land had been unjustly stolen from them in previous generations.14 Many individuals are unwilling to sell title to their land under any conditions, since the spirits of their ancestors dwell there. But the inability of individuals within the kin group to fully appropriate their property’s resources, or to be able to sell it, does not necessarily mean that they neglect it or treat it irresponsibly. Property rights in tribal societies are extremely well specified, even if that specification is not formal or legal.156

In Emmanuel Le Roy Ladurie’s famous micro-ethnography Montaillou, of an obscure French village notable chiefly for the extremely detailed Inquisition records allowing reconstruction of the villagers’ history & beliefs & economy, Ladurie struggles to explain to the modern reader one of the overriding moral concerns of the villagers: their preoccupation with the social structure of the domus, or ‘house’ (pg 24, Chapter II), which is not quite a physical house, nor quite a clan/family linked by blood, nor ancestor-worship, but was crucial to Montaillou—the Cathar heresy spread domus by domus, resistance to informing was done out of concern for the domus, threats to the domus were employed by the Inquisition, etc.

Property Rights

An example of the interests of the dead being neglected—even at substantial harm to the living—is not far from hand. English common law explicitly bans wills or trusts that operate indefinitely through a rule against perpetuities; the application can be very tricky, forbidding even apparently legitimate short-term specifications.

This trickiness reflects the basic desirability of such contracts. Indeed, under a basic economic analysis of compound interest, respecting the wishes of even distant ancestors is valuable—we should hardly quibble about the odd billion devoted to an eternal flame for Ahura Mazda or child sacrifice to Moloch if it means additional trillions of dollars of growth in the economy (a conclusion which as stated may seem objectionable, but when hidden as a parable seems sensible).

Nor is the suggestion of very long-term investments and perpetuities purely theoretical: Benjamin Franklin succeeded in exactly this, turning 2,000 pounds into $7,000,000 (1990; ~$17.8m inflation-adjusted) over 2 centuries; Anna C. Mott’s $1,000 turned into only $215,000 (2002; ~$362,000 inflation-adjusted) due to a shorter maturity; Wellington R. Burt succeeded in turning his few millions into $100 million (although if not for the feuding, it would have turned into many billions). Very old continuous organizations like the Catholic Church or the Fuggerei are more common than one might think; see Wikipedia on the oldest companies & newspapers, universities, churches, and madrasahs.
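The implied rate of return behind Franklin’s result is modest, as a quick back-of-the-envelope calculation shows (my own arithmetic, treating the initial £2,000 as $2,000 and ignoring currency conversion and inflation):

    # Implied annualized growth rate: (final / initial)^(1 / years) - 1.
    # Rough illustration only: treats Franklin's 2,000 pounds as $2,000
    # and ignores inflation and exchange rates.
    def annualized_rate(initial, final, years):
        return (final / initial) ** (1 / years) - 1

    print(f"{annualized_rate(2_000, 7_000_000, 200):.2%}")  # ~4.2% per year

Even an unambitious ~4% compounded over two centuries suffices; the difficulty, as the subsequent examples show, is legal and institutional rather than financial.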

Sadly, when we look at subsequent history, the chief risk to such philanthropy is not inflation, taxes, or any of the other failure modes triumphantly suggested as refutations, but legal hostility. The estate of Franklin’s first imitator, Peter Thellusson (who sought to benefit his descendants), was embroiled in the Thellusson Will Case on which more than 100 lawyers earned their daily bread (paid out of the interest of course) for the next 62 years; would-be philanthropist Jonathan Holden’s millions were likewise eaten up, the trusts broken by the living, and nothing even named after Holden. The lack of perpetuities endangers arrangements one might want; Richard Dawkins in The God Delusion describes an example of only partially-kept religious perpetuities and draws the appropriate lesson for (secular) long-term projects like cryonics7 or the Long Now8:

Even in the Middle Ages, money was not the only currency in which you could buy parole from purgatory [indulgences]. You could pay in prayers too, either your own before death or the prayers of others on your behalf, after your death. And money could buy prayers. If you were rich, you could lay down provision for your soul in perpetuity. My own Oxford College, New College, was founded in 1379 (it was new then) by one of that century’s great philanthropists, William of Wykeham, Bishop of Winchester. A medieval bishop could become the Bill Gates of the age, controlling the equivalent of the information highway (to God), and amassing huge riches. His diocese was exceptionally large, and Wykeham used his wealth and influence to found two great educational establishments, one in Winchester and one in Oxford. Education was important to Wykeham, but, in the words of the official New College history, published in 1979 to mark the sixth centenary, the fundamental purpose of the college was ‘as a great chantry to make intercession for the repose of his soul. He provided for the service of the chapel by ten chaplains, three clerks and sixteen choristers, and he ordered that they alone were to be retained if the college’s income failed.’ Wykeham left New College in the hands of the Fellowship, a self-electing body which has been continuously in existence like a single organism for more than six hundred years. Presumably he trusted us to continue to pray for his soul through the centuries.

Today the college has only one chaplain and no clerks, and the steady century-by-century torrent of prayers for Wykeham in purgatory has dwindled to a trickle of two prayers per year. The choristers alone go from strength to strength and their music is, indeed, magical. Even I feel a twinge of guilt, as a member of that Fellowship, for a trust betrayed. In the understanding of his own time, Wykeham was doing the equivalent of a rich man today making a large down payment to a cryogenics company which guarantees to freeze your body and keep it insulated from earthquakes, civil disorder, nuclear war and other hazards, until some future time when medical science has learned how to unfreeze it and cure whatever disease it was dying of. Are we later Fellows of New College reneging on a contract with our Founder? If so, we are in good company. Hundreds of medieval benefactors died trusting that their heirs, well paid to do so, would pray for them in purgatory. I can’t help wondering what proportion of Europe’s medieval treasures of art and architecture started out as down payments on eternity, in trusts now betrayed.

Descendants

If the past has been excluded from the circle, what of the future? One wonders.

The demographic transition is a curious phenomenon, and one that is putting many developed nations below replacement fertility; combined with national and private debt levels unprecedented in history and the depletion of non-renewable resources, this suggests a certain disregard for descendants. Yes, all that may have resulted in higher economic growth, which the descendants can then use to purchase whatever bundle of goods they find most desirable; but as with banks lending money, it only takes one blow-up to render the net returns negative. (If a multilateral thermonuclear war bombs the world back to the Stone Age—what is the net global growth rate from the Neolithic to WWIII? Is it positive or negative? If, knowing this risk, we continue to “borrow from the future”, are we guilty of coercive fraud? This is an important question, since war casualties historically appear to follow a power law.)
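The ‘one blow-up’ point is just the arithmetic of compounding (a toy illustration with made-up numbers): long-run growth is a product of per-period factors, so a single factor near zero dominates any number of good years.

    # Toy illustration: 500 years of 2% growth, then one catastrophe destroying 99.99%.
    # All numbers are made up for illustration.
    growth = 1.02 ** 500         # ~19,957x over five centuries
    collapse = 1e-4              # a civilization-wrecking war
    net = growth * collapse
    print(net)                   # ~2.0x: nearly all the gains are gone
    print(net ** (1 / 500) - 1)  # net annualized rate: ~0.14% per year

This is the same reason a lender’s average return can be negative despite years of steady interest payments: the arithmetic mean of growth is misleading when it is the geometric mean that compounds.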

There are no explicit advocates for futurity, and no real place for them in contemporary ethics besides economics’s idea of exponential discounting (which has been criticized for making any future consequence, no matter how terrible, almost irrelevant as long as it is delayed a century or two). Has the living’s concern for their descendants, the inclusion of the future into the circle of moral concern, increased or decreased over time? Whichever one’s opinion, I submit that the answer is shaky and not supported by excellent evidence.
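That criticism of exponential discounting is easy to verify with the standard present-value formula (the discount rate and horizon here are my own illustrative choices):

    # Present value under exponential discounting: PV = FV / (1 + r)^t.
    # Rate and horizon are illustrative choices, not anyone's official parameters.
    def present_value(future_value, rate, years):
        return future_value / (1 + rate) ** years

    print(present_value(1e12, 0.05, 200))  # ~5.8e7: a $1-trillion catastrophe
                                           # two centuries out is 'worth' ~$58m today

At a conventional 5% rate, averting a trillion-dollar catastrophe in 200 years justifies spending at most about $58 million today; push the horizon out a few more centuries and even human extinction rounds to zero.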

Conclusion

One of the most difficult aspects of any theory of moral progress is explaining why moral progress happens when it does, in such apparently random non-linear jumps. (Historical economics has a similar problem with the Industrial Revolution & Great Divergence.) These jumps do not seem to correspond simply to how many philosophers are thinking about ethics. As we have already seen, the straightforward picture of ever more inclusive ethics relies on cherry-picking if it covers more than, say, the past 5 centuries; and if we are honest enough to say that moral progress isn’t clear before then, we face the new question of explaining why things changed then, and not at any point previous in the 2,500 years of Western philosophy, which included many great figures who worked hard on moral philosophy, such as Plato or Aristotle. It is also troubling how much morality & religion seem to be correlated with biological factors. Even if we do not go as far as Julian Jaynes’s9 theories of gods as auditory hallucinations, there are still many curious correlations floating around.

Given these shrinking circles, should we call it an expanding circle or a shifting circle?

Appendix

The Fukuyama Thesis

Western liberal democracy with regulated capitalism is the attractor state for industrialized humanity for the foreseeable future; barring existential risks, we will see a long-term trend (possibly very noisy in the short term) towards it; alternative systems of economics & governance may have sporadic successes but fundamentally have no broad allegiance among the intelligentsia or richer countries; and most criticisms of the Fukuyama thesis are based on misunderstandings, pop-culture simplifications, ignorance, reasoning from anecdotes, and ignoring long-term trends in favor of some brief regressions.

One restricted, almost purely empirical, version of the general Whiggish/progressive thesis is offered courtesy of Francis Fukuyama’s 199232ya The End of History and the Last Man; to steal a Fukuyama quote, the general understanding of his thesis is:

What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such: that is, the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government.

This isn’t the same thing as the ‘expanding circle thesis’, but rather an almost trivial observation that Western liberal democracy has steadily expanded its ranks of believers: as time passes, ever more countries slip into liberal democracy (of varying quality) and few slip back into some alternative form like divine-right absolute monarchy. Curiously, one sometimes sees descriptions of Fukuyama as debunked by 9/11, such as this gem in a National Interest review:

He rejected, for example, Francis Fukuyama’s heralded “End of History” thesis—that Western liberal democracy represents the final form of human governance—when it appeared in this magazine in 198935ya. History, it turned out, lingered long enough to prove Gray right and Fukuyama wrong.

I disagree in the strongest possible terms: this is a grossly uncharitable reading of Fukuyama and a shocking misunderstanding of the last 20 years of geopolitical changes & key historic events. History has not proved Fukuyama wrong; it has proved him right. And he foresaw both this and that people would be unable to understand this; the very first paragraph of his 198935ya article “The End Of History?” ends in a line that should shock the average reader of 2015 to their core:

Most of these analyses lack any larger conceptual framework for distinguishing between what is essential and what is contingent or accidental in world history, and are predictably superficial. If Mr. Gorbachev were ousted from the Kremlin or a new Ayatollah proclaimed the millennium from a desolate Middle Eastern capital, these same commentators would scramble to announce the rebirth of a new era of conflict.

I ask simply this: what credible alternatives are there to Western liberal democracy with regulated capitalism? For concreteness’s sake, let us specify Norway as our paradigmatic Western liberal democracy. If you can name a form of governance which commands the allegiance of hundreds of millions of people as an improvement on Norway as a goal or ideal, then you have succeeded. So many people claim Fukuyama is not just wrong but laughably, incredibly wrong, that this should be easy.

Communism was fully discredited and remains discredited, no one disputes that; Cuba & North Korea inspire no one and are a lingering refutation of the idea combined with a demonstration of common communist failure modes.

Socialism is not clearly distinguishable from the above, and in key countries like the USA, Britain, or France, it has been in constant retreat since the days of powerful unions, regulated airlines, and government rations of cheese and coal.

Anarchistic self-governing communes? Hardly.

City-states? There have been no new ones; one of the existing 2 (Hong Kong) has fallen under Chinese sway, and attempts to create new ones in the form of Paul Romer’s charter cities have died at the hands of nationalist democracies; they have neither come into existence nor won a global intellectual or mass following. Seasteading is even more obscure and unsuccessful. And it’s worth noting that even though it was famously characterized as a “Disneyland with the death penalty” and commonly put forth as a new kind of governance, the technocratic parliamentary democracy of Singapore has nothing new to it—the American goo-goos, and their Silicon Valley contemporaries, would have approved.

Monarchism? It continues its shabbily genteel decline into tourist fodder. I can think of perhaps one counter-example where a monarch may have increased his power (Thailand), but its reliance on thuggish tactics like lèse-majesté laws suggests, if anything, that its leaders perceive serious underlying weakness (especially after the 200618ya coup); continued internal political division, plus the monarch Bhumibol Adulyadej’s substantial age and multiple military coups, suggest Thailand has only postponed its reckoning (après nous, le déluge?). This is further proof of the Fukuyama thesis, not evidence against.

What about 9/11? Surely if this disproved Fukuyama as so many commentators claimed, it must have demonstrated the rise of a new form of government that would revolutionize key countries, be fervently espoused by millions, and seize the imaginations & minds of intellectuals the world over—surely we can point to many successful revolutions spearheaded by al-Qaeda, to new Caliphates, and to the Caliph consolidating the Dar al-Islam under his benevolent & divinely ordained rule and expanding the realm of peacefulness? People, look at the Arab Spring. It happened before our eyes in exhaustive detail; you have no excuse for ignorance of what the protesters sought or what the results have been. Did it yield any caliphates? Empires? Monarchies? Self-governing city-states? Hanseatic Leagues? Or heck, anarchistic autonomous communes? Of course not. It yielded more representative governments—eg. Tunisia, Egypt. (They are still far from being Western utopias, but who can argue that they are worse than 50 years ago?) On the contrary, radical Islam has become increasingly unpopular as the brutality and ineffectiveness of terrorism became evident and the success of peaceful liberal-style protests became apparent. And in the one country where Islamists (the Muslim Brotherhood) have successfully been elected (note the verb), Egyptians have been learning that being fervent believers is not a job qualification for effective corruption-free government, and the President has become increasingly unpopular with both his party and the populace.

Well, OK, but what about… um… Hugo Chavez’s “Chavismo”? No. Chavez never succeeded in gaining hegemony over Latin America or exporting his ‘revolution’ (a warmed-over socialism); he did succeed in destroying Venezuela’s economy and looting & crippling its oil industry, and he died having failed to turn Chavismo into anything but a personality cult combined with standard welfare give-away tactics for gaining votes. It is barely alive in Venezuela and non-existent overseas. This is no refutation.

Iran’s “Islamic Revolution”? Iran never had much sway in the Sunni Muslim world, and what it did have, it has likely forfeited by supporting Bashar Assad’s attempt to roll back his personal Arab Spring; the urban protests belie any claim that the regime can earn legitimacy in the face of Western liberal democratic memes, and the uniquely Iranian aspects of their democracy, like the Council of Guardians banning presidential candidates, inspire no admiration either inside or outside the country. The Iranian economic model has fostered massive corruption due to the Revolutionary Guards and the bonyads, failed to provide jobs for a youth bulge, is not robust against sanctions, and is running perilously high inflation. No one admires the Iranian economic model. The Islamic Revolution is not a counter-argument to Fukuyama, and no one suggested it was in 199232ya either, though the revolution had happened more than a decade before.

Various Sunni militant extremists, with AK-47s in one hand and Shari’a law in the other? Ask Kenyans or Malians or Afghans or Somalis how attractive they found such a system of governance in practice.

Similar comments hold for Russia: crony capitalism & ethnic prejudice might be inspirational for dictators like Kim Jong-un thinking about how to ride the economic-growth tiger without being deposed, but no intellectuals or masses believe it is superior to Western liberal democracy. Growth is based principally on resource extraction, wealthy Russians maintain overseas ties to hide their wealth from the regime and ensure themselves an escape route, Putin has allied himself with the Orthodox Church for support and manpower, and there are occasional protests despite surveillance, routine assassinations of journalists, etc. Where are the Communist apologists of yesteryear?

We could say something very similar about China as well. In particular, China’s explosive growth papers over huge underlying problems: corruption, economic inequity, separatism, a liberalizing population, and elites parking their children and wealth overseas. People praising the Communist Party there seem to ignore the striking parallels to Japanese growth in the 1980s, when Japan’s own overweening government agencies & practices of questionable integrity & sclerotic consensus-building were praised as a unique new form of national & corporate governance—right up until the crash revealed the truth and largely discredited them. As Fukuyama comments in his 2014 retrospective evaluation of how well the thesis is doing:

The only system out there that would appear to be at all competitive with liberal democracy is the so-called “China model,” which mixes authoritarian government with a partially market-based economy and a high level of technocratic and technological competence. Yet if asked to bet whether, 50 years from now, the U.S. and Europe would look more like China politically or vice versa, I would pick the latter without hesitation.

Scott Sumner remarks bluntly:

This story [a Nigerian suicide bomber for Boko Haram] is emblematic of something I’ve noticed seems increasingly common in the 21st century—political movements that appear exceedingly stupid…To an educated westerner the statements made by the anti-western leaders (as well as terrorist groups like ISIS and Boko Haram) don’t just seem offensive, they seem extremely stupid. I’ve talked to Venezuelans who told me that Chavez would give long speeches on TV that were almost mind-bogglingly stupid. Anyone who has read the various laughable claims made for the Kim family in North Korea has to wonder what the North Korean people make of the absurd propaganda…Both the US and Soviets, as well as their allies, at least tried to make their political models look appealing to the nonaligned countries, and to intellectuals. And to some extent they succeeded—lots of western intellectuals were on each side of the debate. There is almost no western intellectual support for the militarism and gay bashing of Putin, or the racism of Mugabe, or the stoning to death of adulterers and homosexuals. Nor for the kidnapping of school girls that get sold into slavery. The North Korean dynasty is treated like a bad joke. Only Chavez had a bit of support among western intellectuals, and that’s mostly gone now, as Venezuela keeps deteriorating under his replacement.

None of these models comes even close to satisfying the requirement of convincing even a small fraction of the world that they are more desirable end-states or equilibria than mature Western liberal democracies. (“Caliphism, comrades, has never truly been tried!”) There are no credible alternatives for current humans—although I cannot frame any hypotheses about what the ideal post-human society or governance will be.

Fukuyama was right. There are no credible alternatives to the capitalist liberal democracy paradigm. Human history has ended. And we await the resumption of history with fear and trembling.

Islamic Waqfs

Excerpts from articles by Timur Kuran reviewing the Islamic charitable trust, the waqf: charities with perpetual endowments locked into narrow missions by their founding documents. Kuran argues that this structure, while gradually engrossing ever more of the economy, was corrupt and inefficient, damaging the growth of the Islamic countries which employed it.

“The Provision of Public Goods under Islamic Law: Origins, Impact, and Limitations of the Waqf System”, by Timur Kuran; Law & Society Review, Vol. 35, No. 4 (200123ya), pp. 841–898. The basic idea:

A waqf is an unincorporated trust established under Islamic law by a living man or woman for the provision of a designated social service in perpetuity. Its activities are financed by revenue-bearing assets that have been rendered forever inalienable. Originally the assets had to be immovable, although in some places this requirement was eventually relaxed to legitimize what came to be known as a “cash waqf.”

Waqfs were not an Islamic innovation, exactly; they may have had Persian antecedents, and certainly we can find earlier analogies:

One inspiration for the waqf was perhaps the Roman legal concept of a sacred object, which provided the basis for the inalienability of religious temples. Another inspiration might have been the philanthropic foundations of Byzantium, and still another the Jewish institution of consecrated property (hekdesh). But there are important differences between the waqf and each of these forerunners. A Roman sacred object was authorized, if not initiated, by the state, which acted as the property’s administrator (Köprülü 194282ya:7–9; Barnes 198737ya:5–8). By contrast, a waqf was typically established and managed by individuals without the sovereign’s involvement. Under Islamic law, the state’s role was limited to enforcement of the rules governing its creation and operation. A Byzantine philanthropic foundation was usually linked to a church or monastery, and it was subject to ecclesiastical control (Jones 198044ya:25). A waqf could be attached to a mosque, but often it was established and administered by people outside the religious establishment. Finally, whereas under Jewish law it was considered a sacrilege to consecrate property for one’s own benefit (Elon 197153ya:280–88), there was nothing to keep the founder of a waqf from appointing himself as its first administrator and drawing a hefty salary for his services.

These perpetuities were huge; modern Iran’s bonyads are estimated at 20% of its GDP and the waqfs may have been bigger and correspondingly active:

Available aggregate statistics on the assets controlled by waqfs come from recent centuries. At the founding of the Republic of Turkey in 1923101ya, three-quarters of the country’s arable land belonged to waqfs. Around the same time, one-eighth of all cultivated soil in Egypt and one-seventh of that in Iran stood immobilized as waqf property. In the middle of the 19th century, one-half of the agricultural land in Algeria, and in 1883 one-third of that in Tunisia, was owned by waqfs (Heffening 1936:1100; Gibb & Kramers 196163ya:627; Barkan 193985ya:237; Baer 196856yab:79–80). In 1829195ya, soon after Greece broke away from the Ottoman Empire, its new government expropriated waqf land that composed about a third of the country’s total area (Fratcher 197351ya:114). Figures that stretch back the farthest pertain to the total annual income of the waqf system. At the end of the 18th century, it has been estimated, the combined income of the roughly 20,000 Ottoman waqfs in operation equaled one-third of the Ottoman state’s total revenue, including the yield from tax farms in the Balkans, Turkey, and the Arab world (Yediyıldız 198440ya:26). Under the assumption that individuals cultivating waqf land were taxed equally with those working land belonging to state-owned tax farms, this last figure suggests that roughly one-third of all economically productive land in the Ottoman Empire was controlled by waqfs.

…there is abundant evidence that even a single waqf could carry great economic importance. Jerusalem’s Haseki Sultan charitable complex, founded in 1552472ya by Haseki Hürrem, wife of Suleyman the Magnificent and better known in the West as Roxelana, possessed 26 entire villages, several shops, a covered bazaar, 2 soap plants, 11 flour mills, and 2 bathhouses, all in Palestine and Lebanon. For centuries the revenues produced by these assets were used to operate a huge soup kitchen, along with a mosque and two hostels for pilgrims and wayfarers (Peri 199232ya:170–71). In the 18th century, a waqf established in Aleppo by Hajj Musa Amiri, a member of the local elite, included 10 houses, 67 shops, 4 inns, 2 storerooms, several dyeing plants and baths, 3 bakeries, 8 orchards, and 3 gardens, among various other assets, including agricultural land (Meriwether 199925ya:182–83)…many of the architectural masterpieces that symbolize the region’s great cities were financed through the waqf system. So were practically all the soup kitchens in operation throughout the region. By the end of the 18th century, in Istanbul, whose estimated population of 700,000 made it the largest city in Europe, up to 30,000 people a day were being fed by charitable complexes (imarets) established under the waqf system (Huart 192797ya:475).

Such wealth would make waqfs targets, just as the Catholic Church’s wealth made it a target for Henry VIII—but perhaps with different results (surprisingly, since waqfs seem predicated on ordinary property rights being insecure, especially compared with England):

The consequent weakness of private property rights made the sacred institution of the waqf a convenient vehicle for defending wealth against official predation. Expropriations of waqf properties did occur, especially following conquests or the replacement of one dynasty by another. However, when they occurred, they usually generated serious resistance. During the two and a half centuries preceding Egypt’s fall to the Turks in 1517507ya, no fewer than six revenue-seeking Mameluke rulers attempted to confiscate major waqfs; primarily because of judicial resistance, their efforts were largely unsuccessful (Yediyıldız 198242yaa:161). In the 1470s the Ottoman sultan Mehmed II expropriated scores of waqfs to raise resources for his army and his unusually broad public works program. His conversion of hundreds of waqf-owned villages into state property generated a strong reaction, and it influenced the succession struggle that followed his death. Moreover, his son Bayezid II, upon acceding to the throne, restored the confiscated lands to their former status (Repp 198836ya:128–29; İnalcık 195569ya:533). Such episodes underscored the relative security of waqf property. …Precisely because of the commonness of this motive, when a state attempted to take over a waqf it usually justified the act on the ground that it was illegitimate (Akgündüz 199628ya:523–61). Accordingly, its officials tried to convince the populace that the expropriated properties belonged to the state to begin with or simply that the waqf founder had never been their legitimate owner.23

The waqf structure did succeed, as economics might predict, in increasing the amount dedicated to charity, as we can see comparing religious groups’ participation:

Accordingly, up to the 19th century Jews and Christians were ordinarily permitted to establish only functionally similar institutions (Akgündüz 199628ya:238–41). Unlike waqfs, these would not be overseen by the Islamic courts or enjoy the protection of Islamic law. We know that actual practices varied. In certain periods and regions influential non-Muslims were permitted to establish waqfs.14 Yet, the requirement pertaining to the founder’s religion was generally effective. Non-Muslims were less inclined than equally wealthy Muslims to establish and fund charitable foundations of any kind, even ones to serve mostly, if not exclusively, their own religious communities (Masters 198836ya:173–74; Jennings 199034ya:308–9; Marcus 198935ya:305).15 This pattern changed radically only in the 19th century, when the right to establish waqfs was extended to the members of other faiths (Çadırcı 199133ya:257–58). At this point it became common for wealthy Jews and Christians to establish waqfs under a permissive new variant of Islamic law (Shaham 199133ya:460–72; Afifi 199430ya:119–22).16

The chief flaw in waqfs was the ‘dead hand’—perpetual meant perpetual:

To start with the former type of rigidity, the designated mission of a waqf was irrevocable. Ordinarily not even the founder of a waqf could alter its goals. Wherever possible, the objectives specified in the waqf deed had to be pursued exactly. This requirement, if obeyed to the letter, could cause a waqf to become dysfunctional. Imagine a richly endowed waqf established to build and support a particular caravanserai. Two centuries later, let us also suppose, a shift in trade routes idles the structure. If the long-dead founder had neglected to permit future mutawallis to use their own judgment in the interest of supporting commerce through the most efficient means, his waqf’s assets could not be transferred from the now dysfunctional caravanserai to, say, the administration of a commercial port. They could not be shifted even to another caravanserai. At least for a while, therefore, the resources of the waqf would be used inefficiently. Probably because this danger of serious efficiency loss gained recognition early on, the architects of the waqf system made the residuary mission of every waqf the benefit of the poor.36 This rule meant that the assets supporting a dysfunctional caravanserai would eventually be transferred to a public shelter or a soup kitchen, thus limiting the misallocation of resources. But in tempering one form of inefficiency this measure created another. The resources devoted to poor relief would grow over time, possibly dampening incentives to work. The earlier-reported evidence of Istanbul’s soup kitchens feeding 30,000 people a day points, then, to more than the waqf system’s success in providing social services in a decentralized manner. Perhaps it shows also that the system could generate a socially costly oversupply of certain services. This is the basis on which some scholars have claimed that the waqf system contributed to the Islamic world’s long economic descent by fostering a large class of indolent beneficiaries (Akdağ 197945ya:128–30; Cem 197054ya:98–99).37

Not only were these problems recognized, but steps were taken to mitigate them. The typical Ottoman waqf deed contained a standard formulary featuring a list of operational changes the mutawalli was authorized to make. However, unless explicitly stated otherwise, he could make only one set of changes; once the waqf’s original rules had undergone one modification, there could not be another reform (Akgündüz 199628ya:257–70; Little 198440ya:317–18). This point qualifies, but also supports, the observation that the waqf system suffered from operational rigidities. Sooner or later every waqf equipped with the standard flexibilities would exhaust its adaptive capacity…It is on this basis that in 1789235ya, some 237 years after the establishment of the Haseki Sultan complex, its mutawalli decided against hiring a money changer, even though some employees wanted the appointment to cope with rising financial turnover (Peri 199232ya:184–85).

Finally, if the founder had not explicitly allowed the waqf to pool its resources with those of other organizations, technically achievable economies of scale could remain unexploited. In particular, services that a single large waqf could deliver most efficiently—road maintenance, piped water—might be provided at high cost by multiple small waqfs. Founders were free, of course, to stipulate that part, even all, of the income of their waqfs be transferred to a large waqf. And scattered examples of such pooling of waqf resources have been found (Çizakça 200024ya:48).40 The point remains, however, that if a waqf had not been designed to participate in resource pooling it could not be converted into a “feeder waqf” of another, ordinarily larger waqf. Even if new technologies came to generate economies of scale unimaginable at the waqf’s inception, the waqf would have to continue operating independently. Rifaah al-Tahtawi, a major Egyptian thinker of the 19th century, put his finger on this problem when he wrote, “Associations for joint philanthropy are few in our country, in contrast to individual charitable donations and family endowments, which are usually endowed by a single individual” (Cole 200024ya).

On this basis one may suggest that the “static perpetuity” principle of the waqf system was more suitable to a slowly changing economy than to one in which technologies, tastes, and lifestyles undergo revolutionary changes within the span of a generation. Even if adherence to the principle was only partial (as discussed later, violations were hardly uncommon), in a changing economy the efficiency of the waqf system would have fallen as a result of delays in socially desirable adjustments.42 This interpretation is consistent with the fact that in various parts of the modern Islamic world the legal infrastructure of the waqf system has been, or is being, modified to endow mutawallis with broader operational powers. Like many forms of the Western trust, a modern waqf is a corporation: an internally autonomous organization that the courts treat as a legal person.43 As such, its mutawalli, which may now be a committee of individuals or even another corporation, enjoys broad rights to change its services, its mode and rules of operation, and even its goals, without outside interference. This is not to say that a mutawalli is now unconstrained by the founder’s directives. Instead, there is no longer a presumption that the founder’s directives were complete, and the mutawalli, or board of mutawallis, is expected and authorized to be much more than a superintendent following orders. A modern mutawalli is charged with maximizing the overall return on all assets, subject to intertemporal tradeoffs and the acceptability of risk. The permanence of any particular asset is no longer an objective in itself. It is taken for granted that the waqf’s substantive goals may best be served by trimming the payroll to finance repairs or by replacing a farm received directly from the founder with equity in a manufacturing company. …The ongoing reforms of the waqf system amount, then, to an acknowledgment that the rigidities of the traditional waqf system were indeed sources of inefficiency.

Such inefficiency is consistent with one estimate of the beneficial economic effects of the English Dissolution of the Monasteries (Heldring et al 2015). The natural response was to add flexibility by two routes: first, explicit flexibility written into the founding deed:

It was not uncommon for founders to authorize their mutawallis to sell or exchange waqf assets (istibdāl). Miriam Hoexter (199826ya:ch. 5) has shown that between the 17th and 19th centuries the mutawallis of an Algerian waqf established for the benefit of Mecca and Medina managed, acting on the authority they enjoyed, to enlarge this waqf’s endowment through shrewd purchases, sales, and exchanges of assets. In the same vein, Ronald Jennings (199034ya:279–80, 286) has observed that in 16th-century Trabzon some founders explicitly empowered their mutawallis to exercise their own judgment on business matters. He has also found that the courts with jurisdiction over Trabzon’s waqfs tolerated a wide range of adaptations.45 The waqfs in question were able to undertake repairs, adjust payments to suit market conditions, and rent out unproductive properties at rates low enough and for sufficiently long periods to entice renters into making improvements (Jennings 199034ya:335). Other scholars, in addition to providing examples of founder-endorsed plasticity, have shown that there were limits to the founder’s control over the waqf’s management, especially beyond his or her own lifetime. Said Arjomand (199826ya:117, 126) and Stéphane Yerasimos (199430ya:43–45) independently note that the waqf deed could suffer damage or even disappear with the passage of time. It could also be tampered with, sowing doubts about the authenticity of all its directives. In such circumstances, the courts might use their supervisory authority to modify the waqf’s organization, its mode of operation, and even its mission. Moreover, even when no disagreements existed over the deed itself judges had the right to order unstipulated changes in the interest of either the waqf’s intended beneficiaries or the broader community. We have seen that such heavy-handedness sometimes sparked resistance. Harmed constituencies might claim that the principle of static perpetuity had been violated. However, judges were able to prevail if they commanded popular support and the opponents of change were poorly organized. Yerasimos furnishes examples of 16th-century Ottoman construction projects that involved the successful seizure of ostensibly immobilized waqf properties, sometimes without full compensation. …There are ample indications that modification costs were generally substantial. As Murat Çizakça (200024ya:16–21) observes, only some of the Islamic schools of law allowed sales and exchanges of waqf properties, and even these schools imposed various restrictions.

The second approach was to avoid inalienable assets—not real estate, but perhaps money or other financial instruments:

“Cash waqfs” thus emerged as early as the eighth century, earning income generally through interest-bearing loans (Çizakça 200024ya:ch. 3). Uncommon for many centuries, these waqfs provoked intense controversy as their numbers multiplied, because they violated both waqf law and the prohibition of interest (Mandaville 197945ya; Kurt 199628ya:10–21). According to their critics, not only was the cash waqf doubly un-Islamic but it consumed resources better devoted to charity and religion. Interestingly, the defenders invoked neither scripture nor the law. Conceding that the cash waqf violates classical Islamic principles, they pointed to its popularity and inferred that it had to be serving a valuable social function. In effect, they held that the cash waqf should be tolerated because it passes the utilitarian test of the market, the irreligious test now commonly used to justify popular, but perhaps ethically troubling, economic practices. The defenders of the cash waqf, who included prominent clerics, also lamented that their opponents, though perhaps knowledgeable of Islam, were ignorant of both history and the prevailing practical needs of their communities (Mandaville 197945ya:297–300, 306–8).

Because they met important needs and encountered little opposition outside of legal and religious circles, cash waqfs became increasingly popular. By the 16th century, in fact, they accounted for more than half of all the new Ottoman waqfs. Most of them were on the small side, as measured by assets (Çağatay 197153ya; Yediyıldız 199034ya:118–22; Masters 198836ya:161–63). One factor that accounts for their enormous popularity is the ubiquitous quest for wealth protection. Another was that there existed no banks able to meet the demand for consumption loans, only moneylenders whose rates reflected the risks they took by operating outside the strict interpretation of the law. Where and when the cash waqf enjoyed legal approval, it allowed moneylenders to operate more or less within the prevailing interpretation of Islamic law. If nothing else, the sacredness that flowed from its inclusion in the waqf system insulated its interest-based operations from the charge of sinfulness.

Both brought their own problems:

Yet, cash waqfs were by no means free of operational constraints. Like the founder of an ordinary waqf, that of a cash waqf could restrict its beneficiaries and limit its charges. Yediyıldız points to the deed of an 18th-century waqf whose founder required it to lend at exactly 10% and only to merchants based in the town of Amasya (Yediyıldız 199034ya:122). The restrictions imposed on a cash waqf typically reflected, in addition to the founder’s personal tastes and biases, the prevailing interest rates at the time of its establishment. Over time, these could become increasingly serious barriers to the waqf’s exploitation of profit opportunities. Precisely because the cash waqfs were required to keep their rates fixed, observes Çizakça (200024ya:52–53), only a fifth of them survived beyond a century…Revealingly, the borrowers of the 18th-century cash waqfs of Bursa included their own mutawallis. These mutawallis lent on their own account to the moneylenders of Ankara and Istanbul, where interest rates were higher (Çizakça 199529ya). Had the endowment deeds of these cash waqfs permitted greater flexibility, the gains reaped by mutawallis could have accrued to the waqfs themselves.

Insofar as these methods enhanced the acceptability of corruption, they would also have facilitated the embezzlement of resources ostensibly immobilized for the provision of social services, including public goods and charitable causes. Embezzlement often occurred through sales and exchanges of waqf properties. While such transactions could serve a waqf’s financial interests, and thus its capacity for meeting the founder’s goals, they were subject to abuse. Mutawallis found ways to line their own pockets through transactions detrimental to the waqf, for instance, the exchange of an economically valuable farm for the inferior farm of an uncle. A bribe-hungry judge might approve such a transaction under the pretext of duress, knowing full well that it was motivated more by personal gain than by civic duty. In certain times and places this form of embezzlement became so common that high officials took to treating waqf properties as alienable. In the early 16th century, right before the Ottomans occupied Egypt, a Mameluke judge ruled that the land on which the famous al-Azhar complex stands could be sold to someone looking for a site to build a mansion (Behrens-Abouseif 199430ya:146–47)…Her [Hürrem’s] waqf was to support, she stated, “the poor and the humble, the weak and the needy… the true believers and the righteous who live near the holy places . . . [and] hold onto the sharia and strictly observe the commandments of the sunna” (Peri 199232ya:172). Since practically any Muslim resident of greater Jerusalem could qualify as either weak or devout, within a few generations huge numbers of families, including some of the richest, were drawing income from the waqf. Even an Ottoman governor managed to get himself on the waqf’s payroll, and he took to using the waqf as an instrument of patronage (Peri 199232ya:173–74). As Hürrem’s waqf turned into a politicized source of supplementary income for people whom she would hardly have characterized as needy, the government in Istanbul tried repeatedly to trim the list of beneficiaries. Evidently it sensed that continued corruption would cause the waqf, and therefore Ottoman rule itself, to lose legitimacy. Yet the government itself benefited from showering provincial notables with privileges, which limited the reach of its reforms. After every crackdown the waqf’s managers returned to creating entitlements for the upper classes (Peri 199232ya:182–84). Ann Lambton (199727ya:305) gives examples of even more serious abuses from 14th-century Iran. Based on contemporary observations, she notes that practically all assets of the 500 waqfs in Shiraz had fallen into the hands of corrupt mutawallis bent on diverting revenues to themselves…One must not infer that managerial harm to the efficiency of waqfs stemmed only, or even primarily, from corruption. As Richard Posner (199232ya:511) observes in regard to charitable trusts in common law jurisdictions, the managers and supervisors of trusts established for the benefit of broad social causes generally lack adequate incentives to manage properties efficiently.

Contrast with European institutions:

Just as the premodern Middle East had inflexible waqfs, one might observe, the preindustrial and industrial West featured restrictions that inhibited the efficient administration of trusts (Fratcher 197351ya:22, 55, 66–71). …Do such facts invalidate the claim of this section, namely, that inflexibilities of the waqf system held the Middle East back as Europe took the lead in shaping the modern global economy? Two additional facts from European economic history may be advanced in defense of the presented argument. First, over the centuries the West developed an increasingly broad variety of trusts, including many that give a trustee (the counterpart of the mutawalli) greater operational flexibility. These came to include trusts to operate businesses, trusts to manage financial portfolios, and trusts to hold the majority of the voting shares in a corporation. Also, while it is doubtless true that certain Western trusts suffered from the sorts of rigidities that plagued the waqf system, other trusts mitigated these problems by equipping their trustees, or boards of trustees, with powers akin to those of a corporate board.

Another important difference concerns the powers of founders. As early as the 14th century, judges in England were discouraging waqf-like “perpetuities” through which donors could micromanage properties indefinitely, well after their deaths. Trusts providing benefits for unborn persons were declared invalid, or valid only if subject to destruction by prior beneficiaries. And in France, a law was instituted in 1560464ya to keep the founders of fideicommissa, trust-like devices grounded in Roman law, from tying the hands of more than two generations of beneficiaries (Fratcher 197351ya:11–12, 86). These cases of resistance to static perpetuity show that the immobilization of property also presented dangers in Europe. But they also demonstrate that successful attempts to contain the dangers came much earlier in Europe than in the Middle East, where legal reforms designed to give mutawallis greater discretion had to await the 20th century.

Waqfs are discussed further in “Legal Roots of Authoritarian Rule in the Middle East: Civic Legacies of the Islamic Waqf”, Kuran2016 & “Islam and Economic Performance: Historical and Contemporary Links”, Kuran2018:

This essay critically evaluates the analytic literature concerned with causal connections between Islam and economic performance. It focuses on works since 199727ya, when this literature was last surveyed…Weak property rights reinforced the private sector’s stagnation by driving capital out of commerce and into rigid waqfs. Waqfs limited economic development through their inflexibility and democratization by restraining the development of civil society.

The later Bazzi et al 2019 exploits a natural shock in Indonesian politics which drove a burst of donations to waqf endowments, finding generally negative effects in affected regions (particularly on economic growth).

The Discovery of France

Excerpts from Robb’s The Discovery of France on peasant pessimism, poverty, death-wishes, the blessing of cretinism, famine, fire, and infanticide.

pg89:

Written descriptions of daily life inevitably convey the same bright sense of purpose and progress. They pass through years of lived experience like carefree travellers, telescoping the changes that only a long memory could have perceived. Occasionally, however, a simple fact has the same effect as the photograph in the museum. At the end of the eighteenth century, doctors from urban Alsace to rural Brittany found that high death rates were not caused primarily by famine and disease. The problem was that, as soon as they became ill, people took to their beds and hoped to die. In 1750274ya, the Marquis d’Argenson noticed that the peasants who farmed his land in the Touraine were ‘trying not to multiply’: ‘They wish only for death’. Even in times of plenty, old people who could no longer wield a spade or hold a needle were keen to die as soon as possible. ‘Lasting too long’ was one of the great fears of life. Invalids were habitually hated by their carers. It took a special government grant, instituted in 1850174ya in the Seine and Loiret départements, to persuade poor families to keep their ailing relatives at home instead of sending them to that bare waiting room of the graveyard, the municipal hospice.

When there was just enough food for the living, the mouth of a dying person was an obscenity. In the relatively harmonious household of the 1840s described by the peasant novelist Émile Guillaumin, the family members speculate openly in front of Émile’s bed-ridden grandmother (who has not lost her hearing): ‘“I wish we knew how long it’s going to last.” And another would reply, “Not long, I hope.”’ As soon as the burden expired, any water kept in pans or basins was thrown out (since the soul might have washed itself—or, if bound for Hell, tried to extinguish itself—as it left the house), and then life went on as before.

‘Happy as a corpse’ was a saying in the Alps. Visitors to villages in the Savoy Alps, the central Pyrenees, Alsace and Lorraine, and parts of the Massif Central were often horrified to find silent populations of cretins with hideous thyroid deformities. (The link between goitre and lack of iodine in the water was not widely recognized until the early nineteenth century.) The Alpine explorer Saussure, who asked in vain for directions in a village in the Aosta Valley when most of the villagers were out in the fields, imagined that ‘an evil spirit had turned the inhabitants of the unhappy village into dumb animals, leaving them with just enough human face to show that they had once been men’.

The infirmity that seemed a curse to Saussure was a blessing to the natives. The birth of a cretinous baby was believed to bring good luck to the family. The idiot child would never have to work and would never have to leave home to earn money to pay the tax-collector. These hideous creatures were already half-cured of life. Even the death of a normal child could be a consolation. If the baby had lived long enough to be baptized, or if a clever witch revived the corpse for an instant to sprinkle it with holy water, its soul would pray for the family in heaven.

…A slightly callous view of past suffering has emphasized the suspiciously repetitive nature of these Cahiers. Set phrases were suggested by central committees and copied down by local committees. One village found an adequate expression of its suffering and others repeated the impressive details: children eating grass, tears moistening bread, farmers feeling envious of their animals, and so on. But those grass-eating children were clearly not a figure of speech: the harvest of 1788236ya had been worse than usual, and the Cahiers were drawn up in the dangerously hollow months when last year’s supplies were running low and next year’s corn had yet to ripen. The relatively prosperous town of Espère obviously had nothing to gain when it applied the phrase to its neighbours:

We have not yet seen our children munching grass like those of our neighbours, and our old people, happier than many of those in the surrounding region, almost all survived the rigours of last January. Only once did we have the affliction of seeing one of our own people die of hunger.

Even for prosperous peasants, disaster always loomed. Few lives were free from sudden setbacks. Every year, several villages and urban districts went up in smoke. An English traveller, crossing the Jura from Salins to Pontarlier in 1738286ya, was told that ‘there is scarce a Village in all this Tract that does not perish by Flames once at least in 10 Years’. Salins itself was almost totally destroyed in 1825199ya by a fire that burned for three days. The city of Rennes disappeared in 1720304ya and much of Limoges in 1864160ya. Thatch was cheap (gleaned from harvested fields in October), but it harboured huge populations of insects and caught fire easily unless it was completely covered by a layer of clay, quicklime, horse manure and sand. (In some parts, thatch was outlawed in new buildings in the mid-nineteenth century and replaced by the red corrugated iron that was thought to add a pleasant touch of color to the landscape.)

Déguignet was fortunate in having parents who wanted to keep him. Thousands of children—like Tom Thumb in the French fairy tale—were abandoned every year. At Provins, between 1854170ya and 1859165ya, 1,258 children were deposited in the rotating barrel built into the wall of the general hospital. (It can now be seen in the local museum.) These tours d’abandon, which contained a straw bed and some blankets, made it possible for mothers to abandon their babies anonymously and safely. They were outlawed as a public disgrace in 1861163ya, which simply meant that more babies than before were left to die on doorsteps. In 1869155ya, over 7 per cent of births in France were illegitimate, and one-third of those children were abandoned. Each year, fifty thousand human beings started life in France without a parent. Many were sent to the enterprising women known as ‘angel-makers’ who performed what can most kindly be described as postnatal abortions. A report on the hospice at Rennes defined them as ‘women who have no milk and who—doubtless for a fee—feloniously take care of several children at the same time. The children perish almost immediately.’

Before 1779245ya, the nuns who ran the foundling hospital in Paris were obliged by law to take the infant overflow from the provinces. This emergency regulation produced one of the strangest sights on the main roads of France. Long-distance donkeys carrying panniers stuffed with babies came to the capital from as far away as Brittany, Lorraine and the Auvergne. The carters set out on their two-hundred-and-fifty-mile journeys with four or five babies to a basket, but in towns and villages along the route they struck deals with midwives and parents. For a small fee, they would push in a few extra babies. To make the load more tractable and easier on the ears, the babies were given wine instead of milk. Those that died were dumped at the roadside like rotten apples. In Paris, the carters were paid by the head and evidently delivered enough to make it worth their while. But for every ten living babies that reached the capital, only one survived more than three days.

See also Village Life in Late Tsarist Russia by Olga Semyonova Tian-Shanskaia, and Edward Banfield’s The Moral Basis of a Backward Society.

The Dark Side of the Enlightenment

Fleming 201311ya, The Dark Side of the Enlightenment, offers a useful example of the blinding effects of time & changes in ideology in discussion of Milton’s Paradise Lost (pg27–29):

The same phenomena viewed from opposite ends of a period of dramatic intellectual change may look very different…Paradise Lost [1667357ya] is an epic poem based in the history of Adam and Eve as recounted in the first two chapters of the Book of Genesis…The subject of Paradise Lost is “the Fall of Man”, the “original sin” of ancient Christian theological orthodoxy. The results of the primal transgression were catastrophic, for sin “brought death into the World, and all our woe, with loss of Eden.” Our human ancestors, now rendered mortal by their own disobedience, were banished into the harsh world of labor and necessity—our world. This myth was universally understood among Christian thinkers both as a historical account of the primal fall and an allegory of every act of sin in which sensuality masters reason and willfulness conquers a required obedience. John Milton, an actual revolutionary both in politics and in art, very clearly grounded his poem in a strictly static hierarchy of the Great Chain of Being. There was a metaphysical pyramid with God at its apex. Just below that were the hierarchically ordered angels. A “little lower than the angels” were human beings, with man the superior to woman. Below that were all the animals and birds, all of vegetative life from the mighty oak to the lichen scabrous upon the stone, then the stones and minerals themselves, down to the meanest clods of the earth…Sin at its core was the overthrow of divinely established hierarchies, turning things upside down.

…In 1793231ya…William Blake created a work called The Marriage of Heaven and HellThe Marriage takes Paradise Lost as its point of departure, and it makes the following criticism of it: “Note: The reason Milton wrote in fetters when he wrote of Angels & God, and at liberty when of Devils & Hell, is because he was a true Poet and of the Devils party without knowing it.” The idea that Milton was subconsciously “of the Devils party”—or putting it in more forceful terms that Satan is the true hero of Paradise Lost and God Almighty its true villain—has become one of the orthodoxies of modern literary history. It seems to accord with our sense of what is good and true, and it seems confirmed by the nature of the verse. Milton’s God is arbitrary and autocratic, and His words, when compared with Satan’s fiery speeches, are boring. According to one famous interpretation, by the literary critic William Empson, Milton’s God is actively evil. Satan, on the other hand, is dynamic. Pandemonium—the parliament of all the devils—is less like a royal court than a democratic senate. There is verbal thrust and verbal parry, the most fundamental challenging of authority. Non serviam, cries Satan. I shall not serve. His most memorable line may be “Better to reign in hell than serve in heaven!”

Despite tortured attempts to attribute this “reading” to Milton’s conscious intention, it seems impossible that a seventeenth-century English Puritan would write a biblical epic in which God is the villain and Satan the hero, or that it would be received by nearly the entire Protestant eighteenth century as the greatest Christian poem ever written. Slightly more plausible, but only slightly, is the notion that such an interpretation reveals a subconscious irresolution within John Milton. It is much more likely that what seemed manifestly clear to the twentieth-century literary critic Empson never occurred to anybody for a century or more after the poem’s publication [1793–1667 = 126]. When, however, the Old World view of the Great Chain of Being and the rightness of fixed hierarchies gives way to a very different view—of the generative power of dynamically interacting polarities—the phenomena may look very different. Yet unless we are willing to turn all of cultural history into a vast Rorschach test that can tell us only what is already in our own minds, we need to make a strenuous effort to grasp something very different from what may already be there. “A perfect judge will read each work of wit”, says Alexander Pope, “With the same spirit that its author writ.”

The Better Angels of Our Nature, Pinker

Excerpts from Steven Pinker’s The Better Angels of Our Nature on violence & infanticide in pre-industrial societies (including Christian ones):

…A survey of cultures by the anthropologist Laila Williamson reveals that infanticide has been practiced on every continent and by every kind of society, from non-state bands and villages (77% of which have an accepted custom of infanticide) to advanced civilizations.102 Until recently, between 10 and 15% of all babies were killed shortly after they were born, and in some societies the rate has been as high as 50%.103 In the words of the historian Lloyd deMause, “All families once practiced infanticide. All states trace their origin to child sacrifice. All religions began with the mutilation and murder of children.”104

…Martin Daly and Margo Wilson tested the triage theory by examining a sample of sixty unrelated societies from a database of ethnographies.111 Infanticide was documented in a majority of them, and in 112 cases the anthropologists recorded a reason. 87% of the reasons fit the triage theory: the infant was not sired by the woman’s husband, the infant was deformed or ill, or the infant had strikes against its chances of surviving to maturity, such as being a twin, having an older sibling close in age, having no father around, or being born into a family that had fallen on hard economic times.

…The technological efficiency of daughter-proofing a pregnancy may make it seem as if the girl shortage is a problem of modernity, but female infanticide has been documented in China and India for more than two thousand years.119 In China, midwives kept a bucket of water at the bedside to drown the baby if it was a girl. In India there were many methods: “giving a pill of tobacco and bhang to swallow, drowning in milk, smearing the mother’s breast with opium or the juice of the poisonous Datura, or covering the child’s mouth with a plaster of cow-dung before it drew breath.” Then and now, even when daughters are suffered to live, they may not last long. Parents allocate most of the available food to their sons, and as a Chinese doctor explains, “if a boy gets sick, the parents may send him to the hospital at once, but if a girl gets sick, the parents may say to themselves, ‘Well, we’ll see how she is tomorrow.’”120

Female infanticide, also called gendercide and gynecide, is not unique to Asia.121 The Yanomamö are one of many foraging peoples that kill more newborn daughters than sons. In ancient Greece and Rome, babies were “discarded in rivers, dunghills, or cesspools, placed in jars to starve, or exposed to the elements and beasts in the wild.”122 Infanticide was also common in medieval and Renaissance Europe.123 In all these places, more girls perished than boys. Often families would kill every daughter born to them until they had a son; subsequent daughters were allowed to live.

…The evolutionary anthropologists Sarah Hrdy and Kristen Hawkes have each shown that the Trivers-Willard theory gets only half of the story right. In India, it’s true that the higher castes tend to kill their daughters. Unfortunately, it’s not true that the lower castes tend to kill their sons. In fact, it’s hard to find a society anywhere that kills its sons.128 The infanticidal cultures of the world are either equal-opportunity baby-killers or they prefer to kill the girls—and with them, the Trivers-Willard explanation for female infanticide in humans.

Whether they are new mothers in desperate straits, putative fathers doubting their paternity, or parents preferring a son over a daughter, people in the West can no longer kill their newborns with impunity.135 In 200717ya in the United States, 221 infants were murdered out of 4.3 million births. That works out to a rate of 0.00005, or a reduction from the historical average by a factor of two to three thousand. About a quarter of them were killed on their first day of life by their mothers, like the “trash-can moms” who made headlines in the late 1990s by concealing their pregnancies, giving birth in secret (in one case during a high school prom), smothering their newborns, and discarding their bodies in the trash.136 These women find themselves in similar conditions to those who set the stage for infanticide in human prehistory: they are young, single, give birth alone, and feel they cannot count on the support of their kin. Other infants were killed by fatal abuse, often by a stepfather. Still others perished at the hands of a depressed mother who committed suicide and took her children with her because she could not imagine them living without her. Rarely, a mother with postpartum depression will cross the line into postpartum psychosis and kill her children under the spell of a delusion, like the infamous Andrea Yates, who in 200123ya drowned her five children in a bathtub.

What drove down the Western rate of infanticide by more than three orders of magnitude? The first step was to criminalize it. Biblical Judaism prohibited filicide, though it didn’t go the whole hog: killing an infant younger than a month did not count as murder, and loopholes were claimed by Abraham, King Solomon, and Yahweh himself for Plague #10.137 The prohibition became clearer in Talmudic Judaism and in Christianity, from which it was absorbed into the late Roman Empire. The prohibition came from an ideology that held that lives are owned by God, to be given and taken at his pleasure, so the lives of children no longer belonged to their parents. The upshot was a taboo in Western moral codes and legal systems on taking an identifiable human life: one could not deliberate on the value of the life of an individual in one’s midst. (Exceptions were exuberantly made, of course, for heretics, infidels, uncivilized tribes, enemy peoples, and transgressors of any of several hundred laws. And we continue to deliberate on the value of statistical lives, as opposed to identifiable lives, every time we send soldiers or police into harm’s way, or scrimp on expensive health and safety measures.)

For almost a millennium and a half the Judeo-Christian prohibition against infanticide coexisted with massive infanticide in practice. According to one historian, exposure of infants during the Middle Ages “was practiced on a gigantic scale with absolute impunity, noticed by writers with most frigid indifference.”145 Milner cites birth records showing an average of 5.1 births among wealthy families, 2.9 among the middle class, and 1.8 among the poor, adding, “There was no evidence that the number of pregnancies followed similar lines.”146 In 1527497ya a French priest wrote that “the latrines resound with the cries of children who have been plunged into them.”147

At various points in the late Middle Ages and the early modern period, systems of criminal justice tried to do something about infanticide….Various fig leaves were procured. The phenomenon of “overlying,” in which a mother would accidentally smother an infant by rolling over it in her sleep, at times became an epidemic. Women were invited to drop off their unwanted babies at foundling homes, some of them equipped with turntables and trapdoors to ensure anonymity. The mortality rates for the inhabitants of these homes ranged from 50% to more than 99%.149 Women handed over their infants to wet nurses or “baby farmers” who were known to have similar rates of success. Elixirs of opium, alcohol, and treacle were readily obtainable by mothers and wet nurses to becalm a cranky infant, and at the right dosage it could becalm them very effectively indeed. Many a child who survived infancy was sent to a workhouse, “without the inconvenience of too much food or too much clothing”, as Dickens described them in Oliver Twist, and where “it did perversely happen in eight and a half cases out of ten, either that it sickened from want and cold, or fell into the fire from neglect, or got half-smothered by accident; in any one of which cases, the miserable little being was usually summoned into another world, and there gathered to the fathers it had never known in this.” Even with these contrivances, tiny corpses were a frequent sight in parks, under bridges, and in ditches. According to a British coroner in 1862162ya, “The police seemed to think no more of finding a dead child than they did of finding a dead cat or a dead dog.”150

It is true that in much of the world today, a similar proportion of pregnancies end in abortion as the fraction that in centuries past ended in infanticide.151 Women in the developed West abort between 12% and 25% of their pregnancies; in some of the former communist countries the proportion is greater than half. In 200321ya a million fetuses were aborted in the United States, and about 5 million were aborted throughout Europe and the West, with at least another 11 million aborted elsewhere in the world. If abortion counts as a form of violence, the West has made no progress in its treatment of children. Indeed, because effective abortion has become widely available only since the 1970s (especially, in the United States, with the 197351ya Roe v. Wade Supreme Court decision), the moral state of the West hasn’t improved; it has collapsed.
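
Since the excerpt's comparison turns entirely on a few rates, it may help to make the arithmetic explicit. The sketch below (in Python) takes the quoted figures at face value; the 2003 US birth count is my own rough assumption for illustration, and the "historical average" infanticide rate is simply whatever the stated 2,000–3,000× reduction factor implies, not an independent estimate.

```python
# Back-of-the-envelope check of the rates quoted above. All counts are
# the excerpt's own figures except the 2003 US birth count, which is a
# rough assumption for illustration; the "historical average" is simply
# whatever the stated 2,000-3,000x reduction factor implies.

infant_homicides_2007 = 221
us_births_2007 = 4.3e6

modern_rate = infant_homicides_2007 / us_births_2007
print(f"2007 US infanticide rate: {modern_rate:.5f}")  # ~0.00005

# Reversing the claimed reduction factor gives the implied historical
# rate of infanticide per birth:
for factor in (2_000, 3_000):
    print(f"implied historical rate at {factor:,}x: {modern_rate * factor:.0%}")
# -> roughly 10% to 15% of births

# Milner's medieval birth records: 5.1 surviving births among the wealthy
# vs. 1.8 among the poor; if pregnancies were in fact similar across
# classes (as the quote suggests), the implied loss among the poor is:
print(f"implied infant loss among the poor: {1 - 1.8 / 5.1:.0%}")  # ~65%

# Abortion, by comparison: ~1M US abortions in 2003 against an assumed
# ~4.1M births puts abortions at ~20% of pregnancies, inside the quoted
# 12-25% range for the developed West.
us_abortions_2003 = 1.0e6
us_births_2003 = 4.1e6  # assumption, for illustration
share = us_abortions_2003 / (us_abortions_2003 + us_births_2003)
print(f"2003 US abortions as a share of pregnancies: {share:.0%}")
```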

“The Book of Revelation: Prophecy and Politics” (Edge Master Class 2011):

Steven Pinker: This speaks to the original question of why a lot of these beliefs persist. And I’m always puzzled how, if you take all of this literally as some profess to do, that it really does lead to some—and speaking anachronistically as a post-enlightenment secular humanist—it leads to all kinds of pernicious consequences. Like if the only thing that keeps you from an eternity of torment is accepting Jesus as your savior, well, if you torture someone until they embrace Jesus, you’re doing them the biggest favor of their lives. It’s better a few hours now than all eternity. And if someone is leading people away from this kind of salvation, well, they’re the most evil Typhoid Mary that you can imagine, and exterminating them would be a public health measure because they are luring people into an eternity of torment, and there could be nothing more evil. Again, it’s totally anachronistic. The idea of damnation and hell is, by modern standards, a morally pernicious concept. If you take it literally, though, then of course torturing Jews and atheists and heretics and so on, is actually a very responsible public health measure. Nowadays, people both profess to believe in The Book of Revelation, and they also don’t think it’s a good idea to torture Jews and heretics and atheists.

…Even the televangelists who are thundering from their pulpits probably don’t think it’s a good idea to torture Jews. And in fact, in public opinion polls, there’s a remarkable change through the 20th century in statements like “all religions are equally valid, and ought to be respected”, which today the majority of Americans agree with, and which in the 1930s, needless to say, the majority disagreed with. What I find fascinating is what kind of compartmentalization allows, on the one hand, people to believe in a literal truth of judgment day, eternal torment, but they no longer, as they once did, follow through on the implication: well, we’d better execute heretics and torture nonbelievers. On one hand they’ve got, admirably, a kind of post-enlightenment ecumenical tolerant humanism: torturing people is bad. On the other hand, they claim to hold beliefs that logically imply that torturing heretics would be an excellent thing. It’s interesting that the human mind can embrace these contradictions and that, fortunately for all of us, the humanistic sentiments trump the, at least, claimed belief in the literal truth of all of this.


  1. It’s accepted that theories should be consistent. It’d also be good if one’s beliefs were consistent over time as well; otherwise one gets things like Moore’s question (or a quote ascribed to Leonardo da Vinci, appropriately on vegetarianism), “I went to the pictures last Tuesday, but I don’t believe that I did”, a sort of inconsistency which seems to render one vulnerable to a Dutch book exploit, as sketched below. (How exactly the inconsistency is to be resolved is a bit unclear.) Reflection principles have been much discussed.
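
    To make the exploit concrete, here is a minimal sketch in Python, with illustrative credences of 0.7 and then 0.4 in the same proposition and no new evidence arriving in between: a bookie who knows the credences will drift can trade one bet against the agent at each time and profit however the proposition turns out.

    ```python
    # Minimal sketch of the Dutch-book exploit mentioned above, assuming an
    # agent whose credence in some proposition A drifts from 0.7 to 0.4 with
    # no new evidence in between (both numbers purely illustrative).

    p_now, p_later = 0.7, 0.4  # the agent's time-inconsistent credences

    # t1: the agent values a "$1 if A" ticket at $0.70, so buys one at that price.
    bookie_receives = p_now
    # t2: the agent now values the same ticket at only $0.40, so sells it back.
    bookie_pays = p_later

    # The ticket ends up back with the bookie, so whether A occurs is
    # irrelevant: the bookie pockets the spread no matter what.
    profit = bookie_receives - bookie_pays
    print(f"guaranteed bookie profit per ticket: ${profit:.2f}")  # $0.30
    ```

    ↩︎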

  2. C.S. Lewis, Mere Christianity (195272ya), Bk1, ch2↩︎

  3. Kugel 201113ya, pg 165–166:

    The omens continued to exist long after Europe was Christianized; indeed, Christianity was often the omens’ close friend, a frequent feature in tales of the saints. But then, slowly at first, their sphere of influence began to shrink. The whole realm of the supernatural underwent a marked contraction in Western Europe—not, as one might suppose, with the scientific revolution, but well before it, during the period of, roughly, 1000 to 1500 of the common era.6 The supernatural of course continued to exist, but, as I mentioned, the very act of distinguishing the natural from the supernatural was a distinction that bespoke mankind’s growing power over occult forces.

    One indication of this change is the phenomenon of ‘trial by ordeal’. In many societies, supernatural means were used to determine a person’s guilt or innocence, or the appropriateness or inappropriateness of a given course of action: lots were cast, entrails were scrutinized, arrows were shot, and so forth, and the results determined what was to be done. This was not, it should be stressed, like our flipping a coin nowadays, where the utterly random nature of the outcome is generally recognized by the participants. Instead, the results here were taken to be an expression of the divine will…Christian trials by ordeal continued long after this time [first century CE], in fact, well into the Middle Ages. And they were no joke: indeed, they were known by the somewhat more ominous name of ‘the Judgment of God’ (iudicium Dei)…The interesting thing is that such trials virtually disappeared from Western Europe by the year 1300, and it seems this was part of a wider trend that limited (but certainly did not eliminate entirely) the role of the supernatural in human affairs. It may not be a coincidence that this was also the time when the writings of Plato and Aristotle, as well as the other Greek scientific and mathematical treatises, were making their way into Latin, often via earlier translations into Arabic. (Greek had been largely unknown in Western Europe.) A whole new attitude to the formerly supernatural world was emerging, what the sociologist Max Weber called “breaking the magic spell” of the world.8 The uncanny was receding.

    • 6: This is the subject of a recent study from which some of the following examples are taken: Robert Bartlett, The Natural and the Supernatural in the Middle Ages (Cambridge: Cambridge University Press, 200816ya). See also Peter Brown, “Society and the Supernatural: A Medieval Change”, Daedalus 104 (197549ya), 133–151

    • 8: Entzauberung der Welt [Disenchantment of the World]: see Bartlett, 32–33. Even then, however, the movement was not unidirectional. While Aristotle’s treatises on logic were uncontroversial, his writings on physics, biology, and other libri naturales were regarded with some suspicion and even, briefly, banned.

    ↩︎
  4. Charitable Foundations—Posner’s Comment:

    But I would not place much weight on competition by universities and other recipients of charitable giving for foundation grants, since the recipients will compete whatever the source; universities compete for government grants just as they do for private grants. A perpetual charitable foundation, however, is a completely irresponsible institution, answerable to nobody. It competes neither in capital markets nor in product markets (in both respects differing from universities), and, unlike a hereditary monarch whom such a foundation otherwise resembles, it is subject to no political controls either. It is not even subject to benchmark competition—that is, evaluation by comparison with similar enterprises—except with regard to the percentage of its expenditures that go to administration (staff salaries and the like) rather than to donees. The puzzle for economics is why these foundations are not total scandals…A deeper puzzle relates to the leftward drift in foundation policies that Becker discusses, a drift enabled by the perpetual character of a foundation. (I agree that foundation staff work is attractive to liberals and that the children of the founders tend to be more liberal than their fathers. In both cases the main reason is probably that while the creators of the major foundations invariably are successful businessmen, and business values are conservative, foundation staff are not business-people and many children of wealthy business-people do not go into business either.) The puzzle is why conservatives establish perpetual foundations. Don’t they realize what is likely to happen down the road? The answer may be that the desire to perpetuate their name is greater than their desire to support conservative causes. In any event, a rule forbidding perpetual foundations would be paternalistic.

    Nevertheless, on basic libertarian/economic grounds, even Posner has to admit:

    I agree with Becker that the great strength of charitable foundations, and the principal justification for the tax exemption (though a secondary one is to offset the free-rider problem in charitable giving—if you give to my favorite charity, I benefit, and so the more you give the less I will be inclined to give), are that they bring about a decentralization of charitable giving, breaking what would otherwise be a governmental monopoly and thus reducing the play of politics in charity. In addition, however, to the extent that charitable giving substitutes for government spending, such giving (minus the tax benefits to the giver) represents a form of voluntary taxation, like state lotteries. Given the enormous skewness of incomes in today’s United States, it is good to encourage voluntary taxation of the wealthy…If rich people want to squander their money on feckless foundations, that should be their privilege. Moreover, to the extent that foundation spending substitutes for government spending, the comparison is of two inefficient forms of enterprise, and the foundations may be the less inefficient form.

    ↩︎
  5. 6: See on this M. Bayliss, “The Cult of Dead Kin in Assyria and Babylonia”, Iraq 35 (197351ya), 115–125; Brian B. Schmidt, Israel’s Beneficent Dead: Ancestor Cult and Necromancy in Ancient Israelite Religion and Tradition (Winona Lake, Ind.: Eisenbrauns, 199628ya), 201–215; Theodore Lewis, The Cult of the Dead in Ancient Israel and Ugarit (Atlanta: Scholars Press, 198935ya), 97. [More reading: Elizabeth M. Bloch-Smith’s 199232ya paper, “The Cult of the Dead in Judah: Interpreting the Material Remains”.]↩︎

  6. The following are the numbered references in the Fukuyama extract:

    • 39: Hugh Baker, Chinese Family and Kinship (New York: Columbia University Press, 197945ya), p. 26.

    • 5: Such rights were said to have spontaneously emerged during the California gold rush of 1849–1850174ya, when miners peacefully negotiated among themselves an allocation of the claims they had staked out. See Pipes, Property and Freedom, p. 91. This account ignores two important contextual factors: first, the miners were all products of an Anglo-American culture where respect for individual property rights was deeply embedded; second, these rights came at the expense of the customary rights to these territories on the part of the various indigenous peoples living there, which were not respected by the miners.

    • 6: Charles K. Meek, Land Law and Custom in the Colonies, 2d ed. (London: Frank Cass, 196856ya), p. 26.

    • 7: Quoted in Elizabeth Colson, “The Impact of the Colonial Period on the Definition of Land Rights,” in Victor Turner, ed., Colonialism in Africa 1870–1960. Vol. 3: “Profiles in Change: African Society and Colonial Rule” (New York: Cambridge University Press, 197153ya), p. 203.

    • 8: Meek, Land Law and Custom, p. 6.

    • 9: Colson, “Impact of the Colonial Period,” p. 200.

    • 10: Paul Vinogradoff, Historical Jurisprudence (London: Oxford University Press, 1923101ya), p. 327.

    • 11: Meek, Land Law and Custom, p. 17.

    • 12: Vinogradoff, Historical Jurisprudence, p. 322.

    • 13: For a discussion of the pros and cons of traditional land tenure, see Curtin, Holzknecht, and Larmour, Land Registration in Papua New Guinea.

    • 14: For a detailed account of the difficulties of negotiating property rights in Papua New Guinea, see Wimp, “Indigenous Land Owners and Representation in PNG and Australia.”

    • 15: The modern economic theory of property rights does not specify the social unit over which individual property rights extend for the system to be efficient. The unit is often presumed to be the individual, but families and firms are often posited as holders of property rights, whose constituent members are assumed to have common interests in the efficient exploitation of the resources they together own. See Jennifer Roback, “Exchange, Sovereignty, and Indian-Anglo Relations,” in Terry L. Anderson, ed., Property Rights and Indian Economies (Lanham, MD: Rowman and Littlefield, 199133ya).

    ↩︎
  7. In the non-Singularity scenarios where patient maintenance & revival trusts will need to last a minimum of a century—the first cryopreservation was in 196757ya, so already hitting the half-century mark—the contempt of descendants for their ancestors’ wishes becomes a serious concern.↩︎

  8. Financial funds or trusts which only last a few decades or a century or two may not be a large concern for the Clock of the Long Now or the Rosetta Disk, which are designed to function without human or financial input; but it is a serious concern for Long Bets (how do the predictions pay out if the funds are seized? or if the organization is forced to dissolve?) and a deal-breaker for any “Library of the Long Now” (what’s the point of a new library if it will only last a few centuries? Any efforts would then be far better invested in existing old institutions like the Library of Congress or the Bodleian Library).↩︎

  9. pg 65–66, James L. Kugel, In the Valley of the Shadow:

    One book I read during chemotherapy was the well-known study by the experimental psychologist Julian Jaynes, The Origin of Consciousness in the Breakdown of the Bicameral Mind (197648ya). Jaynes suggested that the human brain used to function somewhat differently in ancient times (that is, up to about 3,000 years ago). He noted that, while many aspects of language and related functions are located in the two parts of the brain’s left hemisphere known as Wernicke’s area and Broca’s area, the right-brain counterparts to these areas are nowadays largely dormant. According to Jaynes, however, those areas had been extremely important in earlier times, before humans began to perceive the world as we do now. Back then, he theorized, humans had an essentially “bicameral mind” that lacked the integrative capacities of the modern brain. Instead, its two halves functioned relatively independently: the left brain would obey what it perceived as “voices”, which in fact emanated from those now-dormant areas of the right brain. (In Jaynes’s formulation, the right hemisphere “organized admonitory experience and coded it into ‘voices’ which were then ‘heard’ by the left hemisphere.”) Although internally generated, those voices were thus perceived by the left brain as coming from outside. It is this situation that led to the belief in communications from the gods in ancient times, as well as the belief in lesser sorts of supernatural communicators: talking spirits and genies, muses who dictated poetry to the “inspired” poet, sacred rocks, trees, and other objects that brought word “from the other side”. When this bicameral mind faded out of existence and modern consciousness arose, prophecy likewise ceased and people suddenly no longer heard the gods telling them what to do.

    Jaynes’s theory attracted much attention when first promulgated: it answered a lot of questions in one bold stroke. But it was not long before other scholars raised [substantial], and eventually devastating, objections to his idea. To begin with, 3,000 years is a tiny speck of time on the scale of human evolution. How could so basic a change in the way our brains work have come about so recently? What is more, 3,000 years ago humans lived in the most varied societies and environments. Some societies were already quite sophisticated and diversified, while others then (and some still now) existed in the most rudimentary state; some humans lived in tropical forests, others in temperate climes, still others in snowy wastelands close to earth’s poles; and so forth. Could human brains in these most diverse circumstances all have changed so radically at—in evolutionary terms—the same instant? Certainly now our brains all seem to function in pretty much the same way, no matter where we come from; there are no apparent surviving exemplars of the bicameral mind that Jaynes postulated. What could have caused humanity to undergo this radical change in lockstep all over the earth’s surface? A ray from outer space?

    But if Jaynes’s idea has met with disapproval, the evidence he adduced is no less provocative. The problem of explaining such phenomena as the appearance and subsequent disappearance of prophecy in many societies (though certainly not all), along with the near-universal evidence of religion discussed earlier (with the widespread phenomenon of people communing with dead ancestors and/or gods—and hearing back from them), remains puzzling.

    ↩︎