In Defense of Individualist Culture

Epistemic Status: Pretty much serious and endorsed.

College-educated Western adults in the contemporary world mostly live in what I’d call individualist environments.

The salient feature of an individualist environment is that nobody directly tries to make you do anything.

If you don’t want to go to class in college, nobody will nag you or yell at you to do so. You might fail the class, but this is implemented through a letter you get in the mail or on a registrar’s website.  It’s not a punishment, it’s just an impersonal consequence.  You can even decide that you’re okay with that consequence.

If you want to walk out of a talk in a conference designed for college-educated adults, you can do so. You will never need to ask permission to go to the bathroom. If you miss out on the lecture, well, that’s your loss.

If you slack off at work, in a typical office-job environment, you don’t get berated. And you don’t have people watching you constantly to see if you’re working. You can get bad performance reviews, you can get fired, but the actual bad news will usually be presented politely.  In the most autonomous workplaces, you can have a lot of control over when and how you work, and you’ll be judged by the results.

If you have a character flaw, or a behavior that bothers people, your friends might point it out to you respectfully, but if you don’t want to change, they won’t nag, cajole, or bully you about it. They’ll just either learn to accept you, or avoid you. There are extremely popular advice columns that try to teach this aspect of individualist culture: you can’t change anyone who doesn’t want to change, so once you’ve said your piece and they don’t listen, you can only choose to accept them or withdraw association.

The basic underlying assumption of an individualist environment or culture is that people do, in practice, make their own decisions. People believe that you basically can’t make people change their behavior (or, that techniques for making people change their behavior are coercive and thus unacceptable.)  In this model, you can judge people on the basis of their decisions — after all, those were choices they made — and you can decide they make lousy friends, employees, or students.  But you can’t, or shouldn’t, cause them to be different, beyond a polite word of advice here and there.

There are downsides to these individualist cultures or environments.  It’s easy to wind up jobless or friendless, and you don’t get a lot of help getting out of bad situations that you’re presumed to have brought upon yourself. If you have counterproductive habits, nobody will guide or train you into fixing them.

Captain Awkward’s advice column is least sympathetic to people who are burdens on others — the depressive boyfriend who needs constant emotional support and can’t get a job, the lonely single or heartbroken ex who just doesn’t appeal to his inamorata and wants a way to get the girl.  His suffering may be real, and Captain Awkward will acknowledge that, but she’ll insist firmly that his problems are not others’ job to fix.  If people don’t like you — tough! They have the right to leave.

People don’t wholly “make their own decisions”.  We are, to some degree, malleable, by culture and social context. The behaviorist or sociological view of the world would say that individualist cultures are gravely deficient because they don’t put any attention into setting up healthy defaults in environment or culture.  If you don’t have rules or expectations or traditions about food, or a health-optimized cafeteria, you “can” choose whatever you want, but in practice a lot of people will default to junk.  If you don’t have much in the way of enforcement of social expectations, in practice a lot of people will default to isolation or antisocial behavior. If you don’t craft an environment or uphold a culture that rewards diligence, in practice a lot of people will default to laziness.  “Leaving people alone”, says this argument, leaves them in a pretty bad place.  It may not even be best described as “leaving people alone” — it might be more like “ripping out the protections and traditions they started out with.”

Lou Keep, I think, is a pretty good exponent of this view, and summarizer of the classic writers who held it. David Chapman has praise for the “sane, optimistic, decent” societies that are living in a “choiceless mode” of tradition, where people are defined by their social role rather than individual choices.  Duncan Sabien is currently trying to create a (voluntary) intentional community designed around giving up autonomy in order to be trained/social-pressured into self-improvement and group cohesion.  There are people who actively want to be given external structure as an aid to self-mastery, and I think their desires should be taken seriously, if not necessarily at face value.

I see a lot of writers these days raising problems with modern individualist culture, and it may be an especially timely topic. The Internet is a novel superstimulus, and it changes more rapidly, and affords people more options, than ever before.  We need to think about the actual consequences of a world where many people are in practice being left alone to do what they want, and clearly not all the consequences are positive.

But I do want to suggest some considerations in favor of individualist culture — that often-derided “atomized modern world” that most of us live in.

We Aren’t Clay

It’s a common truism that we’re all products of our cultural environment. But I don’t think people have really put together the consequences of the research showing that it’s not that easy to change people through environmental cues.

  • Behavior is very heritable. Personality, intelligence, mental illness, and social attitudes are all well established as being quite heritable.  The list of the top ten most replicated findings in behavioral genetics begins with “all psychological traits show significant and substantial genetic influence”, which Eric Turkheimer has called the “First Law of behavioral genetics.”  A significant proportion of behavior is also explained by “nonshared environment”, which means it isn’t genetic and isn’t a function of the family you were raised in; it could include lots of things, from peers to experimental error to individual choice.
  • Brainwashing doesn’t work. Cult attrition rates are high, and “brainwashing” programs of POWs by the Chinese after the Korean War didn’t result in many defections.
  • There was a huge boom in the 1990s and 2000s in “priming” studies — cognitive-bias studies purporting to show that seemingly minor changes in environment affected people’s behavior.  A lot of these findings didn’t replicate. People don’t actually walk slower when primed with words about old people. People don’t actually make different moral judgments when primed with words or videos of cleanliness or disgusting bathrooms.  Being primed with images of money doesn’t make people more pro-capitalist.  Girls don’t do worse on math tests when primed with negative stereotypes. Daniel Kahneman himself, who publicized many of these priming studies in Thinking, Fast and Slow, wrote an open letter warning priming researchers that they’d have to start replicating their findings or lose credibility.
  • Ego depletion failed to replicate as well; using willpower doesn’t make you too “tired” to use willpower later.
  • The Asch Conformity Experiment was nowhere near as extreme as casual readers generally think: on most trials, subjects did not change their answers to wrong ones to conform with the crowd; only 5% of subjects always conformed, and 25% never conformed.
  • The Sapir-Whorf Hypothesis has generally been found to be false by modern linguists: the language one speaks does not determine one’s cognition. For instance, people who speak a language that uses a single word for “green” and “blue” can still visually distinguish the colors green and blue.
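
One concrete way to see what “nonshared environment” means: in the classic ACE twin model, Falconer’s formulas decompose trait variance into additive genetics (A), shared environment (C), and nonshared environment plus error (E) from the correlations of identical and fraternal twins. A minimal sketch, with illustrative correlations rather than values from any particular study:

```python
# Falconer's formulas for the ACE variance decomposition.
# r_mz, r_dz: phenotypic correlations between identical (MZ)
# and fraternal (DZ) twin pairs. Illustrative numbers only.
r_mz, r_dz = 0.7, 0.45

h2 = 2 * (r_mz - r_dz)  # A: additive genetic variance ("heritability")
c2 = 2 * r_dz - r_mz    # C: shared (family) environment
e2 = 1 - r_mz           # E: nonshared environment + measurement error

print(round(h2, 3), round(c2, 3), round(e2, 3))  # 0.5 0.2 0.3
```

On these made-up numbers, 30% of the variance is “nonshared environment” — everything that makes identical twins raised together differ, from peers to chance to individual choice.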

Scott Alexander said much of this before, in Devoodooifying Psychology.  It’s been popular for many years to try to demonstrate that social pressure or subliminal cues can make people do pretty much anything.  This seems to be mostly wrong.  The conclusion you might draw from the replication crisis along with the evidence from behavioral genetics is “People aren’t that easily malleable; instead, they behave according to their long-term underlying dispositions, which are heavily influenced by inheritance.”  People may respond to incentives and pressures (the Milgram experiment replicated, for instance), but not to trivial external pressures, and they can actually be quite resistant to pressure to wholly change their lives and values (becoming a cult member or a Communist.)

Those who study culture think that we’re all profoundly shaped by culture, and to some extent that may be true. But not as much, or as easily, as many social scientists assume.  The idea of mankind as arbitrarily malleable is an appealing one to marketers, governments, therapists, or anyone who hopes that it’s easy to shift people’s behavior.  But this doesn’t seem to be true.  It might be worth rehabilitating the notion that people pretty much do what they’re going to do.  We’re not just swaying in the breeze, waiting for a chance external influence to shift us. We’re a little more robust than that.

People Do Exist, Pretty Much

People try to complicate the notion of “person” — what is a person, really? Do individuals even exist?  I would argue that a lot of this skepticism is less compelling than it sounds.

A lot of theorists suggest that people have internal psychological parts (Plato, Freud, Minsky, Ainslie) or are part of larger social wholes (Hegel, Heidegger, lots and lots of people I haven’t read).  But these, while suggestive, are metaphors and hypotheses. The basic, boring fact, usually too obvious to state, is that most of your behavior is proximately caused by your brain (except for reflexes, which are controlled by your spinal cord.)  Your behavior is mostly due to stuff inside your body; other people’s behavior is mostly due to stuff inside their bodies, not yours.  You do, in fact, have much more control over your own behavior than over others’.

“Person” is, in fact, a natural category; we see people walking around and we give them names and we have no trouble telling one person apart from another.

When Kevin Simler talks about “personhood” being socially constructed, he means a role, like “lady” or “gentleman”: the default assumptions that are made about people in a given context. This is a social phenomenon — of course it is, by design!  He’s not literally arguing that there is no such entity as Kevin Simler.

I’ve seen Buddhist arguments that there is no self, only passing mental states.  Derek Parfit has also argued that personal identity doesn’t exist.  I think that if you weaken the criterion of identity to statistical similarity, you can easily say that personal identity pretty much exists.  People pretty much resemble themselves much more than they resemble others. The evidence for the stability of personality across the lifespan suggests that people resemble themselves quite a bit, in fact — different timeslices of your life are not wholly unrelated.

Self-other boundaries can get weird in certain mental conditions: psychotics often believe that someone else is implanting thoughts inside their heads, people with DID have multiple personalities, and some kinds of autism involve a lot of suggestibility, imitation, and confusion about what it means to address another person.  So it’s empirically true that the sense of identity can get confused.

But that doesn’t mean that personal identity doesn’t usually work in the “normal” way, or that the normal way is an arbitrary convention. It makes sense to distinguish Alice from Bob by pointing to Alice’s body and Bob’s body.  It’s a distinction that has a lot of practical use.

If people do pretty much exist and have lasting personal characteristics, and are not all that malleable by small social or environmental influences, then modeling people as individual agents who want things isn’t all that unreasonable, even if it’s possible for people to have inconsistent preferences or be swayed by social pressure.

And cultural practices which acknowledge the reality that people exist — for example, giving people more responsibility for their own lives than they have over other people’s lives — therefore tend to be more realistic and attainable.

 

How Ya Gonna Keep Em Down On The Farm

Traditional cultures are hard to keep, in a modern world.  To be fair, pro-traditionalists generally know this.  But it’s worth pointing out that ignorance is inherently fragile.  As Lou Keep points out, beliefs that magic can make people immune to bullets can be beneficial, as they motivate people to pull together and fight bravely, and thus win more wars. But if people find out the magic doesn’t work, all that benefit gets lost.

Is it then worth protecting gri-gri believers from the truth?  Or protecting religious believers from hearing about atheism?  Really? 

The choiceless mode depends on not being seriously aware that there are options outside the traditional one.  Maybe you’ve heard of other religions, but they’re not live options for you. Your thoughts come from inside the tradition.

Once you’re aware that you can pick your favorite way of life, you’re a modern. Sorry. You’ve got options now.

Which means that you can’t possibly go back to a premodern mindset unless you brutally suppress information about the outside world, and usually not even then.  Thankfully, people still get out.

Whatever may be worth preserving or recreating about traditional cultures, it’s going to have to be aspects that don’t need to be maintained by forcible ignorance.  Otherwise it’ll have a horrible human cost and be ineffective.

Independence is Useful in a Chaotic World

Right now, anybody trying to build a communitarian alternative to modern life is in an underdog position.  If you take the Murray/Putnam thesis seriously — that Americans have less social cohesion now than they did in the mid-20th century, and that this has had various harms — then that’s the landscape we have to work with.

Now, that doesn’t mean that communitarian organizations aren’t worth building. I participate in a lot of them myself (group houses, alloparenting, community events, mutual aid, planning a homeschooling center and a baugruppe).  Some Christians are enthusiastic about a very different flavor of community participation and counterculture-building called the Benedict Option, and I’m hoping that will work out well for them.

But, going into such projects, you need to plan for the typical failure modes, and the first one is that people will flake a lot.  You’re dealing with moderns! They have options, and quitting is an option.

The first antidote to flaking that most people think of — building people up into a frenzy of unanimous enthusiasm so that it doesn’t occur to them to quit — will probably result in short-lived and harmful projects.

Techniques designed to enhance group cohesion at the expense of rational deliberation — call-and-response, internal jargon and rituals, cults of personality, suppression of dissent  — will feel satisfying to many who feel the call of the premodern, but aren’t actually that effective at retaining people in the long term.  Remember, brainwashing isn’t that strong.

And we live in a complicated, unstable world.  When things break, as they will, you’d like the people in your project to avoid breaking.  That points in the direction of  valuing independence. If people need a leader’s charisma to function, what are they going to do if something happens to the leader?

Rewarding Those Who Can Win Big

A traditionalist or authoritarian culture can help people by guarding against some kinds of failure (families and churches can provide a social safety net, rules and traditions can keep people from making mistakes that ruin their lives), but it also constrains the upside, preventing people from creating innovations that are better than anything within the culture.

An individualist culture can let a lot of people fall through the cracks, but it rewards people who thrive on autonomy. For every abandoned and desolate small town with shrinking economic opportunity, there were people who left that small town for the big city, people whose lives are much better for leaving.  And for every seemingly quaint religious tradition, there are horrible abuse scandals under the surface.  The freedom to get out is extremely important to those who aren’t well-served by a traditional society.

It’s not that everything’s fine in modernity. If people are getting hurt by the decline of traditional communities — and they are — then there’s a problem, and maybe that problem can be ameliorated.

What I’m saying is that there’s a certain kind of justice that says “at the very least, give the innocent and the able a chance to win or escape; don’t trade their well-being for that of people who can’t cope well with independence.”  If you can’t end child abuse, at least let minors run away from home. If you can’t give everybody a great education, at least give talented broke kids scholarships.  Don’t put a ceiling on anybody’s success.

Immigrants and kids who leave home by necessity (a lot of whom are LGBT and/or abused) seem to be rather overrepresented among people who make great creative contributions.  “Leaving home to seek your freedom and fortune” is kind of the quintessential story of modernity.  We teach our children songs about it.  Immigration and migration are where a lot of the global growth in wealth comes from.  It was my parents’ story — an immigrant who came to America and a small-town girl who moved to the city.  It’s also inherently a pattern that disrupts traditions and leaves small towns with shrinking populations and failing economies.

Modern, individualist cultures don’t have a floor — but they don’t have a ceiling either. And there are reasons for preferring not to allow ceilings. There’s the justice aspect I alluded to before — what is “goodness” but the ability to do valuable things, to flourish as a human? And if some people are able to do really well for themselves, isn’t limiting them in effect punishing the best people?

Now, this argument isn’t an exact fit for real life.  It’s certainly not the case that everything about modern society rewards “good guys” and punishes “bad guys”.

But it works as a formal statement. If the problem with choice is that some people make bad choices when not restricted by rules, then the problem with restricting choice is that some people can make better choices than those prescribed by the rules. The situations are symmetrical, except that in the free-choice scenario, the people who make bad choices lose, and in the restricted scenario, the people who make good choices lose.  Which one seems more fair?

There’s also the fact that in the very long run, only existence proofs matter.  Does humanity survive? Do we spread to the stars?  These questions are really about “do at least some humans survive?”, “do at least some humans develop such-and-such technology?”, etc.  That means allowing enough diversity or escape valves or freedom so that somebody can accomplish the goal.  You care a lot about not restricting ceilings.  Sure, most entrepreneurs aren’t going to be Elon Musk or anywhere close, but if the question is “does anybody get to survive/go to Mars/etc”, then what you care about is whether at least one person makes the relevant innovation work.  Playing to “keep the game going”, to make sure we actually have descendants in the far future, inherently means prioritizing best-case wins over average-case wins.
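
That “best-case over average-case” logic can be made concrete with a toy calculation (all numbers here are illustrative assumptions, not claims about real ventures): if each of n independent attempts succeeds with probability p, the chance that at least one succeeds is 1 − (1 − p)^n, which climbs toward certainty as n grows even while the typical attempt still fails.

```python
# Toy model: chance that at least one of n independent long-shot
# attempts succeeds, given per-attempt success probability p.
# Illustrative numbers only.
def p_any_success(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

p = 0.001  # each individual attempt is very unlikely to pay off
for n in (1, 100, 1000, 5000):
    print(n, round(p_any_success(p, n), 3))
# The average attempt still fails, but the probability that
# *somebody* succeeds rises from 0.1% to over 99%.
```

On an existence-proof criterion — “does anybody make it?” — that last figure is the one that matters, which is why permitting many diverse attempts beats optimizing the average attempt.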

Upshots

I’m not arguing that it’s never a good idea to “make people do things.”  But I am arguing that there are reasons to be hesitant about it.

It’s hard to make people do what you want; you don’t actually have that much influence in the long term; people in their healthy state generally are correctly aware that they exist as distinct persons; surrendering judgment or censoring information is pretty fragile and unsustainable; and restricting people’s options cuts off the possibility of letting people seek or create especially good new things.

There are practical reasons why “leave people alone” norms became popular, despite the fact that humans are social animals and few of us are truly loners by temperament.

I think individualist cultures are too rarely explicitly defended, except with ideological buzzwords that don’t appeal to most people. I think that a lot of pejoratives get thrown around against individualism, and I’ve spent a lot of time getting spooked by the negative language and not actually investigating whether there are counterarguments.  And I think counterarguments do actually exist, and discussion should include them.

 

 


33 thoughts on “In Defense of Individualist Culture”

    • I like that Edge response! Off topic, but the only time I’d heard of him before is when his ‘Geometric Unity’ physics idea got some media hype in 2013, but then no paper ever appeared and I can’t find much about it more recently.

      Do either of you have any idea if he’s written anything on it since? I’m just curious as it sounded intriguing.

      • Weinstein’s theory seems to be a gauge theory of gravity or a topological field theory, with enough extra details that he can hope it’s a theory of everything. This is a known kind of idea, there are other people who work on it. It should even have some sort of relation to string theory’s landscape of solutions – though possibly it is in the “swampland” of field theories that cannot exist as limits of string theory. That should imply consistency problems at very high energies.

    • mitchellporter: thanks, yeah that’s roughly the impression I got. I’d be very surprised if something like that worked without a striking new *conceptual* idea, given the huge number of things that have been tried going way back to the early unified field theories. Would still be nice to see the details though.

  1. [The following seems really important to communicate, so let me know if it seems unclear from my comment, and I’ll be happy to discuss it further.]

    You use arguments from the cluster of “studies on priming don’t replicate” and “personality is stable and largely genetically determined” to support the thesis that “people in an individualist environment will make decisions according to their own brains”.

    However, this logic seems to be based on a weird one-dimensional model in which “priming works” and “people make rational decisions in their lives” are opposites.

    Instead, I propose to add at least one more axis, labeled “manipulation power”. Your arguments are excellent support for a claim that certain ways of manipulating/influencing people are low on the manipulation power scale (e.g. MP(priming) ~= 0). But they are NOT AT ALL support for the stronger claim that people in a modern Western-style individualist environment are typically subject to manipulation power below their internal manipulation resistance level.

    The claim you wanted to prove is “people in general are not as easily manipulated as you think”.
    But your arguments support only “these specific methods of manipulation, which are obviously not the ones anyone is actually worried about, are not effective at manipulating people”.

  2. While I mostly agree with your conclusion, I don’t think the evidence you cite for it is as strong as you present it. Most of the studies that lead us to conclude that much of the variance in behavior is genetic and non-shared environment (i.e., not parenting) are conducted within a single society within a single time period. Which is to say, biology explains a lot of the variance we observe, but the variance we observe may be very small compared to the variance that could be observed if there were not social factors decreasing variance.

    I’m not aware of any twin studies where a good sample size of both identical and fraternal twins are split up with some raised in Sweden and some raised in the Democratic Republic of Congo. And I know with high confidence that there are no twin studies where some twins are raised in modern England and some twins are raised in England in 1200 C.E.

    We have good knowledge of means and variances and can explain much of the variance given one set of social conditions. We have only weaker knowledge of means and variances across places and times. And we lack the tools to confidently explain variance across places and times. So while we may be able to confidently say that you or I can’t get someone to behave differently than they otherwise would have living in today’s world, we have a harder time supporting the claim that society couldn’t get everyone to behave differently by substantially changing the world’s conditions.

    • Yep! I don’t have evidence that culture doesn’t matter; only that *singlehandedly* trying to change a person (as parents do to their children) might not matter.

  3. Another good argument for individualist culture: what are the chances that the one collective culture you’re subject to is good for you? Granted, it can’t be so bad that it kills its members, because then it can’t survive, but that’s a low bar. It’s not even close to optimized for what’s best for you, and it inhibits you from optimizing for yourself. It’s like if there were no product variety (you’re stuck with Generic Chair, Generic Shirt, etc.), but even worse. And if collectivism is best for you, within individualist culture you can try to build one for yourself with like-minded people, but the reverse is a lot harder.

    Also, looking at revealed preferences, a lot more people want to move from collectivist to individualist cultures than vice versa. On an international level, wealth is an obvious confounding factor, but looking within the US, for example, a lot more people seem to want to escape small towns because they’re stifling than move to them because of their social technologies.

  4. > I think individualist cultures are too rarely explicitly defended, except with ideological buzzwords that don’t appeal to most people.

    Are “classical liberalism” and “libertarianism” just buzzwords? They name an intellectual tradition that explicitly defends individualism against the one thing with the most power to infringe on it: the government. Serious liberalism/libertarianism isn’t about tax rates or tariffs; it’s about individual flourishing through individual liberty, dignity, and individual responsibility.

    Even the arguments in this area that seem purely economic mirror the general ideas that you presented. Concepts like free trade, comparative advantage and specialization are basically “let people do what they do best and let the best ones win” with some numbers behind it. And the core justification for libertarian policies is the same as for individualism: people know what they want and how to get it for themselves better than for others.

    The reason that “liberty” is less appealing to people today than in the days of Adam Smith and John Stuart Mill is that Smith and Mill mostly got what they wanted. The main fights today are on the margins, things like occupational licensing and school choice. Individualism doesn’t get defended (in the US) because it won.

    • Well, that’s true and reassuring, but people get born every year, and haven’t heard of any of this stuff. And it isn’t talked about, while “omg modernity is horrible” *is*, and so I worry.

      • What do you make of the fact that individualism is much better for capable individuals? If I’m smart, healthy and rich, I can make my own choices, choose my own communities and protect myself. If my personal ceiling is very low, I wouldn’t mind a collectivist system that lowers everyone’s ceiling.

        This may be why mass-culture products (e.g. a TV sitcom) reassure people that modernity sucks, while media aimed at high-ceiling people (e.g. a college commencement speech) tell people that they can forge their own destiny. The average reader of your blog is probably even more capable than an average college graduate, we don’t mind reading that individualism is awesome 🙂

  5. “A traditionalist or authoritarian culture can help people by guarding against some kinds of failure (families and churches can provide a social safety net, rules and traditions can keep people from making mistakes that ruin their lives), but it also constrains the upside, preventing people from creating innovations that are better than anything within the culture.”

    Why do innovations like nuclear bombs, turbopump rockets, and jet engines emerge from structures that are most certainly ‘authoritarian’? I think you could make a much better argument that real revolutionary advances depend on authoritarianism, than that authoritarianism prevents such advances.

    Once I would have made and defended this argument, but no longer. I think the correlation is ultimately spurious – people do the most radical innovation under authoritarian circumstances because of the pressure that led to the authoritarian circumstance, rather than the authoritarian circumstance itself.

    Centralization or distribution of authority is no more a question of correct or incorrect than turning right or turning left. It depends what’s to your right or left, it depends who the power will be distributed to or in whose hands it will accrete. No guiding principle has been adequately explained and defended, accounting for the vast diversity in success and failure of all sorts of human social structures. Political theory is in the same realm as medicine was until the last century or so – snake oil peddling that at best will not hurt you.

    That being the case, I end up partly on your side, though for very different reasons. If there’s no evidence that complex hierarchies do us a damn bit of good, we might as well dispense with them. I think it’s perfectly clear that we won’t see significant benefits from doing so (just like you won’t benefit from throwing out your healing-energy crystal pendant) but I really don’t like things that pretend to be useful yet aren’t.

    • Centralization or distribution of authority

      I don’t think this is really about centralization or distribution of authority; you can have an anti-individualist culture without centralized authority.

      • Good point – you can have a (relatively) distributed authority scheme without being individualist. However, all individualist schemes have distributed authority (right down to the individual.) Even vesting deciding power in the heads of households is concentrating power compared to that!

  6. Overall, good essay, I enjoyed reading it.

    > There’s also the fact that in the very long run, only existence proofs matter.

    In the long run we are all dead. I see no reason to prioritize the far future, or to value the continuation of “humanity” beyond its inhabitants. Ceteris paribus, I would rather give 1000 extant people one extra year of happy life than one extant person 900. Similarly, ceteris paribus, I do not value us having descendants, because I believe in the Procreation Asymmetry.

  7. Not saying that their approach is necessarily the best one or one that should be emulated, but it’s worth noting, in response to many of your comments about traditional cultures being hard to maintain in modernity, that the Amish seem to be doing pretty well.

    http://www.daviddfriedman.com/Academic/Course_Pages/legal_systems_very_different_12/Book_Draft/Systems/AmishChapter.html :

    > In an earlier chapter, I suggested that in North America toleration might eventually destroy the status of gypsies as self-governing communities by making it too easy for unhappy or ostracized members to defect. Along similar lines, it is arguable that the emancipation of European Jews, starting in the late 18th century, was responsible for the decline of the Jewish communities as distinct and effectively self-ruling polities. Yet the Amish have maintained their identity, culture, and ordnung, enforcing the latter by the threat of ostracism, despite the lack of any clear barrier to prevent unhappy or excommunicated members from deserting. Such desertion is made easier, in the Amish case, by the existence of Mennonite communities, similar to the Amish but less strict, which Amish defectors can and sometimes do join.

    > A critic of the Amish might argue that their upbringing, with schooling ending at eighth grade, leaves potential defectors unqualified for life in the modern world; the obvious response is that there are a lot of jobs in the modern world for which the willingness to work and the training produced by an apprenticeship starting at age fourteen are better qualifications than a high school diploma. As some evidence of the adequacy of Amish education, Amish seem to do quite well at starting and running their own small scale businesses.31

    > One might more plausibly suggest that a social system in which courting your future mate may start as early as fourteen leaves many young people locked into a future marriage well before the point at which they have to decide whether or not to accept the Ordnung and commit themselves to the Amish lifestyle—and it is a future marriage with a spouse raised Amish. It would be interesting to know whether, when Amish do choose to leave prior to baptism, they usually do it one by one or in couples.

    > One could also argue that the close bonds of Amish families create a form of lock-in. Social interaction between committed Amish and relatives who have chosen not to commit is not forbidden—shunning applies only to those who have sworn to obey the Ordnung and been baptised, but then fail to live up to their commitment—but given how much of the pattern of living of the Amish is determined by their religion and culture, refusing to commit must create a substantial barrier. The barrier is higher still for those who have been baptized, and so would face shunning if they left the church.32

    > Finally, one might interpret the low defection rate as evidence of successful indoctrination, not only into the principles of Amish life but into the negative view held by the Amish of the lives lived by non-Amish.33 Reading books on the Amish, all positive, all written by sympathizers,34 one is struck by how dark their picture of the outside world is. It is a world where people spend most of their efforts in competitive endeavor and display, in keeping up with the Joneses, where lives are divided among the almost wholly separate circles of work, family, and church, where little meaningful happens or can happen, a world of boredom and alienation.

    > There is, of course, one other possibility. Perhaps the Amish are correct in believing that they have a superior life-style, as judged by most of those who have lived it and observed the alternative—albeit a life style superior only for those who have had the good fortune to be brought up in it.

  8. A lot of theorists suggest that people have internal psychological parts (Plato, Freud, Minsky, Ainslie) or are part of larger social wholes (Hegel, Heidegger, lots and lots of people I haven’t read). But these, while suggestive, are metaphors and hypotheses. The basic, boring fact, usually too obvious to state, is that most of your behavior is proximately caused by your brain…

    This may be tangential to your main point, but: I can assure you that Minsky at least would be in violent agreement with the second sentence and would not view his theory of mental parts as in conflict with it. His Society of Mind theory is a bunch of metaphors and hypotheses, but they are in service of the goal of developing a fully mechanical / computational model of the mind. The same holds true for Freud and Ainslie, as far as I understand them. The question they are trying to answer is, given that behavior is produced by the brain, and the brain is very complex, how can we break that complexity down into interacting modules to understand it? Some of these modules may appear to have partially independent goals and agency, but that doesn’t mean they aren’t supposed to be thoroughly mechanical.

    > If people do pretty much exist… then modeling people as individual agents who want things isn’t all that unreasonable

    The fact that people may be made of parts, some of which are in conflict, does not mean that people don’t exist, any more than the fact that chairs are made out of parts means chairs don’t exist.

    Modelling people as people with simple wants is perfectly fine as a first-order folk theory. But it doesn’t answer the interesting scientific questions of where these wants come from and how they are managed. I think rationalism tends to try to collapse the boundary between scientific and folk theories, in the effort to apply more scientific rigor to everyday life. But sometimes that boundary is useful. In everyday life, simple politeness and practicality obligate one to treat a person as a unitary entity, but for other purposes, like psychotherapy or AI, you have to be willing to deconstruct and reconstruct the underlying machinery.

    • I have seen people treat “you are made of parts and influenced by society” as a reason not to take a person’s preferences seriously, and instead to try to “operate on” that person mechanically. I don’t like that, and it’s not entirely a strawman: I’ve seen it happen. But I don’t deny that theories about “parts” can be meaningful and useful.

      • What about a diversitarian ideology that values diversity in and of itself, like Alexander’s Archipelago theory?

  9. This is good stuff. I agree with almost all of your observations, but I actually am not a fan of individualist societies, for exactly the reasons you cite as downsides.
    I actually think it is more important that there’s a higher floor, even if it means a lower ceiling. There are two things that strongly color my preference:
    1) I am a traditionalist Catholic (and thus distributist)
    2) I’m also a borderline case that almost fell through the cracks
    I was really, really lazy as a child and young adult. I would have been content to collect welfare, play video games all day, and have no friends. I’m not naturally good at reading social cues, have some leanings toward misanthropy, and am very introverted. Though I’m of above-average intelligence and have a knack for mathematics and logic, I would have been a pathetic drain on society. If it weren’t for a strong and persistent family and Catholic social structure, that’s where I’d have ended up, and I’d likely have committed suicide by my late twenties. Instead, I’m a successful software engineer with a wife and kids.
    My Catholic faith also means I believe in the ontological dignity of man, and that part of the way we’re meant to live that dignity is through work. After reading Chesterton, Fanfani, Aquinas, and others, I’ve come to the conclusion that society ought to be ordered to minimize the number of people who fall through the cracks. This isn’t communist-style “from each according to his abilities, to each according to his needs”. It’s more of a broad conceptual focus on the health of the community: there will be richer and poorer, better and worse adapted, but there will also be, through many complicated mechanisms both social and legal, protections that keep that floor from ever getting too low.
    I do agree with you that there has to be a way to leave and opt out, at least to some degree, otherwise you get abuses, unstable cults of personality, etc.
    My big beef with the modern world is that SJW types seem to have an interest in keeping any serious Benedict Option type communities from forming or having any degree of autonomy, and in forcing us all to genuflect at the altar of their weird cause of the month.

  10. > Modern, individualist cultures don’t have a floor — but they don’t have a ceiling either.

    Of course they have a ceiling, and it seems rather obvious to me at least that the ceiling is lower than that of a community with a richer social structure and shared tradition. There is the saying “We don’t build cathedrals anymore,” meaning that moderns shun projects that of necessity take longer than one lifetime to complete. Elon Musk may be the paragon of individual accomplishment, but even in his story there is a hint of the ceiling. What if, as is not an implausible outcome, we are unable to solve the radiation problem, and a trip to Mars promises a wealth of scientific data but also certain death? What individualist would sign up for that? Wouldn’t it require a commitment to the scientific project, and trust that other participants in the scientific tradition would mine that data and make something useful out of the sacrifice? Even that is a rather exotic example, one that promises things like individual fame and glory. There are many more pedestrian social roles that the preponderance of societies throughout history have needed to fill. Does a truly individualist society have a credible military? Or stay-at-home mothers? Forgive me for asking such basic questions, but it seems to me that the argument you are making is that there is no loss of upside in telling an ant colony to act as if it’s every ant for itself.

  11. That was a good read.

    Since our behavior is determined largely by our genetics, doesn’t it make sense to make sure there is no floor, so that those who aren’t autonomous don’t get a chance to reproduce? Yes, I’m talking about social Darwinism.
