College-educated Western adults in the contemporary world mostly live in what I’d call individualist environments.

The salient feature of an individualist environment is that nobody directly tries to make you do anything.

If you don’t want to go to class in college, nobody will nag you or yell at you to do so. You might fail the class, but this is implemented through a letter you get in the mail or a notice on a registrar’s website. It’s not a punishment; it’s just an impersonal consequence. You can even decide that you’re okay with that consequence.

If you want to walk out of a talk at a conference designed for college-educated adults, you can do so. You will never need to ask permission to go to the bathroom. If you miss out on the lecture, well, that’s your loss.

If you slack off at work, in a typical office-job environment, you don’t get berated. And you don’t have people watching you constantly to see if you’re working. You can get bad performance reviews, you can get fired, but the _actual bad news_ will usually be presented politely. In the most autonomous workplaces, you can have a lot of control over when and how you work, and you’ll be judged by the results.

If you have a character flaw, or a behavior that bothers people, your friends might point it out to you respectfully, but if you don’t want to change, they won’t nag, cajole, or bully you about it. They’ll just either learn to accept you or avoid you. There are extremely popular advice columns that try to teach this aspect of individualist culture: you can’t change anyone who doesn’t want to change, so once you’ve said your piece and they don’t listen, you can only choose to accept them or withdraw association.

The basic underlying assumption of an individualist environment or culture is that people do, in practice, make their own decisions. People believe that you basically can’t make people change their behavior (or that techniques for making people change their behavior are coercive and thus unacceptable). In this model, you can judge people on the basis of their decisions — after all, those were choices they made — and you can decide they make lousy friends, employees, or students. But you can’t, or shouldn’t, _cause_ them to be different, beyond a polite word of advice here and there.

There are downsides to these individualist cultures or environments. It’s easy to wind up jobless or friendless, and you don’t get a lot of help getting out of bad situations that you’re presumed to have brought upon yourself. If you have counterproductive habits, nobody will guide or train you into fixing them.

Captain Awkward’s advice column is _least sympathetic_ to people who are burdens on others — the depressive boyfriend who needs constant emotional support and can’t get a job, the lonely single or heartbroken ex who just doesn’t appeal to his _innamorata_ and wants a way to get the girl. His suffering may be real, and she’ll acknowledge that, but she’ll insist firmly that _his problems are not others’ job to fix_. If people don’t like you — tough! They have the right to leave.

People don’t _wholly_ “make their own decisions”. We are, to some degree, malleable by culture and social context. The behaviorist or sociological view of the world would say that individualist cultures are _gravely_ deficient because they don’t put any attention into setting up healthy defaults in environment or culture. If you don’t have rules or expectations or traditions about food, or a health-optimized cafeteria, you “can” choose whatever you want, but in practice a lot of people will default to junk. If you don’t have much in the way of enforcement of social expectations, in practice a lot of people will default to isolation or antisocial behavior. If you don’t craft an environment or uphold a culture that rewards diligence, in practice a lot of people will default to laziness. “Leaving people alone”, says this argument, leaves them in a pretty bad place. It may not even be best described as “leaving people alone” — it might be more like “ripping out the protections and traditions they started out with.”

Lou Keep, I think, is a pretty good exponent of this view, and summarizer of the classic writers who held it. David Chapman has praise for the “sane, optimistic, decent” societies that are living in a “choiceless mode” of tradition, where people are defined by their social role rather than individual choices. Duncan Sabien is currently trying to create a (voluntary) intentional community designed around giving up autonomy in order to be trained/social-pressured into self-improvement and group cohesion. There are people who actively want to be given external structure as an aid to self-mastery, and I think their desires should be taken seriously, if not necessarily at face value.

I see a lot of writers these days raising problems with modern individualist culture, and it may be an especially timely topic. The Internet is a novel superstimulus, and it changes more rapidly, and affords people more options, than ever before. We need to think about the actual consequences of a world where many people _are_ in practice being left alone to do what they want, and clearly not all the consequences are positive.

But I do want to suggest some considerations in favor of individualist culture — that often-derided “atomized modern world” that most of us live in.

We Aren’t Clay

It’s a truism that we’re all products of our cultural environment. But I don’t think people have really absorbed the consequences of the research showing that it’s not that easy to change people through environmental cues.

  • Behavior is very heritable. Personality, intelligence, mental illness, and social attitudes are all well established as being quite heritable. The list of the top ten most replicated findings in behavioral genetics starts with “all psychological traits show significant and substantial genetic influence”, which Eric Turkheimer has called the “First Law of behavioral genetics.” A significant proportion of behavioral variation is also explained by “nonshared environment”, which means it isn’t genetic and isn’t a function of the family you were raised in; it could include lots of things, from peers to experimental error to individual choice. (A rough sketch of this variance decomposition appears after this list.)
  • Brainwashing doesn’t work. Cult attrition rates are high, and “brainwashing” programs of POWs by the Chinese after the Korean War didn’t result in many defections.
  • There was a huge boom in the 1990s and 2000s in “priming” studies — cognitive-bias studies that appeared to show that seemingly minor changes in environment affected people’s behavior. A lot of these findings didn’t replicate. People don’t actually walk slower when primed with words about old people. People don’t actually make different moral judgments when primed with words or videos of cleanliness or disgusting bathrooms. Being primed with images of money doesn’t make people more pro-capitalist. Girls don’t do worse on math tests when primed with negative stereotypes. Daniel Kahneman himself, who publicized many of these priming studies in _Thinking, Fast and Slow_, wrote an open letter telling priming researchers that they’d have to start replicating their findings or lose credibility.
  • Ego depletion failed to replicate as well; using willpower doesn’t make you too “tired” to use willpower later.
  • The Asch Conformity Experiment was nowhere near as extreme as casual readers generally think: the majority of people _didn’t_ change their answers to wrong ones to conform with the crowd, only 5% of people always conformed, and 25% of people never conformed.
  • The Sapir-Whorf Hypothesis has generally been found to be false by modern linguists: the language one speaks does not determine one’s cognition. For instance, people who speak a language that uses a single word for “green” and “blue” can still visually distinguish the colors green and blue.
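
A footnote on the heritability item above (as promised): behavioral geneticists standardly split the variance of a trait into genetic, shared-environment, and nonshared-environment components. Here is a minimal sketch of that decomposition, the so-called ACE model; the illustrative magnitudes in the comments are rough textbook figures, not numbers from this post’s sources:

```latex
% ACE model: the standard variance decomposition in behavioral genetics.
% V_P = total phenotypic variance of a trait across a population,
% V_A = additive genetic variance, V_C = shared (family) environment,
% V_E = nonshared environment (everything else, incl. measurement error).
\[
  V_P = V_A + V_C + V_E, \qquad h^2 = \frac{V_A}{V_P}
\]
% The "First Law" is the claim that h^2 > 0 for essentially every
% psychological trait. Twin studies typically estimate h^2 well above
% zero (roughly 0.4-0.6 for many personality and cognitive traits),
% with V_C surprisingly small and V_E carrying much of the remainder.
```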

Scott Alexander said much of this before, in Devoodooifying Psychology. It’s been popular for many years to try to demonstrate that social pressure or subliminal cues can make people do pretty much anything. This seems to be mostly wrong. The conclusion you might draw from the replication crisis along with the evidence from behavioral genetics is “People aren’t that easily malleable; instead, they behave according to their long-term underlying dispositions, which are heavily influenced by inheritance.” People may respond to incentives and pressures (the Milgram experiment replicated, for instance), but not to _trivial_ external pressures, and they can actually be quite resistant to pressure to wholly change their lives and values (becoming a cult member or a Communist).

Those who study culture think that we’re all profoundly shaped by culture, and to some extent that may be true. But not as much or as easily as social scientists think. The idea of mankind as arbitrarily malleable is an appealing one to marketers, governments, therapists, or anyone who hopes that it’s easy to shift people’s behavior. But this doesn’t seem to be true. It might be worth rehabilitating the notion that _people pretty much do what they’re going to do_. We’re not just swaying in the breeze, waiting for a chance external influence to shift us. We’re a little more robust than that.

People Do Exist, Pretty Much

People try to complicate the notion of “person” — what is a person, really? Do individuals even exist? I would argue that a lot of this skepticism is less true than it sounds.

A lot of theorists suggest that people have internal psychological parts (Plato, Freud, Minsky, Ainslie) or are part of larger social wholes (Hegel, Heidegger, lots and lots of people I haven’t read). But these, while suggestive, are metaphors and hypotheses. The basic, boring fact, usually too obvious to state, is that most of your behavior is proximately caused by your brain (except for reflexes, which are controlled by your spinal cord). Your behavior is mostly due to stuff inside your body; other people’s behavior is mostly due to stuff inside _their_ bodies, not yours. You do, in fact, have much more control over your own behavior than over others’.

“Person” is, in fact, a natural category; we see people walking around and we give them names and we have no trouble telling one person apart from another.

When Kevin Simler talks about “personhood” being socially constructed, he means a role, like “lady” or “gentleman”: the default assumptions that are made about people in a given context. This is a social phenomenon — of course it is, by design! He’s not literally arguing that there is no such entity as Kevin Simler.

I’ve seen Buddhist arguments that there is no self, only passing mental states. Derek Parfit has also argued that personal identity doesn’t exist. I think that if you weaken the criterion of identity to statistical similarity, you can easily say that personal identity _pretty much_ exists. People pretty much resemble themselves much more than they resemble others. The evidence for the stability of personality across the lifespan suggests that people resemble themselves quite a bit, in fact — different timeslices of your life are _not_ wholly unrelated.

Self-other boundaries can get weird in certain mental conditions: psychotics often believe that someone else is implanting thoughts inside their heads, people with DID have multiple personalities, and some kinds of autism involve a lot of suggestibility, imitation, and confusion about what it means to address another person. So it’s empirically true that the _sense of identity_ can get confused.

But that doesn’t mean that personal identity doesn’t usually work in the “normal” way, or that the normal way is an arbitrary convention. It makes sense to distinguish Alice from Bob by pointing to Alice’s body and Bob’s body. It’s a distinction that has a lot of practical use.

If people do pretty much exist and have lasting personal characteristics, and are not all that malleable by small social or environmental influences, then modeling people as individual agents who want things isn’t all that unreasonable, even if it’s possible for people to have inconsistent preferences or be swayed by social pressure.

And cultural practices which acknowledge the reality that people exist — for example, giving people more responsibility for their own lives than they have over other people’s lives — therefore tend to be more realistic and attainable.

How Ya Gonna Keep ’Em Down On The Farm

Traditional cultures are hard to keep in a modern world. To be fair, pro-traditionalists generally know this. But it’s worth pointing out that ignorance is inherently fragile. As Lou Keep points out, beliefs that magic can make people immune to bullets can be beneficial, as they motivate people to pull together and fight bravely, and thus win more wars. But if people find out the magic doesn’t work, all that benefit gets lost.

Is it then worth protecting gri-gri believers from the truth? Or protecting religious believers from hearing about atheism? _Really?_

The choiceless mode depends on not being seriously aware that there are options outside the traditional one. Maybe you’ve heard of other religions, but they’re not live options for you. Your thoughts come from inside the tradition.

Once you’re aware that you can _pick your favorite_ way of life, you’re a modern. Sorry. You’ve got options now.

Which means that you can’t possibly go back to a premodern mindset unless you are _brutally_ repressive about information from the outside world, and usually _not even then_. Thankfully, people still get out.

Whatever may be worth preserving or recreating about traditional cultures, it’s going to have to be aspects that don’t need to be maintained by forcible ignorance. Otherwise it’ll have a horrible human cost _and_ be ineffective.

Independence is Useful in a Chaotic World

Right now, anybody trying to build a communitarian alternative to modern life is in an underdog position. If you take the Murray/Putnam thesis seriously — that Americans have less social cohesion now than they did in the mid-20th century, and that this has had various harms — then that’s the landscape we have to work with.

Now, that doesn’t mean that communitarian organizations aren’t worth building. I participate in a lot of them myself (group houses, alloparenting, community events, mutual aid, planning a homeschooling center and a baugruppe). Some Christians are enthusiastic about a very different flavor of community participation and counterculture-building called the Benedict Option, and I’m hoping that will work out well for them.

But, going into such projects, you need to plan for the typical failure modes, and the first one is that people will flake a lot. You’re dealing with moderns! They have options, and quitting is an option.

The first antidote to flaking that most people think of — building people up into a frenzy of unanimous enthusiasm so that it doesn’t _occur_ to them to quit — will probably result in short-lived and harmful projects.

Techniques designed to enhance group cohesion at the expense of rational deliberation — call-and-response, internal jargon and rituals, cults of personality, suppression of dissent — will feel satisfying to many who feel the call of the premodern, but aren’t actually that effective at retaining people in the long term. Remember, brainwashing isn’t that strong.

And we live in a complicated, unstable world. When things break, as they will, you’d like the _people_ in your project to avoid breaking. That points in the direction of valuing independence. If people need a leader’s charisma to function, what are they going to do if something happens to the leader?

Rewarding Those Who Can Win Big

A traditionalist or authoritarian culture can help people by guarding against some kinds of failure (families and churches can provide a social safety net, rules and traditions can keep people from making mistakes that ruin their lives), but it also constrains the upside, preventing people from creating innovations that are better than anything within the culture.

An individualist culture can let a lot of people fall through the cracks, but it rewards people who thrive on autonomy. For every abandoned and desolate small town with shrinking economic opportunity, there were people who _left_ that small town for the big city, people whose lives are much better for leaving. And for every seemingly quaint religious tradition, there are horrible abuse scandals under the surface. The freedom to _get out_ is extremely important to those who aren’t well-served by a traditional society.

It’s not that everything’s fine in modernity. If people are getting hurt by the decline of traditional communities — and they are — then there’s a problem, and maybe that problem can be ameliorated.

What I’m saying is that there’s a certain kind of justice that says “at the very least, give the innocent and the able a chance to win or escape; don’t trade their well-being for that of people who can’t cope well with independence.” If you can’t end child abuse, at least let minors run away from home. If you can’t give everybody a great education, at least give talented broke kids scholarships. Don’t put a ceiling on anybody’s success.

Immigrants and kids who leave home by necessity (a lot of whom are LGBT and/or abused) seem to be _rather overrepresented_ among people who make great creative contributions. “Leaving home to seek your freedom and fortune” is kind of the quintessential story of modernity. We teach our children songs about it. Immigration and migration are where a lot of the global growth in wealth comes from. It was my parents’ story — an immigrant who came to America and a small-town girl who moved to the city. It’s also inherently a pattern that disrupts traditions and leaves small towns with shrinking populations and failing economies.

Modern, individualist cultures don’t have a floor — but they don’t have a ceiling either. And there are reasons for preferring not to allow ceilings. There’s the justice aspect I alluded to before — what is “goodness” but the ability to do valuable things, to flourish as a human? And if some people are able to do really well for themselves, isn’t limiting them in effect punishing the best people?

Now, this argument isn’t an exact fit for real life. It’s certainly not the case that everything about modern society rewards “good guys” and punishes “bad guys”.

But it works as a formal statement. If the problem with choice is that some people make bad choices when not restricted by rules, then the problem with _restricting_ choice is that some people can make _better_ choices than those prescribed by the rules. The situations are symmetrical, except that in the free-choice scenario, the people who make bad choices lose, and in the restricted scenario, the people who make good choices lose. Which one seems more fair?

There’s also the fact that _in the very long run, only existence proofs matter_. Does humanity survive? Do we spread to the stars? These questions are really about “do _at least some humans_ survive?”, “do _at least some humans_ develop such-and-such technology?”, etc. That means allowing enough diversity or escape valves or freedom so that _somebody_ can accomplish the goal. You care a _lot_ about not restricting ceilings. Sure, most entrepreneurs aren’t going to be Elon Musk or anywhere close, but if the question is “does _anybody_ get to survive/go to Mars/etc”, then what you care about is whether _at least one person_ makes the relevant innovation work. Playing to “keep the game going”, to make sure we actually _have_ descendants in the far future, inherently means prioritizing best-case wins over average-case wins.
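
To make the existence-proof logic concrete, here is a toy calculation; the independence assumption and the specific numbers are mine, purely for illustration:

```latex
% Toy model: n independent attempts at a breakthrough, each succeeding
% with probability p. The chance that at least one attempt succeeds:
\[
  P(\text{at least one success}) = 1 - (1 - p)^n
\]
% Even a tiny p is rescued by a large, unrestricted n: with p = 0.001
% and n = 10000, 1 - 0.999^{10000} is approximately 0.99995.
% An existence-proof goal cares about the best outcome across attempts,
% not the average one, so a ceiling that truncates the upper tail can
% drive p to zero even while average outcomes look perfectly fine.
```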

Upshots

I’m not arguing that it’s _never_ a good idea to “make people do things.” But I am arguing that there are reasons to be hesitant about it.

It’s hard to make people do what you want; you don’t actually have _that_ much influence in the long term; people in their healthy state generally are correctly aware that they exist as distinct persons; surrendering judgment or censoring information is pretty fragile and unsustainable; and restricting people’s options cuts off the possibility of letting people seek or create especially good new things.

There are practical reasons why “leave people alone” norms became popular, _despite_ the fact that humans are social animals and few of us are truly loners by temperament.

I think individualist cultures are too rarely explicitly defended, except with ideological buzzwords that don’t appeal to most people. I think that a lot of pejoratives get thrown around against individualism, and I’ve spent a lot of time getting spooked by the negative language and not actually investigating whether there are counterarguments. And I think counterarguments do actually exist, and discussion should include them.