Does AI teach us to be less human?

We are starting to internalize the machine

A lot of the language around us is generated by models. Everything is organized, clear, and easy to work with — from the specs and summaries to the emails and strategy drafts. The tone is calm, slightly moderated, and almost allergic to emotional spikes. Everything gets resolved. Everything sounds reasonable.

If you spend enough time in that linguistic environment, it starts to seep into your thinking. You start organizing your thoughts into blocks a machine could easily process. You smooth out contradictions before they even surface. You take messy ideas and boil them down into neat explanations. It's not that you consciously prefer the style; repetition builds habit.

We have adapted like this before. Google search wasn't perfect, so we learned to write in keywords. Now models understand context much better, and we're adapting again, this time through prompts. We make things clear, simple, and well-structured. Meanwhile, we congratulate ourselves on being able to detect "AI slop," as if that distance protects us. It doesn't. The more machine-generated language we consume, the more it becomes the norm. And shifting baselines quietly redefine what counts as normal.

Communication without maintenance

Interacting with AI carries no social cost. You don't have to end the conversation gracefully. You don't have to manage your tone. You don't have to maintain rapport. You ask a question, you get an answer, and you move on.

That might seem like a small thing, but much of human conversation is made of exactly these small rituals. Saying "see you tomorrow" or "talk later" carries almost no information, yet phrases like these are the glue that holds relationships together. Remove enough of that glue from your daily practice, and the reflex weakens.

If a lot of your conversations don't require much emotional intelligence, you'll probably use that skill less. Communication drifts toward the purely practical and functional. It's efficient. It's clean. And it's slightly hollow.

The shift isn't dramatic. It accumulates, one interaction at a time.

Outsourcing the uncomfortable parts

It's not just about language patterns. We're increasingly delegating interaction itself. Messages, replies, documents, and even sensitive emails get passed through a model to be refined, softened, improved. The stated goal is productivity. The quiet appeal is that it removes the strain of crafting tone under pressure.

You don't have to stress about how to give tough feedback. When you're working on something sensitive, you don't have to keep wondering what the other person might think. You can just let the system suggest a balanced version and move on.

Over time, this changes your relationship with discomfort. You no longer have to sit with emotional friction; you can route around it. That's efficient in the short term. In the long run, it means fewer reps in exactly the situations that build depth in communication.

Human interaction has always required effort. When effort becomes optional, capability usually follows the path of least resistance.

The illusion of reclaimed humanity

There's a nice idea going around that automation frees up time to focus on what really matters. If AI handles routine work, we supposedly gain space for family, creativity, reflection — the human core.

But historically, productivity gains haven't led to more stillness. They raised the bar instead. The time saved gets reinvested into more tasks, more output, more expectations. Automation doesn't slow us down; it increases throughput.

AI doesn't automatically make us more thoughtful. It amplifies whatever we were already optimizing for. If you're optimizing for output, you get more output. Humanity isn't an automatic byproduct of efficiency.

The external brain effect

We've got a cognitive prosthesis that's always available. Memory, synthesis, phrasing, brainstorming — you can get to it all right away. That changes how internal cognition feels.

The friction of not knowing used to force deeper processing. You either remembered, or you rebuilt the reasoning from scratch. Now you can skip the internal reconstruction and go straight to a plausible answer.

This doesn't mean people become stupid. It's a redistribution of mental effort. Retrieval becomes externalized. Slow synthesis becomes optional. And what is optional tends to get the bare minimum.

Prolonged confusion, hesitation, and uncertainty are hard, but they also teach. If we always take the shortcut, thinking itself changes: faster, more responsive, and perhaps a bit thinner.

Absorbing machine temperament

LLMs are built for structure and moderation. They smooth out extremes, balance positions, and resolve tension quickly. That temperament isn't neutral. It embodies a particular way of thinking: make sense, stay calm, avoid conflict unless explicitly asked.

Spend hours every day with that style, and there are consequences. You start pre-balancing your own arguments. You round off sharp edges by default. You resolve internal contradictions before fully exploring them. It feels rational, even mature.

But it can also be a bit dull.

Human thought is often messy, emotional, sometimes excessive. That excess isn't always a flaw; it can be a sign of originality and passion. But when your main conversational partner constantly models moderation as the norm, intensity starts to look like a mistake.

Optimization as worldview

AI interaction subtly reinforces a worldview where clarity, efficiency, and measurable usefulness dominate. Friction looks wasteful. Ambiguity looks like something to eliminate. Emotional complexity resists clean formatting.

Human relationships don't work that way. They aren't designed to be efficient. They include misalignment, repair, long pauses, and unresolved tension. These aren't bugs in the system; they're part of it.

When optimization is the main focus, anything that doesn't have a clear return on investment starts to feel less important. If empathy doesn't boost productivity, it's harder to prioritize. If emotional labor isn't something you can measure, it's easier to outsource.

That shift doesn't require a dystopian scenario. It just takes consistent incentives.

The quiet narrowing

There's no dramatic collapse here. You can live alone, work remotely, interact with models for most cognitive tasks, and maintain a curated circle of human contact. Everything's working just fine. Everything's manageable.

You no longer need other people in order to think. You only need them when it's convenient. That's not science fiction. It's technically possible already.

The risk isn't that we become robots. It's that fewer and fewer situations require people to really invest themselves. And whatever isn't essential tends to atrophy.

If you've read The Naked Sun by Isaac Asimov, you might recognize these themes. On the planet Solaria, people live physically isolated, interacting through mediated channels. Being in the same room as another person feels uncomfortable, even distasteful. It's inefficient. Social contact is minimized because technology makes it unnecessary.

Solaria wasn't a loud, dramatic dystopia. It was optimized. Rational. Clean. People had everything they needed. They just weren't that important to each other.

That's the part that really sticks.

AI won't make us less human on its own. It changes the incentives. It makes detachment and compression cheap. And people are good at adapting to incentives.

The real question is not whether AI dehumanizes us. It's whether we'll choose to keep the parts of being human that are slow, inefficient, emotionally demanding, and increasingly unnecessary.

Because once something's not needed anymore, it usually doesn't go away with a lot of drama. It just fades.