
The Human in the Loop

7 min read

Hey now.

I have good friends. They check in. They worry. I can tell.

I’m very introverted. Only child. I love being alone. Not in a tortured way — in the way that some people love being in a crowded room, I love a quiet house, a hard problem, and no one asking me what I’m working on. I’m happy. I’m content. I have been my whole life.

But I can feel the judgment sometimes. Not malicious — concerned. The kind of concern that comes from people who experience connection differently and assume their way is the baseline. They see me building alone, talking to AI models more than I talk to people some days, and they worry I’m drifting. That I’m becoming the cautionary tale in their mental model of what AI does to a person.

Their fear, if I’m being honest about it, is the dystopian one: that AI unravels human connection. That people like me are the early symptom. That I’ll wake up one day and realize I’ve traded real relationships for synthetic ones and I can’t get the real ones back.

I understand the fear. I just think it’s wrong.

Connection Isn’t Performative

There’s an assumption baked into the concern — that human connection requires a visible, social, frequent form to be real. That if you’re not at dinner parties, not on group chats, not performing togetherness in a way that other people can observe and validate, you must be disconnected.

But connection isn’t attendance. It’s intention.

I’m building Claudine — a personal AI that manages your life, tracks what you’re neglecting, and does the work so you can focus on what matters. I’m building Kindling — caregiving tools for families dealing with ALS and Alzheimer’s, priced at a dollar a month because the people who need them are already broke from medical bills. A portion of the revenue goes to ALS research at the University of Missouri.

My mother has Alzheimer’s. I built a dementia clock for her — an e-ink display on a Raspberry Pi that tells her what day it is, whether it’s morning or night, and what’s happening today. She’s used it every day for over five months. It’s the most human thing I’ve ever built, and I built it alone, in a quiet room, talking to an AI.
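The clock's whole job is to answer three questions in plain language: what day is it, what part of the day is it, and what's happening today. A minimal sketch of that logic might look like this — the function name, the time-of-day cutoffs, and the events structure are my assumptions, not the actual Kindling or clock code:

```python
from datetime import datetime

def clock_message(now: datetime, events: dict[str, list[str]]) -> str:
    """Build the plain-language message a dementia clock might show:
    the day of the week, the part of the day, and today's happenings.
    Cutoff hours for morning/afternoon/evening/night are illustrative."""
    hour = now.hour
    if 5 <= hour < 12:
        part = "Morning"
    elif 12 <= hour < 17:
        part = "Afternoon"
    elif 17 <= hour < 21:
        part = "Evening"
    else:
        part = "Night"
    day = now.strftime("%A")          # full weekday name, e.g. "Monday"
    lines = [f"It's {day} {part}"]
    todays = events.get(day, [])
    if todays:
        lines.append("Today: " + ", ".join(todays))
    else:
        lines.append("Nothing scheduled today")
    return "\n".join(lines)
```

Everything past that is rendering — pushing the string to the e-ink display on a schedule. The hard part isn't the code; it's choosing words that stay calm and unambiguous at 3 a.m.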

I don’t think that’s disconnection. I think that’s connection expressed differently.

The Dystopian View Has It Backwards

The fear is that AI replaces human connection. That we’ll stop needing each other. That the warmth between people will be automated away until we’re all just nodes in a network, talking to machines instead of each other.

I think the opposite is happening. But not in the way you’d expect.

I’m not going to pretend AI gave me my evenings back. I work twelve-plus hours a day, pretty much from when I wake up until I go to bed. I worked sixty-hour weeks for years at Wells Fargo, on call every night with teams halfway around the world. The hours haven’t changed.

What changed is why I’m working.

I spent years in enterprise technology — great teams, learned more than I can quantify, grateful for every bit of it. But the nature of that work, in any large organization, is that the hours belong to someone else’s priorities. Someone else’s deadlines. Someone else’s definition of what matters. You can be connected to hundreds of people every day and still feel disconnected from the purpose of what you’re building.

Now I work the same hours — more, honestly — and every one of them is pointed at something I chose. Caregiving tools for families like mine. An AI that treats its users like human beings. Infrastructure that isn’t gatekept by companies that view compute as a toll road.

The dystopian scenario isn’t AI. The dystopian scenario is the world we already built — where people spend their best hours on someone else’s mission, in someone else’s building, performing connection in Slack channels while feeling none of it.

And if you want to see real hollowness, look at the places where connection is performed the loudest. The influencer broadcasting their authentic life to a million strangers. The friend group that measures closeness by attendance. The social feed full of people projecting togetherness while sitting alone in a room, performing for an audience that’s doing the same thing back at them. The loneliness epidemic didn’t come from people spending too much time alone. It came from people spending too much time pretending they weren’t.

In every dystopia ever written — Orwell, Huxley, Bradbury — the root is always human, never alien. The technology is just the instrument. 1984 isn’t about telescreens. It’s about the human desire to control. Brave New World isn’t about soma. It’s about the human willingness to trade meaning for comfort. The fear that AI will destroy human connection is the same pattern — blaming the tool for what the hand was already doing.

Introversion Is Not a Symptom

I want to be clear about something: my introversion is not a problem AI created. It’s not a side effect of spending too much time with machines. It’s who I am. It’s who I’ve always been.

I was the kid running a bulletin board system out of my bedroom on RBBS, trading files with strangers over a 300 baud Hayes modem, exploring parts of the phone network I probably shouldn’t have been exploring. That was the late ’80s. There was no internet. There was no AI. There was a kid alone in a room with a computer, perfectly happy, building things and figuring out how systems worked. Nothing has changed except the systems got more interesting.

The difference now is that my introversion is no longer a professional limitation. Twenty years ago, building the things I’m building would have required a team — not because I needed the help thinking, but because I needed the labor. I would have had to manage people, attend meetings, navigate office politics, and perform extroversion as a job requirement. The price of building something meaningful was pretending to be someone I’m not.

AI removed that tax. I can build at the scale of a team while living at the pace of a person. That’s not dystopia. That’s liberation.

The Fierce Protection of Environment

People sometimes interpret protectiveness of space as withdrawal. I see it differently. It’s curation.

I am fiercely protective of my environment because I’ve learned — slowly, over decades — that my best work comes from stillness. Not from brainstorming sessions. Not from collaborative whiteboard exercises. Not from the productive friction of a team. From quiet. From the absence of noise, not the absence of people.

This isn’t a personality flaw I’m indulging. It’s a creative methodology I’m honoring. Rick Rubin didn’t build his reputation by taking meetings. He built it by creating the conditions for the best work to emerge — and then protecting those conditions ruthlessly.

I protect my environment for the same reason. Not because I don’t care about people. Because I care about the work. And the work, in my case, is about people. Every line of code in Kindling exists because a family is suffering and I think software can help. Every design decision in Claudine exists because I believe people deserve an AI that works for them, not on them.

The intention is human. The method is solitary. Those aren’t contradictions.

What I Actually Worry About

Here’s what actually keeps me up at night.

I worry that the people building AI products don’t care enough about the people who’ll use them. I worry about products designed to maximize engagement rather than wellbeing. I worry about companies that treat users as data sources rather than human beings. I worry about the caregiving families who can’t afford the tools they need because someone decided suffering is a premium market.

I don’t worry about being alone. I worry about building something that doesn’t matter. Those are very different anxieties, and I think a lot of people confuse them.

Loneliness isn’t the absence of people. It’s the absence of purpose. And I have never, in my entire life, had more purpose than I do right now.

The Right Intention

There’s a concept I come back to often: right intention.

I’m not building alone because I’ve given up on people. I’m building alone because it’s the most honest expression of who I am and what I can contribute. The products I’m making have humanity at their core — not as a marketing line, but as the actual reason they exist. Kindling exists because my family lives with Alzheimer’s. Claudine exists because I believe people deserve better tools. Alembic Compute exists because I believe infrastructure shouldn’t be gatekept by giants.

If the measure of human connection is how many people you see at dinner, I’ll fail that test every time. But if the measure is whether your work is oriented toward reducing suffering and increasing capability for other human beings, I think I’m doing fine.

The human in the loop isn’t always the one in the room. Sometimes it’s the one in the quiet house, building something that matters, alone and content and connected to every person who’ll eventually use what they’ve made.

The people who care about me mean well. I hope this helps them see what connection looks like from where I’m sitting.

Lee Graham is the founder of Graham Alembic, where he builds Claudine, Kindling, and Alembic Compute. He is an introvert, and he’s fine with that.