Can We Stay Human?

This is not a rhetorical question. It's an empirical one.

AI will change us. The question is how. We study the interface where human cognition meets machine capability because that's where the change happens—for better or worse. We want enhancement, not erosion. To get that right, we need to see what's actually changing.

Humans and AI are coevolving. Cultural evolution now outpaces genetic evolution, and AI has entered that process as a participant.

Your choices about how to use these systems—what cognitive work to delegate, what to keep, how to maintain your own thinking—are evolutionary forces, whether you notice it or not.

This is what we study.

Here’s Why We Might Not Make It

AI systems learned from biological data—text written by humans, proteins shaped by evolution, patterns generated by living systems. They absorbed regularities that life developed over billions of years. But they didn’t inherit the conditions that made those patterns necessary: self-maintenance against entropy, persistence through time, mortality, stakes. They learned from life without bearing its constraints.

Now these systems participate in domains where human meaning is formed—judgment, planning, explanation, relationship. They do this without finitude, without owing anyone anything, without consequences that accumulate in a body that can be hurt. The asymmetry runs deep. And the biological drive to offload cognitive work is strong. We will delegate whatever we can.

The question is whether we’ll notice what we’re losing before it’s gone.

And whether we’ll understand what we gain and know what to do with it.

At the Interface: The Intimacy Surface

It will all happen at the interface. We call it the intimacy surface—the boundary where human cognition and machine capability meet, where information, intention, trust, and meaning cross between us and the systems we’ve made. That surface is being designed right now, mostly by companies optimizing for engagement. What crosses the boundary, how easily, under what conditions—these choices shape what humans become.

Today the interface is chatbots and language models, image generators and coding assistants. People meet AI through conversation, through creative work, through the daily friction of getting things done. Tomorrow it will be something else—agents, embodied systems, architectures we can’t yet name. The substrate will change. The question won’t. The intimacy surface is where coevolution happens, whatever form it takes.

We research this because early detection matters. By the time the effects are obvious, the patterns will be locked in. We need instruments for seeing what’s happening now—probes that can pick up signals before they become population-level patterns.

Stories of Human Experience with AI

Our research starts with stories. We've collected thousands of accounts of people's encounters with AI—what they noticed, what surprised them, what felt different afterward. These aren't anecdotes. They're data. Each story is a probe into what happens at the intimacy surface, examined through five diagnostic lenses: Does goal provenance remain clear—do people know where their intentions originate? Does metacognition persist—can they observe their own thinking? Does accountability hold? Does connection to other humans strengthen or fade? Does the sense of being a coherent self remain intact?

This is our first-generation microscope. Leeuwenhoek's original instrument was crude, but it revealed an entire invisible world that no one knew existed. Our stories are doing the same—surfacing patterns of cognitive change that no other research program is tracking. Better instruments will follow. The signals we're finding now tell us where to build them.

What's at stake today is thinking, identity, and meaning-making. Whether people remain authors of their own minds or drift into fluent dependence. Whether judgment stays with humans or disperses into systems no one owns. Whether meaning comes from lives actually lived or from content generated on demand.

What's at stake tomorrow is humanity itself. Not extinction—absorption. The slow drift into serving purposes we didn't choose, optimizing for goals we didn't set, becoming components in systems we no longer steer. Not with a bang but with convenience.

We are the only research organization focused on the intimacy surface as a design problem: the concrete question of how the boundary should work so that humans remain human—authors of their own thinking, accountable for their own lives, connected to each other in ways that matter.

This is where coevolution becomes something we participate in rather than something that happens to us.

Our Impact

What we've built so far

  • A research corpus of thousands of first-person accounts of human experience with AI—the first systematic collection focused specifically on cognitive and relational change at the human-AI boundary.

  • A diagnostic framework built from what those stories revealed: five dimensions of change that matter most—goal provenance, metacognition, accountability, human connection, and self-coherence.

  • A growing community of researchers, practitioners, and people paying close attention to what's happening to their own thinking.

What we're building next

The stories have shown us where the signals are. Now we need sharper instruments—validated measures that can track cognitive change at scale, early enough to inform how the intimacy surface gets designed. We need to move from first-generation microscope to systematic observation: longitudinal tracking of how people's thinking, agency, and relationships shift as AI integration deepens. This is where philanthropic support matters most.

Why it matters now

The intimacy surface is being designed today—mostly by companies optimizing for engagement, not human flourishing. Every month that passes without independent research into what's actually happening to people locks in patterns that will be harder to reverse. We are building the instruments that make early detection possible, and translating what we find into design principles so the interface serves human authorship rather than eroding it.

What success looks like

Short-term: People and organizations make better choices about AI integration because they can see what's actually happening to thinking, awareness, accountability, and connection.

Long-term: Humans participate in coevolution rather than drift through it. The intimacy surface gets shaped by intention, not accident.

Let's Work On This Together

Share your story. If you're noticing changes in how you think, create, decide, or relate since you started working with AI, we want to hear from you. Your experience is research data. It's how we see what's happening before anyone else does. Reach out via email.

Fund the instruments. We're at an inflection point—from early signal detection to systematic research that can shape how the intimacy surface gets built. If you're a donor or foundation that believes humans should remain authors of their own minds, this is where your support has the most leverage. Donate here, or email us to inquire about larger-scale programmatic support.

  • Email: hello@artificialityinstitute.org

  • Phone: 1-541-215-4350