How AffectSync Changed the Session Experience

By the LiveThru Engineering Team · December 8, 2034 · 10 min read

When we launched LiveThru in 2033, our Sensorimotor Relay Architecture handled the core transfer beautifully: motor control, proprioception, and the five classical senses. Controllers could walk, talk, taste, and touch through their Hosts. But something was missing. Controllers described sessions as vivid and immersive but emotionally flat — like watching the world through perfect glass. With the release of AffectSync 2.0 in October 2034, that glass disappeared.

The Emotional Gap

To understand what AffectSync solves, you need to understand what was missing. The original SRA relay captured sensory data with extraordinary fidelity — a Controller tasting espresso through a Host in Naples could identify the roast profile, the crema texture, the temperature. But they didn't feel the pleasure of it. They could perceive the taste without the emotional response that gives taste its meaning.

This made sense architecturally. The SRA was designed to relay motor and sensory pathways — the channels that carry information between the body and the brain's sensory cortices. Emotional experience, however, doesn't live in those pathways. It lives in the limbic system, the insula, the anterior cingulate cortex. These are deeper structures, and relaying from them was a problem our founding team initially considered too risky to attempt.

Dr. Leon Voss's position was clear: we would not relay cognitive content. Thoughts, memories, personality — these would never cross the bridge. That principle hasn't changed. But emotional valence — the feeling associated with a sensation, not the narrative around it — that turned out to be separable. And relaying it turned out to be transformative.

How AffectSync Works

AffectSync monitors the Host body's autonomic emotional markers — micro-fluctuations in skin conductance, heart rate variability, pupil dilation, and a set of 14 neural oscillation patterns we identified in the insular cortex. These markers don't carry semantic content. They don't tell you what someone is thinking. They tell you how the body feels about what's happening to it.
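To make the marker set concrete, here is a simplified sketch of what one telemetry frame might look like. The field names, units, and validation logic are illustrative for this post only, not the production schema:

```python
from dataclasses import dataclass, field
from typing import List

N_OSCILLATION_BANDS = 14  # the insular-cortex oscillation patterns described above

@dataclass
class AffectFrame:
    """One sample of a Host body's autonomic emotional markers.

    Field names and units are illustrative; the production telemetry
    schema is internal to the relay.
    """
    skin_conductance: float  # microsiemens
    hrv_rmssd: float         # heart-rate variability (RMSSD), milliseconds
    pupil_dilation: float    # millimetres
    insula_bands: List[float] = field(
        default_factory=lambda: [0.0] * N_OSCILLATION_BANDS
    )

    def is_valid(self) -> bool:
        # Frames with the wrong band count never reach the relay bridge.
        return len(self.insula_bands) == N_OSCILLATION_BANDS
```

Note what is absent from a frame like this: there is no text, no imagery, no semantic payload of any kind. The channel physically cannot carry a thought.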

When a Controller walks a Host body through a flower market in Bangkok, AffectSync captures the body's low-level affective responses — the micro-pleasure of a sweet fragrance, the alertness triggered by a sudden sound, the calm induced by rhythmic walking — and translates them into corresponding autonomic signals in the Controller's resting body. The Controller doesn't receive the Host's emotions. They receive the body's reflexive emotional tone, re-mapped onto their own emotional architecture.

The distinction is critical. AffectSync doesn't make you feel what the Host would feel; the Host is in a rest state and isn't feeling anything. Instead, it translates the body's hardware-level responses into your own emotional vocabulary. When the body encounters something pleasant, you feel pleasure. When it encounters something startling, you feel a spike of alertness. The system is pre-cognitive: it operates below the level of thought.
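In rough pseudocode terms, the re-mapping step looks something like this. The function and parameter names are simplified inventions for this post, not the production relay API:

```python
def remap_affect(host_valence: float, host_arousal: float,
                 controller_gain: dict) -> dict:
    """Re-map the body's reflexive affective tone onto a Controller.

    host_valence is in [-1, 1], host_arousal in [0, 1]; controller_gain
    holds per-Controller calibration factors learned during onboarding.
    Names are illustrative, not the production relay API.
    """
    return {
        # Clamp so a strong calibration gain can never push the signal
        # outside the Controller's own emotional range.
        "valence": max(-1.0, min(1.0, host_valence * controller_gain["valence"])),
        "arousal": max(0.0, min(1.0, host_arousal * controller_gain["arousal"])),
    }
```

The key design point is that the body's signal is never passed through raw: it is always scaled into the Controller's own calibrated range, which is what "re-mapped onto their own emotional architecture" means in practice.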

The Safety Question

We knew the first question would be about safety. If you're relaying emotional signals, could a Controller experience distress? Could the system relay fear, pain, or trauma?

The answer is no, because of how AffectSync is architecturally constrained. The system operates with hard-coded valence boundaries. Negative-affect signals above a calibrated threshold are attenuated to a neutral baseline before reaching the Controller. If the Host body stubs its toe, the Controller feels a dull awareness of impact, not pain. If a loud noise occurs, they feel mild alertness, not a startle response. The system filters for safety the same way the motor guardrails filter for consent — at the hardware level, before the signal reaches the relay bridge.
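In simplified form, the valence boundary behaves like this. The threshold value here is illustrative; production thresholds are calibrated per Controller:

```python
# Illustrative threshold; production values are calibrated per Controller.
NEGATIVE_ATTENUATION_THRESHOLD = -0.3

def filter_valence(valence: float,
                   threshold: float = NEGATIVE_ATTENUATION_THRESHOLD) -> float:
    """Hard valence boundary applied before the relay bridge.

    Mildly negative signals pass through (the dull awareness of a
    stubbed toe); anything more negative than the threshold is
    attenuated to the neutral baseline, so pain, fear, and trauma
    never reach the Controller.
    """
    if valence < threshold:
        return 0.0  # neutral baseline
    return valence
```

Because the boundary sits before the relay bridge rather than after it, a strongly negative signal is flattened at the hardware level and simply never exists on the Controller's side.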

We tested this extensively: over 12,000 controlled sessions during the beta period, each with continuous emotional-state monitoring of the Controller. Not a single Controller reported emotional distress attributable to AffectSync. The most common report was the opposite: Controllers said the system made sessions feel warmer, more present, more real.

What Controllers Say

The response to the AffectSync 2.0 rollout has been the most positive in LiveThru's history. Session satisfaction scores jumped from 87% to 94%. Average session length increased by 22%. Rebooking rates went up by 31%.

But the numbers don't capture it as well as the words do. Here's a selection from our post-session surveys:

"Before AffectSync, tasting food through a Host was like reading about flavor. After, it was like actually eating. The pleasure is there. The satisfaction is there. I finished a meal in Rome and felt genuinely full and happy." — Controller, Session #88,412
"I did a sunset hike in Iceland last week and I felt awe. Not the idea of awe. Actual awe. My chest got tight and my eyes watered and I stood there on a ridge in a body that wasn't mine and felt what it means to see something beautiful. That didn't happen before." — Controller, Session #91,207

What's Next

AffectSync 2.0 is a foundation. Our research team is already working on refinements to the valence-mapping algorithm that will make the emotional layer even more naturalistic. We're also exploring what we internally call "ambient mood transfer" — the ability to relay not just stimulus-specific emotional responses but the general affective tone of a place. The calm of a Japanese garden. The energy of a street market. The quiet melancholy of a rainy afternoon.

These are hard problems. They require extraordinary precision and even more extraordinary caution. But the path forward is clear: LiveThru sessions should feel not just sensorially real, but emotionally real. With AffectSync, we're most of the way there.

This article was authored by the LiveThru Sensory Engineering team, led by Dr. Tomás Reyes, VP of Neural Architecture.