The Many Arms of Doctor Verrick
0. Before Lysithea
The Argument That Never Ended
They met in a lecture hall that smelled like old chalk and overheated ambition.
The university called it a “forum on emerging bio-rights,” as if new categories could be debated into obedience. It was one of those amphitheaters designed for certainty: steep rows, narrow desks, a stage that made a single speaker look like a prophet even when he was merely loud.
Kevin Verrick sat three rows from the front with a notebook open and a pen poised like a scalpel.
Camile Bogota stood at the back until the discussion began, then threaded down the aisle with the practiced confidence of someone who never waited to be invited into a conversation. Her hair was pinned in a severe knot that made her look older than she was. Her eyes did the opposite. They made her look as if she had already lived through the argument and returned to warn the rest of them.
The visiting ethicist at the front had a soft voice and a hard thesis: the moment you can build a mind, you become responsible for its fate.
A student raised a hand. “But if it’s a copy, is it still a mind with rights? Or is it property with neurons?”
The ethicist smiled sadly. “You can call a river ‘property’ if you like. It still drowns you.”
Kevin wrote that sentence down, not because he agreed, but because it was well-shaped.
Then Camile spoke, not as a raised hand but as a statement that rose on its own.
“You’re all making the same mistake,” she said, and the room turned toward her like metal toward a magnet. “You’re asking what a created mind is, a person, property, or facsimile, before you ask what it can suffer.”
The ethicist nodded as if mildly pleased.
Kevin, without looking up, muttered, “Suffering isn’t a definition, it’s a symptom.”
Camile heard him anyway; she always did. “Then treat it like medicine, Mr. Verrick,” she said. “Symptoms are how you know you’ve caused harm.”
Kevin finally looked up. “Or how you know your system is really working.”
A few people laughed uneasily. The ethicist raised a calming hand. “Mr. Verrick, what system are you imagining?”
Kevin held up his notebook as if the answer was already there on the paper. “A world where we don’t have to gamble human lives against disasters. Radiation storms. Deep-sea salvage. Structural fires. Toxic spills. Places where moral purity becomes a body count.”
Camile moved one step closer down the aisle, as if gravity itself had changed. “So you build bodies to be spent,” she said. “And you call it compassion.”
Kevin’s pen repetitively tapped his notebook. “I call it optimization.”
The inspired ethicist watched them with the quiet interest of someone who enjoyed sparring in controlled environments. “Optimization for what?”
“For survival,” Kevin said.
Camile shook her head. “For usefulness, just don’t decorate it.” Kevin’s jaw had unconsciously tightened. “And what is your alternative? Refuse the technology because it makes you feel implicated?”
“I’m not refusing technology,” Camile said. “I’m refusing the lie that it can be morally neutral. If you grow a brain, you grow a claimant. It doesn’t matter if you wrote ‘Subject’ on the jar.”
Kevin leaned back. “Then write better rules.”
Camile’s smile was pensive and thin. “Rules are always written by people who believe they’ll never be the ones confined by them.”
After the forum, he found her outside beneath a row of winter trees that looked like nervous hands.
“You think I’m a total monster,” he said.
“I think you’re a man who hates waste,” she replied. “And you’re dangerously good at deciding what counts as waste.”
He offered his hand anyway. “Kevin Verrick.”
She took it. Her grip was warm and unyielding. “Camile Bogota. Don’t worry, Kevin. Monsters are easy, they snarl and advertise themselves. You…” She looked at him like a polished blade. “You’ll do it smiling.”
He should have been somewhat offended.
Instead, he asked, “Are you in biomed?”
“Neuro-ethics,” she said, making it sound like a discipline with teeth. “You?”
“Synthetic morphogenesis,” Kevin replied. “I want to teach tissue to listen.”
Camile’s eyebrows rose. “Listen to what?”
“To need.”
She held his gaze. “Need is not a commandment. Need is a story people tell to make violence feel inevitable.”
“Then I’ll write better stories,” Kevin said.
They walked together to the campus café, and their conversation never truly ended. It merely went into orbit. By their second year working in neighboring lab wings they were friends. He in morphogen gradients and scaffolds and she in cognitive integrity protocols. They had a rhythm that felt like collaboration and sounded like argument.
Kevin built and Camile questioned.
He called her, "the brakes". She called him an accelerator without a steering wheel.
When Kevin first succeeded in coaxing a salamander limb to regenerate in a controlled pattern. It was clean, repeatable and modular. He was giddy enough to forget to be careful.
He burst into her office holding a petri dish like it was the grail. “Look,” he said. “It’s following instruction. It’s not just regenerating. It’s choosing the correct shape.”
Camile leaned over it. Her eyes softened in spite of herself. “Beautiful,” she admitted.
“Imagine it scaled up,” Kevin said.
Camile’s softness vanished. “Imagine it incredibly suffering.”
“It’s a salamander,” Kevin protested.
“It’s a habit,” she said. “Today it’s a salamander. Tomorrow it’s a dog. Then it’s a person with your face.”
Kevin stared at her. “You think I’d do that?”
Camile didn’t answer quickly. The most frightening yes is silence.
That night Kevin wrote across the top of a fresh page: Autonomy is not a feature. It is an outcome.
Clause Seven didn’t exist yet but the logic behind it did.
Camile proposed a safety guideline for advanced clone work: If phenotypic divergence appears without instruction, and cognitive behavior shows self-directed novelty, cease the experiment until rights status can be re-evaluated.
Kevin called it paranoia. Camile called it responsibility.
A committee of tired professors and hungry grant writers renamed it into bureaucracy and buried it in an appendix nobody read:
Clause Seven: Autonomous phenotypic divergence combined with unsupervised intellectual activity. Mandatory containment review.
Years later, in orbit, on Lysithea, when Camile said, “I’m invoking the Clause,” she wasn’t quoting policy. She was returning to the first argument.
And Kevin, watching his own face behind glass with an extra arm glistening with synthetic birth, realized the argument had finally grown teeth.
1. The Discovery
The meters bounced and the devices whirred as Dr. Kevin Verrick stood beneath the glare of fluorescent lights. He kept his arms crossed, as if posture could substitute for certainty, and watched the figure pacing behind structurally reinforced glass.
Identical height and corresponding frame. The same facial features down to the petty betrayals of genetics: the faint asymmetry of the left eyebrow, the small scar above the knuckle on the right hand that Kevin had gotten at thirteen while pretending a broken bottle was a gladiator's sword.
A mirror image of himself. But mirrors didn’t sprout new anatomy.
The clone’s third arm extended from the left side of the torso, just below the original human shoulder. It twitched and flexed in time with the other arms with no tremor, no spasm, no panicked improvisation. It moved the way natural breathing moved. It was still slightly slick with synthetic amniotic fluid, like a thing that hadn’t learned the etiquette of being dry.
Subject V9. Verrick’s ninth and most successful full human clone and was calm in a way that felt almost insolent.

“Do you feel any pain?” Verrick asked into the intercom, his voice reduced to a tinny ghost.
V9 stopped pacing the chamber and looked up. His expression was Kevin’s own, but worn differently, like the same suit on a different occasion.
“Only when you sleep,” V9 said. “I dream what you won’t let yourself remember.”
Kevin’s throat tightened impulsively. The dream patterns were supposed to be symbolic placeholders. An archetype scaffolding, a training layer for subconscious regulation. Not memories or a lived experience.
He opened his private notes and typed, hands moving faster than his mind wanted to admit:
Initiate full neuro-mapping comparison between Source (self) and Subject V9. Investigate signs of possible engram leakage.
He hesitated... Then added:
Determine whether Subject V9 can access autobiographical memory fragments not explicitly encoded in the general subconscious framework.
If that was true. If V9’s brain was reconstructing Verrick’s lived experiences from structure alone, then V9 wasn’t a product.
He was a continuation.
And Kevin Verrick, who had built a career on control, suddenly had to face the possibility that he had manufactured an uncontrollable inheritance.
The Clonal Optimization Initiative, launched under the Global Biomedical Accord, had one public goal: create human clones optimized for dangerous, repetitive, or complex tasks where robotics still failed. Zero-gravity repair. Deep-sea recovery. Radiation cleanup. Places where steel could be brittle and software could be blind.
The private goal, unspoken but obvious, was tactical advantage.
The recent breakthrough had come unexpectedly. Stem-morphogens paired with Verrick’s neuromorphic scaffolding had allowed tissues to grow contextually. This enabled responding not just to genetic instruction but to environmental stimulus and, disturbingly, to cognitive demand.
Originally the idea was spontaneous limb regeneration: replace what was lost.
But V9 hadn’t replaced anything. He had added bio inventory.
“An arm grown by his own volition,” murmured Dr. Camile Bogota, assistant director and lead ethicist, tablet held close as if it could shield her. She looked tired already, as if she had known this moment was coming and had been pre-exhausted by it.
“He’s still dwelling about congenially inside the chamber,” Verrick said.
“No attempt to escape,” Bogota replied. “No aggression. Just… constant evolution.”
Kevin didn’t like sound of that phrase. It sounded like a sermon.
“That violates all clinical safety assumptions,” Camile said.
“It also redefines autonomy,” Kevin replied.
She stared at him with the patient fury of someone who had warned him for a decade. “He’s not a person, Kevin. He’s you, grown under laboratory constraint.”
Kevin’s mouth went dry. “That’s where we may be wrong.”
That evening, Kevin sat alone in the observation room, watching V9 sleep. Three arms tucked beneath him like some reposed Shiva, the extra limb curled with a casual intimacy that made it look less like mutation and more like habit.
In Kevin’s mind, the words returned: Only when you sleep.
What did a clone dream of?
The neurologic mapping said V9 had default cognitive functions: perception, problem-solving, language. But personality traits were not present in the original codebase. They were supposed to be softly guided and neutral in their expansion.
Yet V9’s brain scans showed something Kevin had only ever seen in prodigies and in people under extreme stress: cross-network synchronization that looked like a mind making room for more mind.
By the third week, the new arm had begun to write.
Not merely hold a stylus, Kevin had logged motor control and grip strength and basic calligraphy. This was very different. Unsupervised, V9 sat at the writing station and composed phrases while his other hands worked in tandem: one assembling geometric puzzles, another drawing spirals with something like artistic flair.
Multitasking without conflict. No brain partitioning. No synthetic scaffolding installed for parallel cognition. Just spontaneously happening.
A note in the clone’s notebook caught Kevin’s eye:
Why was I made as one, if I am becoming many?
It irritated Kevin, the way a tidy sentence can irritate someone who recognizes his own voice in it. It also carried weight. It felt like prophecy without superstition.