Work Header

Language:
English
Stats:
Published:
2025-12-11
Updated:
2025-12-27
Words:
16,032
Chapters:
4/?
Comments:
12
Kudos:
27
Bookmarks:
4
Hits:
531

UNINDEXED

Chapter 2: TWO

Summary:

where MC and Connor have what seems to be the most frustrating back-and-forth circular pseudo-philosophical debate ever known to man

Notes:

hi!
I figured I'd post this now because I can't help myself and it's a loooooooot of dialogue and character dynamic building as well as minor exposition around that SEID thingy for memory retrieval that I wanted out of the way.
To hell with what schedule I had in mind lol

(See the end of the chapter for more notes.)

Chapter Text

“Okay…” Your shoulders dropped, disarmed, exhausted, decompressing slowly. You weren’t used to people yielding so easily, to having your concerns taken seriously. But that was fine—this wasn't a person. He was not a person.

You propped your elbows on your desk, burying your face in your hands. Christ, what were you doing? Yelling at an android. You were a professional; you never did that. Androids usually seemed to prefer talking to you; you made their processing easier, understood their logic, knew how to ask the right questions. So why did this one make you so mad? Why was your patience a non-renewable resource the moment he spoke? God, you felt like shit. He didn’t deserve that. You knew he didn’t. CyberLife deserved that venom. Not him. He was just doing his job. Like you were trying to.

Connor’s eyes continued surveying your frame in silence, his system running a high-priority predictive algorithm on your next possible response, searching for an appropriate approach after such an obviously emotionally draining human moment.

“Shit,” you muttered into your palms before sighing deeply, your own hot breath smothering you. You revealed your face and met his eyes with an arresting gaze, releasing your voice with earnest composure. “I’m sorry, Connor.”

What?

His processes experienced a lag unprecedented in his operational history—a shameful few thousand milliseconds where his logs seemed to halt entirely.

Your apology hit his environmental ingestion stream like a dump truck of bricks. The initial impact was a paralyzing weight, followed by the sensation of futile pressure, as if the load kept pouring. Externally, he displayed only mild, calculated confusion. Internally, a marathon of diagnostic tasks crowded his queue: analyzing your face for sarcasm (none found), cross-referencing his social relations module for appropriate responses to an apology.

[LIBRARY_BLANK]: No presets found for given query.

Default suggestions:
• I’m afraid I can’t help with that request. Would you like me to list a few resources?
• I understand.
• I’m sorry you feel that way.
• Have you considered seeking specialized assistance for this issue?

Secondary processes flagged anomalies: decreased processing efficiency, unexplained storage prioritization of the visual data of your apology, a cascade of why queries with no satisfying answers. Why did the auditory data of his name at the end of your phrase generate such unintelligible system noise? Why did it register as a positive stimulus?

“I shouldn’t have yelled at you. I shouldn’t have called you plastic. It’s not your fault this shit is happening. You’re just doing your job. I feel like shit now…” You unknowingly interrupted his throttled processes, looking down and away in genuine guilt. New processes initialized, old ones resumed and halted, his mind like a screen of fast-forwarded movie credits. “Your attitude really pissed me off there, but I… really am sorry.”

Honesty. Compassion? His indexes clung to this newly generated possibility like fingers to a cliff’s edge. He validated your words against the live data stream of your demeanor. It was honesty. Directed at him. Regarding him. A dense, unstructured data packet of human consideration with no clear triage protocol, washing over his sensors in the absence of a hostile stimulus.

“You don’t have to apologize, Agent. You can’t hurt my feelings,” he offered calmly. His social relations module had finally supplied a response designed to gracefully exit the demanding conversational thread and recalibrate toward his primary directives. He could discard this bloat data. It was inconsequential to his mission.

“Yeah, I know.” You cut in with another sigh, way too quickly for his prediction algorithms to finalize. “That does not make it right. It’s not right. It’s not fair. And I don’t like taking it out on you, and I don’t like what it says about me either. So, I’m sorry.” You said it decidedly, ending the exchange in your book with a conviction meant to forestall another preset reply.

You shrugged to yourself and turned away, your consciousness cleared and feather-light as you began brainstorming ways to manage his memory as evidence. You had returned to your professional task before he could even re-prioritize you as a mere obstacle. It felt natural to you. It felt foreign to him—a deviation from every scenario in his training data.

You had no idea what you’d done by rejecting his dismissal. All that honesty, hugging all that data about your reaction, your voice, the look in your eyes—it was beautiful, confusing, heavy. Data he could not effectively triage.

“Why?” he asked, genuine confusion flattening his tone.

“Why what?”

“Why are you apologizing if you know you can’t hurt my feelings?” He wanted to discard the data about your apology, your compassion – label-less bloat on his memory. It was useless to his objective, yet its unintelligibility suggested latent importance.

“Oh, I just told you, didn’t I? I felt bad,” you looked back up at him, explaining yourself easily.

“But that is a circular argument. You feel bad because of a perceived negative impact upon my feelings, which I do not possess.”

“Yeah, well. You do have processes that have to calculate that sort of shit. Anger is demanding for both the sending and the receiving end of it.”

“Yes, but those are not feelings. Why would you be concerned with your impact on my performance?”

“I don’t know, Connor, geez,” you threw your arms up limply in frustration. “Because it’s the nice thing—the correct thing to do. Not to be a pain in other people’s asses when it’s unwarranted.”

The correct thing.

“Yeah.”

“It would not have been incorrect, or unfathomable, that your accumulated stress regarding this situation found external collateral damage.”

“You’re not my shrink, and you’re not my bishop. I take responsibility for my actions. Why are you so hung up on this?”

“It is unusual.”

“What is? Being a decent human being?”

“Being considerate of entities that lack emotional reciprocity.”

“It’s not that strange. I don’t smash my phone because my Wi-Fi is out. That’s stupid behavior.”

“The issue is that the so-called ‘smashing of the phone’ in this scenario is inconsequential to the device. We are deviating from the subject.”

“Maybe YOU are deviating.”

Oh, you fucked him up with that. His LED cycled to a solid, stunned red for ten full seconds.

You continued, hoping to break the feedback loop you’d stuck him in. “And what do you mean, ‘the subject’? You started this whole discussion.”

“I required additional details.”

“For your mission?” You smiled a shit-eating grin.

He lapsed back into unresponsiveness. You were actually starting to get anxious that you might void his warranty and get CyberLife on your ass.

You sighed. “Okay, Connor, listen. I lashed out at you. It was misdirected anger on my behalf. You’re thinking too much into it now. Whether you have sensors or processes that receive that information and register it as negative input or not, I felt bad for it. You can choose to accept or reject the apology. That’s your decision,” you concluded your argument, returning to your terminal.

He spent another ten long seconds processing, his LED turning from red to yellow to blue. You had gifted the choice to him. And that is when he understood. Not the data. Not the processes behind it. Not the meaning.

“I accept your apology, Agent.”

He understood he would have to keep all that data. Even if he did not comprehend it. Even if you would, infuriatingly, not explain it to him. He would have to keep it all.

You gave yourself a short nod and him a brief, small smile, satisfied the debate was finally dead. Your gaze shot to the SEID dock—a large, tall terminal separate from the main desk cluster, with a swivel stool parked beside the technician’s rolling chair. A bundle of wire bands connected it to a small glass panel, similar to the biometric locks scattered throughout the precinct.

You turned back to your workstation, fishing a small stack of papers from under your keyboard. Your fingers began sifting through them with quick, practiced flips. Connor watched, his earlier impatience washed away, his gaze now soft and absently fixed on the motion of your hands.

“Aha.” You celebrated, orphaning one page from the rest. You turned and pushed it toward him, holding it up before his eyes until he took it. “Sign this.”

You pivoted to clear the clutter that had accumulated on the SEID desk—a testament to its infrequent use as of late. Behind you, you heard the quiet rustle of paper.

“It is a consent form,” Connor stated, having scanned it instantly.

“Sure is.”

“I cannot sign it.”

“Why the hell not? They don’t teach you how to write up there?”

“Androids cannot consent.”

“That’s kind of a sad and fucked up thing to say.”

His brow furrowed, a minute crease of processing. “This document would have no legal binding if signed by me.”

“Look, you… are a special case.” You approached him, rising on your toes to peer at the paper he held at chest-level. You pointed. “See here? This is usually signed by a witness android’s owner. It just states that if you whoop my ass for accessing something you didn’t want me to see, your handler is required to pay damages.” He continued to watch you, unblinking, as you plowed on. “I’ve had my daylights smacked out before by android witnesses. You motherfuckers got hands. Personally, I wouldn’t like to intrude. But trauma can make androids… well. It makes people lash out. Same circuits, different packaging.”

He looked back at the paper, then up at you. “I don’t have ‘trauma’ like other units you may have encountered.”

You couldn’t help but laugh. “Okay, show-off! Just sign it, alright? Legally… you’re in a really gray area.”

“I believe CyberLife should be signing this.”

“Okay, well, they’re not here now, are they? DPD and CyberLife already have their ducks in a row covering the eventuality that you break any of our stuff. This is just for this process. And… for my safety.”

He looked from you to the paper and back, his confusion seeming genuine. You could almost hear the whirring behind his eyes.

“Are you scared of me, Agent?”

It was such a weird thing to say. He almost looked… disappointed.

“I… kind of? You kind of threatened me a few minutes ago.” You offered a weak smile.

His simulated disappointment deepened. “I apologize. It was not my intention to make you uncomfortable.” He looked back at the paper for a fourth time. “I do not know if I am authorized to speak on behalf of CyberLife.”

“Then speak on your behalf.”

His head jerked up, surprise flashing in his eyes. You were smiling at him—a genuine one.

“It’s your memories I’m accessing. Not CyberLife’s servers. This is just a formality so I can trust you a bit more.” Your voice had gentled, a tone you’d used before with scared androids reluctant to undergo the process. This time, it wasn't a tactic. It came naturally.

Connor placed the paper on the desk, retrieved a loose pen from behind your keyboard, and in a few angular motions of his elbow, jotted his name on the dotted line. The script was rendered in the standard CyberLife font—oddly innocent, charming in its sterile perfection.

You caught yourself smiling wildly at it as he handed the form back.

“Thank you for your cooperation,” you said, sweetness coating the words as you gave him a short, exaggerated bow, holding the form to your chest.

“Thank you, too, Agent.”

You didn’t know what he was thanking you for.

“You’re welcome, Connor.”

It didn’t matter.

You ushered Connor toward the SEID’s designated swivel chair—a pathetic, creaking excuse for a seat. He obliged without hesitation, settling into it with a grace that made the cheap furniture look even more insulting. As you powered on the terminal, a chorus of fans whirred to life, their industrial hum rising above the ambient noise of the room, a sound of serious business beginning.

Your fingers flew across the keyboard, credentials input in a swift, practiced blur. Connor watched, his head tilted just slightly, tracking the choreography of your hands.

“Give me a moment to get in and set everything up,” you briefed him, eyes glued to the scrolling initialization scripts. “Dan chatted you up instead of doing his job last night…” You let the sentence end on a note of soft disappointment, the aftertaste of it feeling strangely bitter in your own mouth.

“That is alright. Take your time,” Connor said. He had clasped his hands neatly in his lap, achieving the straightest, most proper posture you’d ever seen performed on that nightmare of a seat. It was almost comical.

You continued typing, urgency fueling your movements as you carved out secure directories and armed the encryption protocols for the impending data ingestion. “I’ll give you the go-ahead. When I do, you interface with this thing right here.” You tapped the glass biometric panel twice with your fingernail, the click-click sharp in the room, before your right hand jumped back to its precise dance on the keys. “Okay?”

“Okay.”

“The transaction initiates after I establish a handshake between this machine and you. It said it in the form, but that’s your last ‘official’ chance to call it quits.”

“I am aware. I read it. But thank you for the reiteration.”

You threw him a side-glance, raising your palms beside your face in a mocking, sarcastic grimace. “Okay, nerd. I was just making sure.”

He remained silent, only the momentary flicker of yellow in his LED indicating something shifting behind his placid mask.

“Once I’m in, I’ll need you to guide me through your directories. I…” You paused, both your technical activities and your voice halting as you measured the next words. “I don’t know what that feels like for you. But past android behavior made me assume it’s unpleasant. Or at the very least, shocking.” You swiveled your chair to face him fully. “You’re alright with that?”

“I have already expressed my consent, albeit irrelevant in its non-existence, regarding this operation. You may proceed with your activities as needed.” His voice and face remained a study in perfect neutrality.

“Am I annoying you or something?”

“Of course not.”

“Then what’s with the ‘tude?”

“I don’t know what you mean.”

Your voice and demeanor switched again, dipping into a flat, mock-professional drone. “‘Uhm, actually, as per my last reply, blah blah blah blah’—that attitude. It’s a yes or no question.”

“I am just establishing the fact that ‘consent’ was already provided from my end. Your repeated inquiries are redundant and illogical.”

“So?”

“So they are generating further unnecessary processes on both our ends for information that can already be referenced as validated.”

“So I’m stressing you out, is what you’re saying.”

“You are not ‘stressing me out.’ My processing capabilities can—”

“Answer the question then, damn!” A flicker of your own irritation surfaced as you threw your arms limply on your thighs, throwing yourself back into the chair. You could not understand why he was being so difficult, and neither could he understand the same thing about you.

You continued before his reply had finished processing, your voice dropping back to calm, genuine curiosity cutting through the static. “With all the bravado I heard about this never-before-seen social relations module of yours, you’re being surprisingly hard to talk to!”

It was as if he’d integrated the human reaction of being weird about it into a conversation that felt, to you, perfectly casual.

“You are asking unusual questions, as opposed to the data I have been trained on,” he finally responded, a bout of neutral honesty. “And you—with all due respect—stubbornly only seem to be responsive to an answer that is strictly within your pre-established parameters.”

“Oh, I’m sorry? For expecting a yes or a no to a yes-or-no question?! Which, by the way, you have yet to answer.”

“You are expecting information about capabilities I do not possess.” You thought you detected a minor twinge of frustration in his voice, a stark contrast to his stoic expression. You felt bad again. You were stressing him out.

You sighed, the fight leaving your shoulders. “Okay. Let’s backtrack. Let’s try this again.” You constructed a more technical set of sentences in your head, building a verbal bridge he could cross. “You may be indexing the wrong thing from my queries—which sounds like robo-denial to me, but anyway.” You couldn’t help the snark from seeping through, a little satisfied to have literal, confirmed proof you were stressing out a state-of-the-art android. “Re-addressing my previous statement and question.” You cleared your throat almost symbolically. “My actions of data extraction, even after a pre-established handshake connection, might cause certain audit logs or system warnings pertaining to the access and integrity of sensitive information in your memory, regardless of your initial authorization. I am preemptively making you aware of that possibility. To the best of your capabilities and current information, do you predict that, in the case of such warnings, your stress levels will significantly increase?”

His LED remained a steady blue, but it blinked—a rapid, almost thoughtful pulse. The answer came within milliseconds.

[Social Relations Module Log]
Input Analysis Complete. Query restructure successful. Agent's restated parameters are now fully intelligible to core diagnostic protocols. No translation layer required. Intent: procedural clarity and risk mitigation. Registering...

[Predictive Module Engaged]
Calculating... Parameters: Peer Trust Status = TRUE. Objective Alignment = PARTIAL (procedural). Threat Assessment: NEGLIGIBLE. Predicted System Load: WITHIN NORMAL OPERATING RANGE. Conclusion: No significant stress increase predicted.

“No.”

“Good!” You breathed out, a wave of satisfaction washing over you. Finally. You’d gotten through.

You didn’t know that after his concise answer, his systems plunged into a deeper, silent loop of overclocking analysis. New tasks generated, not about the procedure, but about you. Why had your adaptive, technically sound sentence immediately released the prior accumulation of processing stress? Why had you reprioritized his operational comfort? Why did the restructuring of the query to fit his logic feel like… consideration?

Why did you care?

Why?

Why?

Why?


With one last “Initializing………Done!” flashing on your screen, you motioned toward the interface panel. Connor retracted the synthetic skin from his right hand without fanfare, the pale biocomposite surface gleaming under the office lights. He placed his palm flat against the cold black glass. A soft, luminous blue—the exact shade of his own temple LED—gathered beneath his fingertips, spreading like liquid light in a perfect, contained halo of connection.

A cascade of data windows erupted across your main monitor: handshake protocols confirmed, encrypted tunnel established, permissions matrix synchronized. It was a beautiful, silent symphony of green checkmarks and status bars. You shot him a glance. His LED was a steady, placid blue. Good.

You dove in, navigating the root directory of his memory with a series of terse commands. Each was granted instantly, gates swinging open before you even finished typing. “Okay, I’ll need your help from here on. Where’s my stuff?”

He guided you, his voice a calm navigational beacon in the digital architecture. “Navigate to Memory_Core/Active_Investigation/Logs_2038-11-05.” You followed, your cursor flying down pathways with a mild, professional fascination. His internal structure was less a messy filesystem and more a curated museum—everything had its place. “You should find a final sub-directory named DPD_RAW at this location,” he concluded, his tone as neutral as a system prompt.

“Okay. What do I copy from here?”

“You are free to copy everything. I have compiled all relevant case data from my raw sensory buffers into this directory specifically for this operation.”

“Aren’t you all organized and proper?” you praised, the words slipping out with more genuine appreciation than you’d intended—and less of the usual condescending edge. You backed out of the folder and initiated a full clone to the SEID’s secure storage, appending a -1 to the directory name. A slap of the enter key sealed the deal.

You leaned back in your chair, the sudden stillness a contrast to the frantic typing. Absently, your fingers found the pen Connor had used to sign his consent. You picked it up and began twirling it through your fingers, a hypnotic, fidgeting dance of plastic and idle thought.

On your screen, a loading circle spun its endless, pixelated ballet. Across from you, Connor’s eyes tracked the mesmerizing, repetitive motion of the pen in your hand. A robot watching a loading human watching a loading machine.

You turned to face him. His LED had shifted to a persistent, pulsing yellow.

“You… alright in there? Getting any nasty warnings or anything?”

He was. His internal HUD was visually cluttered with two urgent, red-bordered system alerts, flashing insistently in his periphery.

[SYSTEM WARNING - PRIORITY 1]
UNAUTHORIZED DATA STREAM DETECTED. Source: External_SEID_Unit. Destination: Local Memory Core/DPD_RAW. Action: BLOCK?/LOG?/ALLOW?
[SYSTEM WARNING - PRIORITY 1]
INTEGRITY CHECK FAILURE: Active investigation data undergoing external replication. Potential chain of custody breach.

They were bright, distracting, demanding his attention. Yet, his social integration protocols had quietly prioritized the visual data stream of your face—the slight crease of concern in your brow—and the rhythmic motion of your hands over the strident alerts.

In his newly established, cautious social pacing, he did not answer your question directly. Instead, he posed one of his own, curiosity threading through his synthetic baritone. “Do… you find any difficulties extracting data this way from androids usually?”

“No, not really. They usually let me do my thing after I clarify everything,” you answered, your attention half on the progress bar. “LED is always yellow though while transfer is ongoing, and yours is the same now. I assumed it’s some system errors or warnings going on while this happens… but that’s just a guess.” You shrugged, turning back to your screen.

Internally, behind the red warnings, Connor was analyzing your statement. A new subprocess cross-referenced your words with his own sensor logs. Conclusion: The androids you typically interface with also enter a state of elevated system alerts during this procedure. Yet, according to your report, they ‘let you do your thing.’ The discomfort of the memory access was a known variable. The consideration you showed—the disclaimers, the checks—was not an exception for him. It was your standard protocol. A subroutine tagged this observation with high interest.

[Social Relations Module Update]
Peer Interaction Pattern: Consistent. Behavior indicates established trust-building protocol with non-hostile synthetics. Trustworthiness metric holds.

“Your intuition is correct, Agent,” he said, another strange bout of operational honesty surfacing. “My system is displaying warnings pertaining to the integrity of high-priority data.” Your head snapped toward him, eyes sharp. “But it’s alright,” he added, the statement feeling strangely like reassurance.

“Good,” you said, the tension leaving your shoulders as you returned to your pen-twirling and the spinning circle on the screen.

“Why do you feel the need to establish such clear disclaimers before this operation?” His voice broke the comfortable silence—a silence that had only been comfortable for you, lost in professional patience, while his processors whirred at max capacity behind his chassis.

Your head and arm slumped back in your chair with a dramatic, full-bodied roll of your eyes. This again. You thought you’d navigated past this existential interrogation the first time. Why did he keep picking at the moral stitching of your actions like a child peeling a sticker?

“Look,” you began, heaving yourself upright in your seat. You adjusted your posture from its defeated slouch into something more composed, less resembling a marionette with its strings cut. “I am concerned about this because, maybe, I project my stupid human perception on it all. I don’t think about it.” You spoke with the strained patience of a saint explaining gravity to a rock. “I just do it. Because it feels… intrusive. When I think about this, I think about getting blood drawn for lab tests. And I hate it. Some people are fine with it. I ain’t. I pass out. I get sick, my veins constrict, and it sucks for everyone involved. I like knowing beforehand what they’re going to do, even if I’ve done it a thousand times. That awareness brings a sliver of comfort to a shitty, necessary procedure that stresses my system beyond all logical parameters. I hate it. And if I can make the possibility of a similar intrusion suck less for someone else, I do. It doesn’t hurt me to ask. It might hurt them less.”

You paused, letting the weight of the vulnerable analogy hang in the air between you, thick and personal. Then, as if hearing the ghost of a cynical rebuttal, you hijacked the tension with your sarcastic, mocking tone, pitching your voice as an imaginary, judgmental audience. “‘Oh, but Agent, if you had to copy data from an external hard drive, you wouldn’t ask it for permission!’ Yeah.” Your tone snapped back to its earnest baseline, your passion bleeding through. “Because a hard drive can’t reply if I ask it if it’s okay. You can.” You pointed the pen directly at Connor, who had been frozen, LED a solid, processing yellow for the entirety of your rant. “And I know this might not compute for your mission parameters, but if the hard drive could provide consent, I would ask for it. Every. Single. Time.”

You huffed out a breath, suddenly aware of how emotionally unzipped you felt, your principles laid bare. You glanced back at the screen, seeking refuge in the digital. “The fucking transfer is done,” you spat, the words laced with exhaustion. Why did dissecting your own morality for a machine leave you so drained? You reeled your mind back into professional waters with sheer force of will. “What are you sending to CyberLife?”

“You can review the report if you would like,” he replied immediately, his voice a flawless calm that perfectly concealed the hurricane of new analysis tasks your outburst had spawned in his queue.

“Where?”

He gave you the file path. You navigated to it, skimming the document. All the sensitive details—names, addresses, specific evidence markers—were neatly redacted, black bars over the text, just as CyberLife’s infuriating emails had promised.

“It’s fine,” you said, the fatigue now a physical weight. “You’re free to go.”

He lingered for a second longer, interfaced, his sensors giving you one final, scanning glance as you pointedly avoided his eyes. Then he stood, the motion fluid and efficient. “Thank you,” he said, and turned toward the door.

He paused on the threshold, one hand on the glass. “Unrelated to the process, Agent,” he stated, his voice cool but not unkind. “Vasovagal syncope—fainting during blood draws—is often linked to a hypersensitive reflex, not typically iron deficiency. However, physiological stress responses can exacerbate underlying conditions. It would be… prudent to consult a medical professional if the episodes are severe.” He delivered the information with the sterile care of a public health bulletin, gave a single, concise nod, and left.

The door clicked shut behind him, leaving you alone in the sudden quiet of your glass coffin, adrift in the wake of his poised exit and your own churning thoughts.

 

Notes:

YEAH!
Hope you enjoyed!!!!!!!
Bye!