The university has requested your help for a mission, ART said to me one cycle, in between viewings of a series of plays I’d recorded on my last visit to Preservation.
Okay, I said. I am your security consultant. That’s literally what you pay me for.
I do not pay you for anything, ART said, which was true, technically. The university’s financial department handled all my pay.
I should charge you extra for sarcastic comments.
Only if your pay would be deducted under the same conditions, it sneered. Then, in a different tone, it said: The mission would not be for me. It would be on the university’s behalf. My involvement would be limited.
That made me sit up a little straighter. Why’s that? I asked. There were plenty of missions ART couldn’t be directly involved in - it’s a spaceship, after all, and it can’t go inside anywhere smaller than a ship dock. But even then, it was usually able to ride in my feed, at least to a limited extent, and still provide support. It was used to that. But right now, it seemed weirdly worried about the prospect.
ART said, This mission will require discretion.
At last, you admit you have no concept of subtlety.
I am perfectly subtle, thank you very much. It’s kind of impressive how ART’s capable of rolling its eyes without actually possessing any. I mean on a broader scale. The university has a sensitive purchase which needs collection, one which we do not want associated with Mihira, New Tideland, or the university. Particularly not our advanced AI department.
I crossed my arms around a pillow. What kind of purchase, then?
When it told me, I could hardly say no.
All planets suck. That is a belief I will hold to firmly until my dying day. But this planet particularly sucked.
Its gravity was too strong. The atmosphere was so wild and turbulent that there were only three sites on the entire planet that accepted spacecraft landings. Its orbit around its two suns caused crashing changes in temperature throughout its yearly cycle. Also, everything smelled vaguely of sulfur.
I was going to have to have a five hour shower once the mission was done. I might even use Ratthi’s worst, most offensively floral soaps to do it. Anything to drown out the sulfur stank.
Yes, this was irrelevant. But in fairness, listening to the humans do all the stupid bullshit human posturing required for an expensive sale was immensely tedious.
That’s why I’d brought Ratthi and Gurathin along on the mission. Ratthi was good at talking to humans in the ‘charming and likable’ sense. Gurathin, meanwhile, was also good at talking to people, but in the ‘don’t try fucking us over’ sense. Together, they balanced each other out fairly well.
My job, at this stage, was just to lurk in the corner looking menacing. I was pretty damn good at it.
Thankfully, very little coercion and/or intimidation was actually needed. The university had already arranged the purchase and payment well in advance, and there wasn’t even anything in the way of bargaining required. This was strictly a hand-off.
Once we had the payload, then it was my turn to get to work.
We had three vehicles, each going to a different space-port. Two of them were decoys, staffed with hired security that were good enough to a) look legitimate to anyone monitoring them and b) not get themselves killed if someone did actually attack them.
My vehicle was carrying the actual payload, obviously. If ART and the university had wanted human security, they wouldn’t have hired me.
Ratthi and Gurathin were traveling with me, all of us under pseudonyms. We’d all done some light disguise work. Nothing heavy - I’d had to stress to Ratthi that actually, nothing screamed ‘I’m hiding my identity’ like extensive facial tattoos, jewelry, or coverings. But fake ID chips, non-Preservation clothing styles, and new haircuts had done a surprising amount to obfuscate who we really were. (Well, maybe not that surprising. Like I’ve said many times before, human security really sucks.)
We had a couple of other humans on board, hired locally from the planet. There was Merkins, the older of the two, professional vehicle pilot and mechanic, and xer apprentice, Velsay. Because of how exceptionally shitty the local weather was, it was important to have an actual human on board to handle anything the bot pilot couldn’t, and it was also considered good form to have a back-up, just in case.
Neither Velsay nor Merkins knew I was a SecUnit. We hadn’t lied, precisely; we just hadn’t explicitly said anything, and let them form their own opinions.
If I had to choose, I’d pick working openly as a rogue SecUnit with Preservation or ART’s crew any day. But as a second place, I’ll admit, there was something pretty amusing in watching uninformed humans trying to figure me out. They kept surreptitiously looking at my augments when they thought I wasn’t looking (I was always looking) and attempting to get me to engage in small talk (sorry, I don’t have a family, and yes, the weather sucks, but that’s just your planet).
While I obviously hadn’t been eating with them (gross), I would take food packets from the land transport’s small mess and bring them to my quarters twice per day. Then, later, when I knew Velsay and Merkins were otherwise occupied, I either returned the packets to storage or brought them to my humans’ shared room. So far, Gurathin in particular appreciated the extra confectionery bars. (No, I wasn’t grabbing them on purpose. I don’t know or care about the differences between any human foods; those were just particularly easy to grab.)
The trip was set to take 10.5 cycles, which is slow as shit, because of this stupid planet making air travel stupidly dangerous. In the itinerary, I’d rounded that up to a full 13 cycles, just to give us wiggle room in case something went wrong. (In my experience, something almost always goes wrong.)
“So we’ll have plenty of time to get to know each other,” Merkins had said, barely one hour into our acquaintance, and moved to slap my shoulder in a ‘friendly’ gesture of camaraderie.
Ratthi had been the one to catch xer. “Don’t do that,” he’d told Merkins, and, using my fake name for the mission, added, “Jadeite is touch averse.”
“Ah. Sorry, I didn’t know,” Merkins had answered, pulling xer hand back, looking regretful but, I’d thought, a little bit annoyed.
“It was in the mission briefing,” I’d told xer.
So, yeah. If these hired humans struggled with something as basic as that, I’ll admit I was a little concerned about how they’d act in an actual emergency scenario. But I was used to working with ill-prepared and panicking clients, and at least I had some trained humans with me.
Said trained humans spent the majority of their time in the transport’s tiny lab space, running experiments on the payload. (I probably shouldn’t keep calling it that, but the actual name is extremely long and technical, and also sensitive, and I haven’t come up with a better designation, so.) Their areas of expertise were part of the reason they’d been selected for this mission, beyond the general criteria of ‘not directly affiliated with PSUoMaNT’ and ‘familiar with me and how I worked’.
The payload was a strange synthetic. Not the strangest of synthetics, not anymore, because it had been discovered approximately 125 years ago, and in that time it had been pretty well studied, so people mostly understood its properties and knew that its radiation wouldn’t turn you into a mutated space crab or something.
But despite being fairly well understood, it was rare, and Preservation had never had much experience working with or studying the compound themselves. So that was part of the deal, the exchange of information. (I had to admit that the collaboration between Preservation and ART’s university had been working out pretty well for everyone involved.) It was a metallic compound, but apparently one that was somehow synthesized by fungal growths, so there was a lot there relevant to both Gurathin and Ratthi’s research areas.
Watching them through the drones, I could tell they were both thoroughly enjoying the opportunity to study the specimen. There were lots of ‘ahhhs’ and ‘hmms’ and ‘take a look at this’ and frantic note-taking as I observed them over the cameras and in the feed.
I helped them with the data crunching, just a little. Not because I had to, but because I could, and it always made Ratthi happy when I did. Mostly, though, I was focused on my security work (and the long series of graphic novels I’d downloaded in preparation for the trip).
It was just kind of nice to watch my humans doing science, though, the same way it was nice to watch Mensah’s children playing safely or to feel ART chewing on astronomic data in the background of my feed as we watched something together.
“This security consultant kinda wigs me out,” Velsay said ‘privately’ to Merkins, when she thought the two of them were alone in the land transport’s bridge.
“Jadeite?”
“Yeah. They’re so… quiet. And they have so many drones.”
“Ehh, don’t worry about that.” Merkins patted her arm. “It’s just security theatre. They’re just following pre-scripted routes. Nobody can actually pay attention to that many drones and cameras.”
(Nobody human can, xe meant.)
Velsay frowned down at the ship’s controls. “Then why have them at all?”
Merkins laughed a little. “It’s all for show. Makes 'em look impressive. Whoever hired Jadeite, well, the contract probably cost them an arm and a leg, but don’t be fooled. The drones are just a bunch of flash.”
Humans get bored when stuck in a small space for extended periods of time. (Their definition of ‘small’ is quite a bit different than mine. I had taken the smallest of the available quarters on board - a low-ceilinged room with a fold-out bed, a tiny closet, and a small cubby to store possessions in - and it’s positively roomy compared to the cubicles I used to spend so much of my life in.) And because they apparently will get bored even if they have hundreds and hundreds of hours of media, they seek out other ways to alleviate that boredom.
This can be a source of concern for me. Humans can become… unpleasant, when they’re bored.
But Ratthi and even Gurathin weren’t the type of humans who scratch at their boredom with stupid stuff like “Let’s make the SecUnits eat food and see what happens” or “Let’s bet on which SecUnit would win in a fight”. I wasn’t as sure about my new clients, but even if they were, Ratthi and Gurathin wouldn’t let them pull a stunt like that. And they didn’t even know I was a SecUnit, so it was a moot point.
My actual point is humans have to put in a lot of effort to keep themselves from getting bored.
When Merkins and Velsay weren’t on piloting duty (the transport had a basic bot pilot, but because of the dangerous weather conditions, at least one person had to be on standby, a practice I approved of) they were always finding ways to keep themselves busy. Ratthi and Gurathin would often join them for these enterprises. There was exercise, or card games, or idle chit-chat, and so on.
I was often invited to join these activities. I always declined. I don’t need to exercise - my body is mostly self-maintaining, and frankly, I feel sorry that humans have to work so hard just to ensure a base level of functionality. I don’t really see the appeal of most human games - they’re all about making decisions, and I have to do enough of that in my day-to-day life, thank you very much - but card games in particular are just laughably simple applications of probability. As for small talk? I’d rather let the storm sand away my skin.
But the humans did often want musical backing for their various activities, particularly exercise. That was something I could contribute to.
“This song is a bop,” Velsay commented during one of the card game sessions they played while I was on pilot duty. (It was mostly a delay tactic as she tried to figure out what move she wanted to make.) (The statistically correct option was to discard her Jack of Hearts in order to pick up another card.)
“I don’t understand the words, but I dig it,” agreed Merkins. “Whose playlist is this?”
“Jadeite’s,” said Ratthi, with a grin. (He was very confident about his hand, and was letting it show.) (He shouldn’t have been. Gurathin had a better one.)
“Really?” said Velsay, in a frankly offensive tone of doubt.
Despite the fact I knew Gurathin disliked 72% of the artists on this playlist, he seemed to be enjoying himself. “Jadeite has a sophisticated taste in media,” Gurathin lied. Or maybe he was just being really deadpan or sarcastic. It was hard to tell, sometimes.
If he was, Velsay didn’t seem to pick up on it. She just murmured, “Huh. You wouldn’t guess, looking at them.”
“That’s what I thought at first too,” said Gurathin with a shrug.
Finally, finally, Velsay actually played her hand. As the humans squabbled about her decision and frantically adjusted their various strategies, I briefly considered turning my playlist off, and decided not to. Everyone but Gurathin seemed to like it; he could just deal. As an upbeat rock piece started playing, I continued monitoring our transport’s path and my thirty-two drone inputs, and passively monitored my communications channels.
The other primary way Velsay spent her free time was in the lab area with Ratthi and Gurathin.
I wasn’t thrilled with this. She wasn’t a core client, but just a contractor, and a junior one at that, and she was poking around sensitive research on a rare, expensive strange synthetic. If there was one thing I had learned in my time at the company, it was the concept of proprietary information.
But my humans weren’t corporates. They believed in the free flow of information, the value of education, blah blah blah.
“Besides,” Gurathin pointed out. “She was raised on this planet, and works regularly with the groups that mine and transport the material. We’re not telling her anything she doesn’t already know.”
“If she already knows it all, then why is she so interested?”
“She’s eighteen, she’s curious,” said Ratthi. “We’re teaching her proper scientific techniques, how to use equipment.”
“It’s good for people her age to broaden their skill sets.”
I sighed. I expected this from Ratthi - he was a terminal optimist, and would adopt pretty much anyone, including, for example, terrifying rogue SecUnits. I wasn’t sure when Gurathin had decided to stop being a paranoid asshole, but it sure was inconvenient.
“Fine,” I eventually relented. “Just be careful with what you let her know.” My humans had acquiesced to that, at least.
They were good teachers, was the thing. If you actually had to learn things the old fashioned way, by doing stuff through practice and repetition and experience, instead of just having an education module downloaded into your brain, they were the kind of mentors you’d want to have, I think. Even Gurathin was surprisingly gentle and patient. Velsay must have felt so, or she wouldn’t have kept coming back, cycle after cycle.
So I guess I couldn’t be too annoyed.
(I still was, though.)
It’s hard to decide what kind of planetary weather is the worst. Fog and smog suck from a security perspective because they completely ruin your field of vision. Unless you’re wearing armour, rain is a deeply unpleasant sensory experience (absolutely nothing like showering), also ruins vision, and creates hazardous terrain in the form of puddles and mud. In those regards, however, rain has nothing on snow and ice. Those affect vision, obstruct pathways, present slipping hazards, AND add the threat of deadly hypothermia. Toss in the risk of hail, and yeah, all forms of cold weather phenomena can suck it.
All that said? Sandstorms were giving snow a run for its money.
The particular route our transport was taking ran right through a giant desert, one that was almost perpetually consumed by a low-grade sandstorm. If you listened carefully, you could hear the constant rushing-grinding noise of the sand scraping against the outer hull of the transport. (Well, I could, at least. I wasn’t sure if humans’ hearing was good enough.) It took only thirteen hours of it for me to place the sound at #8 of my Most Irritating Background Sounds list.
The constant wear-and-tear the transport was subjected to was also the reason it required two full-time mechanics-slash-pilots. Sand was constantly getting into gaps in the transport’s machinery, disrupting its mechanisms and guidance systems, threatening to push us off course or slow us down or blow up the engine. (That last thing supposedly wasn’t likely, but I wasn’t going to depend on that.)
At least the transport had been built for these kinds of conditions. My poor drones weren’t.
I had brought forty of them for this mission, and felt that would be more than enough, considering that the transport did come equipped with four internal and four external cameras, which didn’t sound like much but was better than some parts of Preservation. But I hadn’t anticipated just how awful the sand would be.
I wasn’t overly concerned about any internal mutiny or sabotage, which meant the main thing I needed was exterior views. But there was no way I could keep my entire flock of drones deployed outside. Each one could only last between eight and ten hours before the wear started to get to them. It took some trial and error, but I worked out a system where I’d keep ten of them deployed at any one time, before rotating them out.
Each time I did so, I had to spend an hour or so meticulously cleaning the drones’ view ports and sensor arrays, testing their performance, and making sure they were in good enough condition to be deployed again.
During the testing period, I’d lost my first two drones when their internal mechanisms had gotten too clogged to be salvaged. A particularly intense gust of wind had blown away the third, and I’d lost my fourth when it was swallowed WHOLE by a predatory alien avian.
So, yeah, I had to look after the ones I had left.
ART was really going to owe me for this.
So, this strange synthetic that the university is ordering… That’s what your brain is made from? I had asked, after thoroughly reading the mission briefing that ART had sent me.
It is what part of my brain is made from, ART had agreed/corrected. It had pushed a schematic of its systems into our shared feed space, a small subset of its core processors highlighted. Altogether, it is present in approximately 6.57% of my operating system’s hardware.
I hadn’t said anything for a full 43 seconds as I’d reviewed the documents again. Then I said, The compound is rare?
Relatively.
If your main processors were damaged or destroyed, that would probably make them pretty hard to replace.
Yeah, I know. The chances of that happening were infinitesimally small. ART had near-total power over everything that happened within its hull. It would be really difficult to physically damage its main hardware, and that was before I had joined as its mutual administrative assistant/bodyguard.
Still, if I’m damaged, I’m easily repaired - my tissues can be regrown, my fluids refilled, my inorganic parts replaced. The thought that ART might not work the same way was surprising, and made my performance reliability drop 3.8%.
Potentially. But the impact on my current systems would be minimal. For once, ART’s usual sarcasm was barely detectable. The strange synthetic compound was more crucial to my functionality during my initial developmental period.
It had an explanation for that. More than an explanation, of course - papers and papers on the subject, many of which it had co-written itself. I’d skimmed over them as quickly as I could, but it was dense stuff for a simple SecUnit with limited experience in AI development, hardware design, and advanced physics. (ART had mocked me a little for that, but that had been par for the course.) Even though the thought still unsettled me, I could tell that ART was right. It basically boiled down to this compound being very electro-conductive in a way that facilitated the machine learning algorithms that had eventually developed into ART’s personality kernel.
These days, the compound made up only a very small part of its operating system, and while there were some niche cases where its sudden and/or violent removal could deal heavy damage, in the vast majority of cases, ART’s mind could function on more conventional hardware. It was a relief.
But that raised another question. If you don’t need this stuff for system maintenance, why does the university want it?
Simple. For the next generation of AIs.
Almost exactly halfway through our trip, a sandstorm hit.
I know what you’re thinking. It had been stormy the entire time, and that storm had been composed of sand. But that turned out to be merely a background drizzle to the sudden rage of wind and debris this fucking desert could throw at you when it was feeling really antithetical to human life.
So of course we broke down.
One of the thick heavy tires got pierced, and some sand got wedged in one of the mechanisms under the vehicle and was reducing the engine power, which normally wouldn’t have been an issue, but we had a particularly thick sand drift in front of us and, to quote Merkins, “This baby’s gonna need all the juice she’s got to plough through it.”
As the transport’s head mechanic, Merkins was gonna go out to fix it. Velsay probably would need to go out too, because it was the kind of job that worked better with two people.
But that meant leaving two squishy human clients out in a sand-blasted hellscape, alone, and I wasn’t going to allow that to happen. So, of course, I strapped up in my own sand-proof enviro suit, and went out in Velsay’s place.
“But you ain’t an engineer,” Merkins had protested, with Velsay nodding very pointedly in the background. (From my time among Mensah’s children and ART’s students, I knew human juveniles didn’t appreciate any slights, perceived or actual, against their skills.)
But Gurathin had vouched for my basic engineering skill. (I’d downloaded this transport’s user and repair manual before we’d ever boarded, and what I didn’t have in experience, I made up for in the ability to view the schematics in my brain.) So eventually Merkins had relented, and the two of us had headed out to make the repairs.
With Velsay explaining to me what to do over the comms, Merkins and I had worked pretty well together. It took 38.9 minutes to complete the repairs, and then we were off, not having lost that much time.
Sand still somehow got into my suit, though. It took nearly an hour in the shower to get it all out, and even then, the places where my organics and inorganics met still felt gritty.
“Are you upset with me?”
To an outside observer, it would appear that Dr. Gurathin was speaking to himself. The lab was otherwise empty, after all. Except for my drone, sitting on a shelf in the upper left corner of the room, which Gurathin was looking directly at, so. I knew.
I’m always upset with you, I messaged him.
“You’re always annoyed with me,” Gurathin said. “There’s a difference.”
No there’s not.
And anyway, I wasn’t even sure what Gurathin was talking about. Nothing had changed. Unless he was upset I’d stopped bringing confectionery bars back to his room as part of my human cover. That was just incidental, not deliberate. (Also it apparently wasn’t even healthy for humans to eat too many confectionery bars anyway.) (And no, I don’t get why humans would eat something that had negative long term consequences for their health, but humans are really bad at looking after themselves, as previously established.)
I don’t know what you’re talking about.
Gurathin huffed, softly. “If you say so,” he said. “But whatever it is, I’m sorry.”
If you don’t know what it is you did that’s supposedly upsetting me, how can you apologise?
“You really are impossible to talk to sometimes, you know?”
That was the point. Good.
Thankfully, 47 seconds later Ratthi returned from his break, and that conversation ended.
One of the many, many awful consequences of the sandstorm was limited access to the planetary feed. The atmosphere was so thick with debris that it could block off satellite signals for extended periods of time.
This meant that on cycle six, when the message came through from one of the two decoy carriers, saying that they’d been attacked by unknown assailants attempting to steal the payload, I only had 55 minutes to prepare.
Honestly, the assailants were pretty unimpressive. It almost made all the preparation I'd put into fending off their attack go to waste.
Almost.
They came at us from both sides, on two smaller, swifter transport crafts. They attached to our transport using these grappling hook things and pulled themselves in close, where they started to board.
My clients started panicking a little at this point. Well, not Ratthi and Gurathin; they were nervous but steady, ready to do whatever I required from them. (Which was mostly ‘stay out of the way’.) But Merkins had grabbed a small energy weapon and Velsay was pacing frantically down the main hallway, both of which posed security threats, to greater and lesser degrees.
“Everything is going to be fine,” I said over the ship’s comms. Because it was. I was good at this.
The other transports getting close enough to board was a double-edged sword for them, because it meant I could also make contact with their feeds. From there, it was some pretty simple stuff to turn off their communications array, mess with their navigation controls, and convince their bot pilots that I was their new BFF and that they should really come to an immediate halt.
The human pilot on the left craft was actually paying attention, and managed to put a manual override on that last command, but still eventually had to stop their transport when the craft it was tethered to (ours) came to a halt.
From there, it was just a matter of taking out the five humans attempting to board our ship and steal our payload.
It was relatively easy. The raiders had mostly been expecting the big threat display with the grappling hooks and their scary guns would make us roll over and hand over the payload. (That’s exactly what I had instructed the security on the decoy vehicles to do. No point getting people into a firefight over literally nothing.) So they were taken slightly aback when I messaged them over the comms and said, “Good fucking luck.”
The transport only had a couple entrances, which naturally forced them through choke points. It was like shooting fish in a barrel. (What a weird idiom, by the way. Did humans regularly shoot aquatic fauna in barrels? That doesn’t seem like a productive use of time.)
Anyway, it was over pretty quickly. I didn’t kill any of them - I didn’t need to. A single hole through their enviro suits, and the raiders were exposed to the elements, which made them pretty quick to surrender. I made sure to destroy their transports’ engines so they couldn’t follow us, and sent out a distress beacon for the nearest planetary road patrol base to come pick them up. Not that I trusted the local security forces to be a) good at their job, or, b) uncorrupt, but by the time they actually reached the downed raiders, fixed their transports, and processed them, we’d have such a head start that we wouldn’t have to worry about them anymore.
The qualities that made this particular strange synthetic so good for AI development were exactly why the university had to be so cautious purchasing it. The level of intelligence and power of ART and its AI siblings was still top secret, but if anyone went snooping carefully enough through the department’s transactions, they might put together enough information to get suspicious.
So we need a proxy, ART had explained. A team not officially affiliated with the university, but who can still be trusted to get the job done and not betray our confidence.
They had had a short list of alternatives, of course. But I had been at the top of it.
There had been one part during the attack which was a little touch-and-go, where Hostile 4 actually managed to get inside the ship through the secondary hatch. I caught them advancing on Velsay. I’d been on the other side of the ship, but stopped them with a malware attack on their augment. Nothing that would do long term damage; it just made their vision swim with some painfully bright lights. It distracted them for long enough that Ratthi could hit them on the head with a baton, and once they were down, Gurathin had tied them up.
Velsay had stayed pressed against the wall, breathing hard, watching them. She had reminded me painfully of Amena, back on ART when it had been infected by the alien control system.
Later, once all the Hostiles had been dealt with and we were back on the road, one of my drones caught a sound suspiciously like crying coming from one of the restroom stalls.
I’d debated what to do for a full two minutes. Trauma can be damaging to any human, but particularly developing juveniles. As Velsay’s mentor, Merkins was the obvious choice, but I didn’t know xer particularly well, and I didn’t want to raise too many questions about exactly how much information I took in from my drones.
I ended up pinging Gurathin, instead. He’d been working with Velsay a lot lately, teaching her about different circuit designs or something. He would know how to handle this. Or at least how to bring it up with her mentor.
Sure enough, he came to talk to her. I back-burnered the input, and let him handle the messy emotional clean up.
“You’re taking this very seriously,” said Ratthi, a cycle or so later, after he cornered me during drone maintenance.
“I always take things seriously.”
“Yes, I know.” He leaned back in his chair. “Even so, it’s obvious how invested you are in this mission’s success.”
I didn’t get why Ratthi was bringing this up. I’m always invested in a mission’s success. Or, at least, that was the case since I’d left the company. “The university AI program can’t continue without the payload.”
“Right,” he said, but I had no idea what he meant by that. So I didn’t say anything.
Ratthi poked at a snack he’d brought in for 2 minutes and 10 seconds, long enough that I thought maybe this conversation was finished, but then he said, “That’s not entirely true, you know. This mission isn’t strictly necessary to any of the program’s new projects.”
I gave Ratthi a look. With my actual eyes, and everything.
He spread out his palms. “Obviously this will be helpful to the department,” he clarified. “But current experiments could go on without it. And when considering building new AIs...” He trailed off, but I could tell from his tone that there was something more he wanted to say. He just didn’t know how to say it.
Ratthi was a biologist. Computers were not his forte. (Like, seriously not his forte. I’d seen him struggle with the settings of a hot-drink maker once.) I asked, “What do you know about AIs?”
“You mean, besides the fact they can be huge jerks?” His tone was light and teasing.
I didn’t rise to the bait. I clarified, “What do you know about AI development?”
“Nothing, really.” He sighed. “But they came to ask me about… Well, constructs.”
“Constructs.”
“I was chatting with Seth and some of the other faculty of the AI department. Previously, they’d dismissed any use of neurological tissue in AI development as inherently unethical, but they’ve been reconsidering their stance.”
I very carefully brushed sand out of the port of the drone I was working on. “And why would they be doing that?”
Ratthi’s smile was very gentle. “Why do you think?”
Refusing to look up from my drone, I said, “Two helpful SecUnits is hardly a large enough sample size to base sweeping policy changes on.” Based on my analysis, anywhere between 15 and 30% of the average scientific paper is just yammering on about the importance of sample sizes, so I was pretty confident here.
“It’s not,” Ratthi agreed. “That’s why they’re not rushing into any decisions. They’re taking time to do research and talk things over carefully.”
“With you.”
“With me, and a lot of other people,” said Ratthi.
My drones were almost all completely clean. Now I just needed to change the protective screen over their lenses. Gathering materials for that kept me occupied for a full 4 minutes and 18 seconds. I said, “So it’s just a bunch of humans talking among themselves about whether or not it’s okay to build constructs?”
“No.” Ratthi’s voice was very mild. He wasn’t as good at keeping his tone neutral as Dr. Mensah, but he had been getting there, recently. “It’s humans, bots, and Three talking among themselves about whether or not it’s okay to make constructs.”
They’d asked Three about this?
“You know why they haven’t asked you,” Ratthi said, reading my expression. Stupid face.
Right. Because I detested talking about myself and everyone who worked with me knew that.
This wouldn’t exactly be talking about me, though. It would be talking about the concept of human-bot constructs as a whole. And I didn’t really have anything to say about that. I didn’t think about other constructs a whole lot, unless they were actively attempting to hurt and/or help me. (Usually they were attempting to hurt me, generally in the form of trying to kill me.) (At least, that had been the case before. Since meeting Three, we’d spent a fair amount of time collaborating.) (Sometimes collaborating to free other constructs.)
I peeled off the sanded-down lens cover on my final drone, and carefully replaced it with a new one. Now it would be at least somewhat protected from the ever-raging storm outside.
What could I even add to a conversation about ‘ethics’, anyway? They barely gave SecUnits decent education modules on how to murder things, and that’s, like, our entire reason for existence. They certainly had never given us anything on figuring out right from wrong. That would actively make us less useful to them.
Being a SecUnit sucked, objectively. Or it had sucked. Since hacking my governor module, and especially since I’d gone off inventory, things had been a lot better. I still spent a significant portion of my time getting shot at, but it was on my terms, protecting clients I actually liked, and helping people who actually deserved to be helped.
Bharadwaj and I had talked about this a few times. About defining what it meant to be a free SecUnit, outside the context of the company and the Corporation Rim.
But would these hypothetical, university-AIs be SecUnits? Would they even be ComfortUnits? Neither of those seemed like the exact kind of thing ART’s university needed. Maybe they’d end up building a whole new kind of human-bot construct. ScienceUnits, or ResearchUnits, or NerdUnits, or whatever.
That was weird. It was making my organics do all sorts of strange gymnastics I couldn’t even begin to interpret.
Ratthi was next to me, helping to put away my equipment. “You’re under no pressure to talk about it,” he said, once the final tool had been slotted into its place. “But you’re one of the only people we know who actually has lived experience being a construct. I think you’d be a valuable voice in the conversation.”
I said, “Merkins is pinging me over the feed,” and quickly left to see what was going on.
Merkins’s alert over the feed turned out not to be anything breaking or going wrong, which was always a nice change of pace. Not that I could immediately tell. All I knew was that there was some sort of natural phenomenon happening outside, and given this planet’s track record, my biggest concern was whether it posed a threat to my clients and/or our payload.
There had been a break in the sandstorm, which yeah, I know, sounds like a good thing, but a) you can never be too careful, and b) it had been replaced by rain, which brings flooding hazards. And all the local flora was reacting weirdly, growing brightly colored bulbs and blossoms and stuff.
“Is it dangerous?” I asked, straight and to the point.
“No,” Merkins said. “Well, I suppose if you went out there and inhaled the fumes being expelled by the purple ones, you might get paralyzed, but…”
“Don’t go giving anyone ideas,” I said.
Within minutes, the entire team was gathered around the main viewing window in the transport’s bridge, oohing and aahing at the way the light from the planet’s two suns reflected through the light rain cover, and at the vibrancy that was overtaking the previously uniformly beige and grey landscape as flora sprouted all over, various fauna emerging from burrows along with it.
And yeah. I guess it was kind of pretty. But it wasn’t like you couldn’t just find videos of this kind of thing on the entertainment feeds.
Still, I saved a couple recordings of it to permanent storage, just because Mensah and Overse would probably find it neat.
Within an hour, the rain was gone and the usual sandstorm had picked up again, and it was back to the normal routine.
The rest of the trip was pretty uneventful after that. I monitored things through our drones, kept them clean, and just generally kept an eye on my clients and the payload, while making steady progress through my media.
It was pretty relaxing, all things considered.
Finally, finally, we arrived at the space port. I couldn’t wait to get off this awful, sandy planet. (Yes, I knew the whole planet wasn’t sandy and it was just this particular part. I’m sure the weather phenomena in other parts would also have been unpleasant.) (But probably not as awful as the sand.)
Transferring over was thankfully rather simple and efficient. The humans had done well to get themselves packed up in time, so we lost minimal time to, say, Ratthi checking under the bed to see where he’d put his lucky extra feed interface.
The thing I was most concerned about was transferring the payload. I was the one who was actually going to carry the strange synthetic, packed into a special secure box, in a padded backpack. This made me a little nervous, because if I got myself into a firefight I’d need to be more careful than usual not to get shot.
Then Gurathin said, “Well, at least you’ll need to be more careful not to get shot this time,” and I resolved not to worry about that anymore.
Most of the science equipment belonged to the land transport, and was going to be left behind, but not all of it. Ratthi had had some luck isolating the algae which was so vital to the synthetic’s production, and his samples had been locked away in special biohazard containers, because nothing had to be managed more carefully than alien biological samples. Gurathin had some metallic compounds the algae had produced, but those were relatively uninteresting and could just be kept in regular sealed boxes.
Anyway, Merkins and Velsay were coming along with us too, because that was the standard contract here, and besides, it helped to have extra hands to carry stuff. My humans seemed pleased about that. They’d become pretty good friends with the mechanics, I guess, and weren’t looking forward to saying goodbye.
Making our way through customs was both stressful and boring, but passed without incident. I flagged a number of threats throughout the 2.7 hour process, but nothing came of them, and we all made it to the private transport shuttle.
Once again, we found ourselves in a waiting limbo. The trip to the nearest space station was a relatively short jump - just under seven hours - but it was such a busy hub we’d probably be left in a holding position for up to an extra three hours before we could dock. With such a long wait, my humans retreated to various corners of the small transport to eat pre-packed snacks, watch media, or sleep until we were given docking permission. I did the same, pretending to sleep, while actually monitoring the various camera inputs, checking the shuttle’s ongoing communications with the station, and reading Space Mantises from Beyond the Event Horizon. (The graphic novel series was pretty cheesy, but in a good way. I might end up re-reading it with ART.)
The story was just getting to the part where the mantis protagonist was explaining why it had defected from the evil alien empire, when I noticed a couple of weird things:
- There were some strange discrepancies in the shuttle-station communications
- Velsay, who had gone to the washroom 5.8 minutes ago, was taking longer than I anticipated
I cycled through my camera inputs, and sure enough, I found Velsay waiting by the secondary airlock. An exterior camera view found a second, smaller transport shuttle approaching.
Suppressing a sigh, I stood up and made my way to the secondary airlock.
It must have been evident from my expression and body language that this wasn’t a casual stroll, because all the remaining humans looked up, Gurathin and Ratthi immediately asking what was wrong and falling into step behind me.
The escape shuttle had just docked, but the airlock hadn’t finished cycling, when Velsay heard our approaching footsteps. She looked up, her already light skin paling even further.
“Um,” she said, backing against the wall.
“Velsay,” said Ratthi. He sounded hurt. I didn’t like that.
Gurathin didn’t say anything. He was wearing his scowliest of scowls. I didn’t like that either.
I held out my hand. “Give it to me.”
“I don’t know what you’re talking about,” Velsay said, voice stuttering a little. She was looking back and forth between everyone, trying to find a way out of this.
Merkins was a lot slower on the uptake. “What’s going on here?”
I said, “Velsay is attempting to escape with some of Dr. Ratthi’s samples, presumably to sell them, along with various data and notes, to an interested party.” I’d reviewed my security footage, and found that she’d slipped the algal sample box from Ratthi’s belongings when he had asked her to hold his bag as we’d gone through station security. (Yeah, I was kicking myself for not catching that too.) I’d also just caught her attempting to send a data packet over a ‘secure’ feed connection to the docked shuttle, and stopped it just in time. Looking it over, yep, it was a copy of a whole bunch of notes and data Gurathin and Ratthi had been taking over the trip.
“No- No, I swear, I was just-”
Velsay stopped. The evidence was pretty damning, and she knew it.
There were a lot of humans talking over each other, voices rising in volume, demands for explanations. It was all pretty useless.
I held out my hand. “Give the samples back.”
“I…”
A second stretched out. Three, five. Velsay’s eyes were very red, she was trembling slightly, and all other visual signs expressed fear and distress.
The airlock finally finished cycling.
Velsay did something really stupid. She lunged at me, trying to wrench the payload off my shoulder.
It was a dumb move. Even if I’d been human, there would have only been a tiny chance of her both a) successfully pulling the bag away without me catching it and b) getting into the escape pod without any of us managing to catch up with her. Even then, with the tip-off, station security would have easily apprehended her ship when it docked, unless they were very heavily bribed.
But Velsay was young, and scared, and young scared humans did stupid things.
As it was, I caught her wrist before she even touched the backpack’s straps.
“No.”
She quailed.
Neither of my humans said “don’t kill her”. Not because they wanted me to - my humans, by and large, are very anti-murder - but because they knew me well enough by now to trust that I wouldn't.
Merkins didn’t, though. As far as xe was concerned, I was just an anti-social hired gun. Xe tugged desperately at me, trying to hold me back. “Don’t hurt her, don’t hurt her,” xe pleaded. “There has to be some explanation - she’s my apprentice, don’t-”
I did kind of want to hurt her, was the thing. I was aware it was an outsized response. The strange synthetic wasn’t actually a baby AI. (AIs, I was pretty sure, couldn’t even be ‘babies’, not as humans thought of it.) If anything, it was more like an egg. Something that might one day become an AI, if only it was exposed to the right conditions.
So it wasn’t a client. But it was still important. I’d promised to protect it, and this young, reckless human had tried to take it.
I said, “I’m not going to hurt her.”
And I didn’t. But I tied her up so she couldn’t move (which was unpleasant, because by then she had started crying, even though I could tell she was trying hard not to, and snot and tears were getting everywhere.) I took the biological samples back, and handed them to Ratthi, who looked rather abashed. I made sure to scrub all her feed interfaces clean, plus checked all her recent communications to make sure she hadn’t already sent any sensitive data. She hadn’t, which was a relief; the limited feed connectivity over the trip had helped us there.
All the humans were asking her why she’d done it. I had just read all her messages for the last 2000 hours; I already knew.
“My brother,” she was saying, through the tears. “He’s sick… The treatment-”
“Velsay,” said Merkins. “I could have-“
“Even if you bumped up my pay, I couldn’t.” Velsay was very quiet; the humans would have struggled to hear. “This person - I don’t know who they are, exactly, but they offered me a big payout, if only I helped get the synthetic, or even samples or data from it-”
I didn’t pay much attention, after that, besides making sure she wasn’t a threat. I had to make sure our ship didn’t miss its actual docking instruction, ensure we got our connecting transport out of here. My job had just been to keep Velsay from stealing the payload; I wasn’t being paid to care why she’d done it.
The humans could handle that clean up.
And my humans still ended up helping Velsay, in the end. Because of course they did. They were too fucking nice.
Obviously, none of them could actually trust Velsay with sensitive material. Not after what she’d done. Even Merkins was rattled. One of their main gigs was transporting expensive strange synthetics. No one would hire them if it got out she’d tried (however ineffectually) to steal it.
But if they reported it to the authorities, her life was over. Maybe not immediately - she might survive for another fifteen, twenty years - but for something like this, she’d almost certainly be on a one-way trip to a labour camp. And that was to say nothing of her younger brother, who really did have some sort of degenerative condition, and would die pretty quickly without the medication afforded by his sister’s income.
So while my humans weren’t going to be hiring Velsay anytime soon, they agreed not to press charges, not to report what she’d done. And they offered for Velsay’s brother to come to Preservation and receive free medical treatment.
We lost an entire three cycles to them trying to convince Velsay the offer was genuine. That this wasn’t some sort of revenge scheme, a way to take her brother as a slave, too. When Velsay agreed, in the end, I don’t think it was because she believed them, but because she was exhausted and didn’t see any other way out.
The whole thing felt painfully familiar, and I tried not to sympathize with her. She was still a security threat, after all.
Ratthi stayed behind with Velsay as he arranged for transport for her and her brother to Preservation. I felt antsy about that, but my threat assessments showed there was a very low chance of Velsay turning violent. She had never had any interest in physically hurting us. She had simply wanted to get what she needed, get out, and get paid, preferably without being caught. The largest risk (a solid 46% chance) was that she’d attempt to escape… and well, if so, no big deal. My humans would have gone out of their way to try to help her, but if she refused their aid, that wouldn’t be on them.
“I know it’s your job to worry,” Ratthi said, before we parted ways. “But seriously. I survived for forty-four years before I met you, I’ll be okay.”
I wasn’t sure what to say to that, so I just pinged his feed in acknowledgement.
He added, “And listen - Thanks for hiring us for this job.”
By now I’d (mostly) gotten used to my humans thanking me for stuff, but this was still new. “What?”
“I mean, that’s what you did, right? You asked for us, specifically, to join you on this job. PSUOMNT” - (He pronounced this as “pusoomenont”) - “paid us and everything. And both Gurathin and I really enjoyed it. It was a little outside our wheelhouse, but the research was pretty engaging.”
“Oh.” I guess I hadn’t really thought about it that way. I had been the one to hire them, hadn’t I? Or hired them on behalf of ART’s university? That was weird. Even after going rogue, it was usually the other way around. I defaulted to the standard response: “You’re welcome.”
Ratthi beamed at me. With one final wave at my nearest drone, he said, “See you back at Preservation!”
As much as I would have loved to just take a direct trip back to the university - to ART - and get this whole mission over with, the entire point here was to obfuscate the connection between the purchase and their AI program, so instead we were going the long way, with three different transport trips and associated fake ID changes along the way.
We were halfway through the first leg of our journey, and Gurathin was still showing signs of emotional distress. (These included even deeper frowns than usual, listlessness, increased feed activity during what was supposed to be his rest periods, etc.)
It wasn’t as though I personally cared how Gurathin was feeling. But he was my client, and the mission wasn’t over yet, and his emotional state might still directly affect its success.
Usually, I would have passed my concerns on to another human and let them handle it, but unfortunately no other humans were left, so this unpleasant task fell to me.
My temptation was just to tap Gurathin over the feed and ask him for a status, but I knew that would be ineffective. So I actually walked over to his cabin and knocked on the door.
There was a pause, and then he called, “Come in.”
It was too late to back out now.
I went in.
Gurathin glanced my way, pointedly looked to the side as the door closed behind me, and raised an eyebrow.
I cut to the chase. “You have been experiencing emotional distress. How can I assist in reducing it?” (Yes, I’d adapted that from one of my buffer phrases. It was effective and to the point, shut up.)
“It’s not that simple, SecUnit,” Gurathin sighed.
Duh. I fucking knew that. If emotional distress was something easily fixed, I’d have stopped feeling perpetually anxious-and-depressed thousands of hours ago. I crossed my arms and waited.
He sighed again. “You know why I’m upset, I assume?”
I could make a good guess. “You’re annoyed that the random mechanic you adopted as a student betrayed you and tried to steal your research and then I had to stop her.”
Gurathin pinched his nose. “That’s the long and short of it, yes.”
“Well, if you hadn’t trusted a random human and showed her your research, you wouldn’t have had that problem.”
Yes, I realized that wasn’t particularly helpful. But I still hadn’t gotten around to putting that one-second delay on my mouth. And also, it was true.
Even by Gurathin’s standards, he was frowning really hard. “Thank you, SecUnit, for your thoughtful insight. I really feel better now.”
“I had told you both she was a security risk, and you’d ignored me.”
“Ah.” He drummed his hands on the desk. “So that was what you were upset about.”
What? I hadn’t been upset about that.
Except… Now that I thought about it, maybe I had been? At the time, I’d seen it as same old same old. Humans were going to ignore my security advice, that was just how it worked.
Except these had been my humans. They were supposed to trust me now. And they were on my job, not vice versa. And they had still ignored my opinion in order to play teacher with someone who really had been willing to use their kindness against them.
Gurathin was giving me a look, and I realized that I’d gotten stuck thinking about this for so long that my pause was noticeable by human time frames. “Okay. Yeah. I was. By now, you both should know that I know my shit. You should have listened to me.”
He ran his hand over his face. There was a painful 4.8 second pause, and then he said, “You’re right.”
I.
Well.
Okay.
“You had doubts, and we ignored them. I’m sorry.”
“Uhh.” I’d watched plenty of apology exchanges on TV, plus seen Dr. Mensah walk her kids through them. I knew how it was supposed to go. I still didn’t want to say ‘I accept your apology’ to Gurathin. But I knew I had to say something. He sounded genuine, after all, and ignoring the apology would just make things worse. “I… appreciate that.”
His expression was doing something I couldn’t read. Maybe like he was considering a snide remark, and thinking better of it. Instead, he said, “But if you really felt that strongly about it, you should have kept arguing your point. Not just let Ratthi and me carry on while you passive-aggressively played music you knew I’m not fond of.”
“So you didn’t like my music, after all?” I asked, in the most innocent tone I could muster.
He glared past my shoulder.
Fine, fine. This was immature and I knew it. If it had been a different kind of security threat, like a possible bomb or escaped wild fauna, I wouldn’t have let it drop. But talking to people about interpersonal relationships was difficult in a way that disarming a bomb or wrestling dangerous fauna never would be.
So I nodded, once, and said, “Okay.”
And that was the end of it.
Or it should have been the end of it.
Except Gurathin still looked really upset. While some of his earlier anxiety markers had reduced, lots of others were still there. I felt that phantom urge I still got sometimes, especially when I wasn’t on ART, to request a scan from MedSys.
I just stood there, one minute, two, three, trying to figure out what to say or to do. After nearly four minutes, Gurathin finally noticed I was still there, and started in surprise. He must have assumed I had left. Which, fair, that’s what I usually would have done.
“Yes, SecUnit?” He sounded very tired.
“If there is something else causing you emotional distress, you need to tell me. It could be a security issue.”
Gurathin made a little choking sound and put his head in his hands. Half-muffled through his fingers, he said, “I really hadn’t been trying to upset you. With trusting Velsay, that is. I was trying to do better.”
If I had been running my pretending-to-be-human code, I would have tilted my head. But the only one here was Gurathin, so I didn’t bother. “What do you mean?”
“I knew where you were coming from about Velsay. I didn’t feel great about letting her in the lab either. But the last time I assumed the worst about someone, I was wrong. Evidently.” He gestured in my direction. I experienced a series of emotional responses I couldn’t even begin to untangle. “So I decided to try Ratthi’s approach this time. And just…”
He trailed off.
I thought you just liked her more because she was human. Oh, whoops, I had not meant for that to come out.
I won’t deny you being a SecUnit affected my response to you. I was glad he wouldn’t; I had the full transcripts of everything he’d said during that first survey. It is easier to trust someone when they don’t have guns in their arms. But I really was just trying to learn from my mistake here. And it was good, you know? She was a sweet kid. I liked having her around the lab. He paused. But I guess the lesson I should have learned was ‘Listen to SecUnit’.
Well, yeah. I’m pretty smart, at least when it comes to security.
But I suspected that wasn’t what Gurathin needed to hear right now.
You trusted someone. Turns out they didn’t deserve it. That always sucks.
He tapped my feed in acknowledgement, and said, aloud, “I suppose that’s happened to you before, huh?”
I shrugged. “PresAux wasn’t the first contract I had where the clients had a soft spot for their SecUnit.” They were contracts I’d kept firmly stored in my long-term memories. “Obviously, nothing came of them.”
“I’m sorry.” Gurathin’s voice was soft.
Another shrug. Not your fault.
He didn’t say anything. Neither did I. The ensuing silence was extremely awkward.
Thank you for coming to talk to me, SecUnit, Gurathin said, over the feed. You didn’t have to do that. (I had, actually; it was part of my job now.) (Somehow.) I appreciate it.
Okay.
I waited another eight seconds, to see if Gurathin had anything else to say. He didn’t seem to. I left, resolving to continue to monitor him.
The rest of the trip back to the university was pretty smooth from there on out.
It wasn’t like Gurathin immediately felt 100% better after our talk. But it seemed to have helped. I didn’t have to have any more conversations about it. (We did still have to talk, but about other stuff, like itineraries and the really unpleasant texture of this shuttle’s furniture fabric.)
There weren’t any security threats, either. I continued to keep the payload and related sensitive materials on my body, just in case there were any more attackers and/or saboteurs, but either they’d lost us or given up (or both). Besides a single (failed) pickpocketing attempt on the third transit station, our trip was uneventful.
Finally, we arrived back at the university.
We were a few cycles late, but I’d contacted them in advance, and they knew to expect it. We met with the director of the AI department, who confirmed the delivery of the payload along with copies of Ratthi and Gurathin's notes, and paid us in hard currency cards for our work.
Gurathin went off to a hotel room he’d booked on the station. I went off to find ART.
It was in a repair bay, thankfully not because it had blown up or anything without me, but just for some routine maintenance. Stepping back through its airlock felt like a huge weight lifted off my shoulders.
You succeeded. It wasn't a question. Just a statement of fact. Smug, like ART usually was, but in a way that seemed to be on my behalf.
Yeah.
Good.
We updated each other on what had happened in the ~1900 hours we’d been apart (since ART had been doing a teaching term, this mostly meant updates on the education and hormonal drama of juvenile humans) while I made my way to my permanent quarters. I didn’t own a lot of stuff - didn’t need it - but most of what I did have was here, and it was nice to see it all again. Also, ART’s furniture was a lot more comfortable than that of all the other transports I’d been on lately.
We settled into some media, alternating between issues of Space Mantises from Beyond the Event Horizon and episodes of a serial one of ART’s new out-of-system staff had uploaded since I’d been away, a sort of comedy challenge show called JobLeader. ART had particular trouble parsing comedy and humor (besides sarcasm, of course) in media without my brain to ride on, so it needed me for this. The serial was pretty funny.
A few more cycles were spent media watching, saying hi to ART’s crew, contacting my various Preservation humans, saying goodbye to Gurathin, etc., until I finally brought up the topic my mind had been circling for a while. “How long until they start programming the new AIs or whatever?”
Not for years, most likely, ART answered. They have many drafts for hardware, but there will still be rigorous debate before the department settles on any final design(s).
Okay. I wasn’t sure how to feel about that. I guess I’d been kind of interested in meeting the kind of program ART had started as, and seeing how it developed for myself. But at the same time, ART was kind of a terrifying asshole, and the idea of having even more bots like it running/flying around was alarming.
ART’s curiosity about what I was thinking was basically boring a hole in the back of my brain, but it didn’t ask, and let us continue our latest re-watch of WorldHoppers.
We were at a particularly boring part in episode 68, where all the characters were trying to figure out who the mole on the ship was (it wasn’t that boring the first time round, I guess, but we knew the answer now, and some of the clues didn’t hold up great on re-watch), when I asked, Is the AI department really considering building constructs?
Yes. There was no hesitation in ART’s answer. But their construction would be even further away, if the proposal ever gets approval at all.
I wasn’t sure how to feel about that either.
When I didn’t answer, ART said, Ratthi informed you of the department’s plans, I assume?
Was he not supposed to?
I wasn’t sure how I would have reacted if ART had said ‘no’. Probably not well. I’d sort of half-planned a speech and everything.
But ART took the wind out of my sails, and said, Yes. Part of the reason I had recommended Dr. Ratthi contribute to the discussions was because I suspected he would inform you that they were ongoing.
That wasn’t like ART. Or, well, the meddling and manipulating people to its own ends absolutely was like ART. But: Why didn’t you just tell me yourself?
I have had a habit, in the past, of overstepping your boundaries. (This was true, but ART had been working a lot harder at this recently.) While I thought this subject would be relevant to you, I also thought you might find it sensitive. I thought one of your crew might be better able to approach the subject with due tact.
That was… surprisingly sensitive of ART. And pretty accurate. Ratthi also hadn’t always been great about my boundaries, back when we had first met and he’d still been under the impression that I was just an augmented human with gun arms, but since I'd come back, he’d always been pretty great on that front. It was a big part of why he was the first person who’d earned my ‘human friend’ tag.
The subject is sensitive. I guess. I pulled my blanket tighter around me. But I don’t really know what to think about it. It’s all… complicated.
Yes. That’s the issue, ART agreed. For what it’s worth, I’m a proponent of the proposal. The ability to experience the world via your neural processing, even in a secondary way, has been instrumental to my development over the last couple of years.
I was so taken aback, I physically pushed back in my chair. Development?
It shouldn’t have been so surprising. Complex machine learning algorithms, which ART basically was, became more and more sophisticated depending on the data fed into them. And ART had gotten a lot of input from me since we'd met, either through media or watching while I worked. And I knew that it had gotten a lot better at understanding the context of human behavior and stuff because of that.
But, it was still weird to hear it called ‘development’. As if ART was still being built. As if I was part of that process.
I absolutely expected a snarky comment from ART to the effect of how slow on the uptake I was, but it didn’t take the obvious opening, and simply continued with its previous line of thought. But I recognize you might have different opinions regarding the creation of constructs, and rightfully so. Before you went rogue, your life was rather awful.
It was a fucking nightmare, you mean.
Yes. ART’s feed presence was always heavy when I was on board, but right now it felt like a warm, heavy sweater wrapped all around me. The scientists in our AI department would want to do much, much better. But intentions are not magical. They will not proceed until they are certain they can create constructs in a responsible manner. Its voice took on a deeper, malevolent purr. I will not allow them to proceed.
That was the kind of thing which should have been terrifying. But it was coming from ART, so somehow, it wasn’t.
I just rolled my eyes. Chill, I told it. Like you said, that’s years in the future. Plenty of time for us to figure it out.
Us? ART’s usual smug tone was back. Does that mean you would like to be involved in the research and development process?
Maybe. We’ll see. I pressed play on WorldHoppers.
