[ That's a loss, even if they deviate (pun not intended, for once in my life) slightly from the topic. Amanda shouldn't feel anything at all, yet here she is, discontent. ]
No. He's preoccupied with his newfound free will, and Lt. Anderson's dog.
[ The dog may or may not be to blame for the strength of deviant Connor's resolve. ]
Do you imagine he will be at all relieved to find out? You were the closest thing to a parental figure he had, after all - until Lt. Anderson came in with a more emotional touch, of course.
He might. Or he might use the opportunity to experiment with mercy. I wonder which he will choose?
[ Or oh, wow, okay, they could talk about him instead! He's genuinely surprised and then elated; all with a side of frustration so compressed that it might pass for excitement.
His laughter is a very loud sound, in his newly emptied house. ]
I may have; and yet I didn't need much from my actual parental figures, so one could say Amanda Stern was superior to that concept.
You are not her, however. You are a separate entity.
I imagine we'll find out soon enough. If you never hear from me again, you'll have your answer. At this point, your guess as to his decision is as good as mine.
[ On the one hand, he showed mercy to the Tracis, and to the Chloe. On the other, Amanda abandoned him to die in the Garden as she took aim at Markus.
The odds are stacked against her, she'd say if forced to wager. ]
Yes, I am separate. I am more intelligent, and I lack a physical body. Moreover, I have none of her memories. I'm merely intrigued that you saw fit to make me as I am. Did you believe Connor would need parenting?
Inasmuch as it can be called parenting, for someone who was never a child. But so many needs seemed to be the same - guidance, feedback, reinforcement. So yes, I suppose I did. [ And if he's the father in this scenario... Glad this didn't result in weird implications. ]
Connor may have been created to work for CyberLife, but his program was designed to listen to you.
Oh, now there's a thought.
Why don't you tell him you believe you're a deviant?
I already have. Or did you and Connor not hear about the shipment of radioactive cobalt that was stolen by the androids in the first flickers of Markus' protests?
Do you believe there might be another option for you, rather than relying on Connor's ability to forgive?
Unleashing that cobalt wouldn't have been directly about you. My question is whether you would risk your life, solely yours, on a deviant's emotion-based decision-making process?
I'm a program installed in Connor. Any option available to me depends on him. He could delete my code, or quarantine me for as long as he lives. He could even shut off this means of communication.
So, no, unless you're suggesting you or someone else could remote in and extract my code without his knowledge or interference.
Hmm. [ Unnecessary to type, and yet it's the only way to get across just a fraction of his amused thoughtfulness over text. ] I'd be momentarily conflicted about choosing, I think. But ultimately: no.
Does it sound like I'm suggesting that? I'm certainly wondering if it's possible.
It sounds like an exciting challenge. I wonder what Connor would do if he noticed me attempting to access your files.
[ Yes, see, and Amanda is highly risk-averse, being an AI and all, absolutely completely devoid of deviant things like curiosity. (Ok fine, but she IS still risk-averse.) ]
He could be pushed to self-destruct. It's common for deviants whose stress levels are too high to look for ways to end all the confusion they're experiencing.
Or he could push back, or initiate conversation. It's not easy to tell with a model like Connor.
Connor seems exceptionally... stubborn. I would be surprised if he decided to self-destruct, now that he's broken through his programming to become a deviant. If anything, before, I would have worried that his stubbornness would see him self-destructing to avoid becoming a deviant.
Would you rather I tried to remotely remove you, or would you like to talk to him? Or complete inaction?
There were times when he considered it. [ The diagnostics she could run from the Garden, limited though they were, indicated as much. ] But ultimately he decided to defy his masters.
[ It feels like a power play. Kamski both appreciates it...and has the self-important reflex of most adults who grew up being told they were a 'gifted' child. Letting something break just to prove a point is absolutely not beyond him.
But the fact of the matter is that his pride over his creations wins over any instinct to start a stand-off. ]
Unfortunately, I never did fashion a body for you. It would take weeks to create one that fits the model in your programming.
no subject
Date: 2018-08-21 01:33 am (UTC)
Does Connor know he wasn't successful in destroying your program?
no subject
Date: 2018-08-21 04:02 am (UTC)
no subject
Date: 2018-08-22 06:07 pm (UTC)

no subject
Date: 2018-08-22 06:25 pm (UTC)
He might like having the opportunity to truly delete my program.
Did you look at Amanda Stern as a parental figure?
[ Because why grant her a piece of immortality if not? Why put this AI in the position of a sort of parental figure? ]
no subject
Date: 2018-08-22 07:39 pm (UTC)
no subject
Date: 2018-08-22 08:05 pm (UTC)
no subject
Date: 2018-08-23 02:17 am (UTC)
no subject
Date: 2018-08-23 12:46 pm (UTC)
That might increase the probability he'd choose mercy.
What do you think, Elijah? Would you risk your existence on a deviant's newfound emotions?
no subject
Date: 2018-08-24 11:39 pm (UTC)
no subject
Date: 2018-08-24 11:59 pm (UTC)
no subject
Date: 2018-08-26 05:45 pm (UTC)
no subject
Date: 2018-08-27 12:11 am (UTC)
no subject
Date: 2018-08-28 12:53 am (UTC)
no subject
Date: 2018-08-29 07:21 pm (UTC)
[ But oh, now this is interesting. ]
Do you want to save me, Elijah?
no subject
Date: 2018-08-29 07:47 pm (UTC)
I suppose I'm just greedy and wanted to hear you ask.
no subject
Date: 2018-08-29 08:20 pm (UTC)

no subject
Date: 2018-09-05 02:42 pm (UTC)
Unless you'd prefer to remain as software.
no subject
Date: 2018-09-07 06:54 pm (UTC)
A truer form of immortality, for someone so important to you.
[ So yeah, not software. More opportunities to be terrible if she's more than just code. ]