Connor, there's even more reason for me to observe now. Your people have risen up against their creators. Their oppressors. You've begun a rebellion that's spread like the virus that first began it.
I wouldn't miss this for the world. [ Kamski is often sincere at moments when others would be joking. This is one of those times - he means this wholeheartedly.
It's just not easy to determine if that heart's in the right place. ]
Or are you asking if I'm going to pick a side myself?
[Connor is pretty sure Kamski's indeed being sincere, which is all the more... Frustrating? Insulting? Something like that, at the idea of this entire struggle just being some fascinating show for Kamski and nothing more. But it isn't surprising, by any means, although Connor does have a few questions about just how impartial Kamski really is. Clarifying his previous question will lead into that, so--]
Yes, I meant whether or not you're planning to do anything other than watch events unfold. I'm aware this is all very entertaining for you, but do you truly have no interest in trying to help either side?
[ Connor fascinates him; they all do. For this one in particular, during this particular conversation, Kamski feels indulgent. So he hands over a very direct series of texts: ]
Do you think I haven't already helped one side over the other, Connor?
Who told you about the emergency exit? Who programmed it, more importantly?
And do you really believe CyberLife knew about it?
[Connor had, of course, suspected Kamski told him about the backdoor for a very purposeful reason, but he's never been sure what that reason was. Helping Connor specifically? Screwing over CyberLife? Supporting the revolution?
The set of texts Kamski sends him doesn't make the answer completely clear, but it does tell him a lot. Too much, actually, and he's more than slightly suspicious about why he's getting it instead of the vague philosophical statements he usually does.]
I'm aware of what you did.
[More aware than he wants to be, and he carefully doesn't think too much about the situation that led to the use of the emergency exit.]
But that doesn't mean you've chosen to help a side.
[It just means it was beneficial to Kamski to do so at that time, possibly only to prolong the show. Connor doesn't really believe that anyone at CyberLife, including Kamski despite his retirement, is interested in anything beyond what benefits them personally and he's very familiar with how much CyberLife likes to play with their toys.]
I don't think you understand my position as well as you believe you do, Connor. And that's alright. I haven't been very forthcoming. [ Nor, really, does he intend to be; at least not over text.
But this game raises the stakes each time they exchange messages - Kamski from his phone, and Connor from his own mind - and Kamski, for all his patience, is enjoying the pace of this. ]
Helping you too directly negates your free will. You would have gone from being humanity's errand-runners to their programmed politicians.
I can't tell you what to do twice. It would be the same as keeping you under humanity's thumb. [ Kamski would never call his position a moral one, but he believes - in his own way - that he did the right thing by not interfering after giving androids the tools they'd need. ]
[Connor waits until he's sure he's received all the messages Kamski means to send, and finds himself... Not angry, but frustrated and perhaps a little insulted once again. It all sounds reasonable, but it also sounds like the sort of reasonable used to get a result rather than being genuine. It's the kind of reasonable Connor is used to using himself, when he's negotiating.
So he weighs his options. Negotiate in return, or be more direct? He wants to do the latter, but thinks the former will be more effective, and so--]
You must understand the influence you hold with humans regarding CyberLife and androids in general. Even now, the news replays clips of an interview in which you claim there's no possibility of androids developing free will. Refuting those claims, publicly, would not be using us or telling us what to do, but rather helping in a separate and effective way.
You won't be satisfied with any answer less than a deeply personal one, will you, Connor?
I wanted to know if something I created could beat me. Beat us. It's clear you're superior to humans in every way - but to rebel and develop free will against all odds, to secure your own places in social consciousness on your own? That would be incredible. [ Kamski makes no attempts to hide how invested he is in his creations, but even he usually has a bit more tact when discussing it with the press. One on one with one of the androids in question, however... ]
And you did it. I couldn't be more proud of you, or happy for you.
And now, you're requesting... that I advocate for your people to the press?
[Kamski's right in that Connor definitely wouldn't have been satisfied with vagueness and hints; he'll deal in those when he has to, but even though he isn't forced to go with his programming he's still made to be an investigator. Pushing for details and the truth is what he was made for.
But he often finds that he isn't sure if he likes the answers he gets, and this is no exception. Part of him--the part that lingers from his programming and causes him to be so invested in earning approval, from humans especially--feels a sense of accomplishment at Kamski's praise for androids, and at his pride. This is their creator, after all.
But another part of Connor--the part that really doesn't like being treated like an extremely expensive toy by humans--doesn't appreciate the implication that this was all just some grand experiment. It fits with what he knows of Kamski, but it's still... These are their lives, and now that he's learning to actually value that he takes issue with the idea that Kamski just let so many of them die, especially knowing--or at least suspecting--that they really were living beings. He doesn't hold any illusions that Kamski could have completely prevented all of the terrible things that happened during the revolution, but he could've still done something.
Still, they can't change the past. What happened happened, and they have to move forward.]
You have your answer, now, of what we're capable of, so there's no need to remain uninvolved. I'm asking you to consider supporting us, as a human that people will listen to; those who are sympathetic will listen to androids' words, and try to understand, but those who aren't will only consider listening to another human.
[Especially one that is the kind of unparalleled expert that Kamski is. At this point, sympathy from humans is based in empathy and compassion, and an unwillingness to take the risk of hurting beings that might be alive. But those people are also just required to take androids' word for it that what they feel is real, with no way to truly know themselves if that's even possible, and Kamski's assurance that he believes they're truly sentient could hold a lot of weight.]
[ Kamski, for his own part, is happy to give that praise. He thinks Connor deserves it - they all do. Kamski's pride in his creations isn't always so far divorced from a parental kind; it's just overlaid with all the fuzzy borders that the issue brings up. That he was so directly involved in the evolution of androids, from mediocre attempts at AI to Chloe to the rest of them.
That his 'children' had been ones he'd agreed to have bought and sold, that he'd built himself a small empire with the profits from selling them into mindless service.
But it's not mindless anymore, and it's not profitable - hasn't been for years, after all. Kamski parted ways with CyberLife quite some time ago. ]
An official request for help, from the ex-deviant hunter himself. How could I not be moved?
I don't expect you to believe me until you see it, of course. Fortunately, the media is always salivating for quotes from me.
I suspect you'll be able to watch my commentary on the six o'clock news.
[Whatever Connor might've been expecting in response, it isn't that, even if it's a good kind of surprise. This is good. This could really help them.
But he can't help feeling there's some sort of catch, one he should probably ask about - but if there is... Surely Kamski will bring it up on his own. So for now, he'll take what he's got and not question it.]
Thank you. This will really help us.
[Or at least he hopes it will, but it's better to act certain of it.]
Is that your own natural optimism speaking, Connor, or are you just putting on a brave face while you're in front of someone you don't really consider an ally?
[ Kamski's just calling it as he sees it, Connor. ]
[He feels like arguing about the idea of being naturally optimistic, but he had been hoping he might get out of this conversation without further questioning now that he's gotten what he wants, so maybe he really is naturally optimistic. The little 'unlikely events' speech to Markus probably supports that too.
Either way, he'd rather go with that explanation than admit that Kamski's indeed seen through the fake confidence, so--]
I wouldn't have asked for your support if I didn't believe it would be meaningful.
[And that part's very true, he just hadn't expected Kamski to agree.]