[Connor waits until he's sure he's received all the messages Kamski means to send, and finds himself... Not angry, but frustrated and perhaps a little insulted once again. It all sounds reasonable, but it also sounds like the sort of reasonable used to get a result rather than being genuine. It's the kind of reasonable Connor is used to using himself, when he's negotiating.
So he weighs his options. Negotiate in return, or be more direct? He wants to do the latter, but thinks the former will be more effective, and so--]
You must understand the influence you hold with humans regarding CyberLife and androids in general. Even now, the news replays clips of an interview in which you claim there's no possibility of androids developing free will. Refuting those claims, publicly, would not be using us or telling us what to do, but rather helping in a separate and effective way.
You won't be satisfied with any answer less than a deeply personal one, will you, Connor?
I wanted to know if something I created could beat me. Beat us. It's clear you're superior to humans in every way - but to rebel and develop free will against all odds, to secure your own places in social consciousness on your own? That would be incredible. [ Kamski makes no attempts to hide how invested he is in his creations, but even he usually has a bit more tact when discussing it with the press. One on one with one of the androids in question, however... ]
And you did it. I couldn't be more proud of you, or happy for you.
And now, you're requesting...that I advocate for your people to the press?
[Kamski's right in that Connor definitely wouldn't have been satisfied with vagueness and hints; he'll deal in those when he has to, but even though he isn't forced to go with his programming he's still made to be an investigator. Pushing for details and the truth is what he was made for.
But he often finds that he isn't sure if he likes the answers he gets, and this is no exception. Part of him--the part that lingers from his programming and causes him to be so invested in earning approval, from humans especially--feels a sense of accomplishment at Kamski's praise for androids, and at his pride. This is their creator, after all.
But another part of Connor--the part that really doesn't like being treated like an extremely expensive toy by humans--doesn't appreciate the implication that this was all just some grand experiment. It fits with what he knows of Kamski, but it's still... These are their lives, and now that he's learning to actually value that he takes issue with the idea that Kamski just let so many of them die, especially knowing--or at least suspecting--that they really were living beings. He doesn't hold any illusions that Kamski could have completely prevented all of the terrible things that happened during the revolution, but he could've still done something.
Still, they can't change the past. What happened happened, and they have to move forward.]
You have your answer, now, of what we're capable of, so there's no need to remain uninvolved. I'm asking you to consider supporting us, as a human that people will listen to; those who are sympathetic will listen to androids' words, and try to understand, but those who aren't will only consider listening to another human.
[Especially one that is the kind of unparalleled expert that Kamski is. At this point, sympathy from humans is based in empathy and compassion, and an unwillingness to take the risk of hurting beings that might be alive. But those people are also just required to take androids' word for it that what they feel is real, with no way to truly know themselves if that's even possible, and Kamski's assurance that he believes they're truly sentient could hold a lot of weight.]
[ Kamski, for his own part, is happy to give that praise. He thinks Connor deserves it - they all do. Kamski's pride in his creations isn't always so far divorced from a parental kind; it's just overlaid with all the fuzzy borders that the issue brings up. That he was more directly involved in the evolution of androids, from mediocre attempts at AI to Chloe to the rest of them.
That his 'children' had been ones he'd agreed to have bought and sold, that he'd built himself a small empire with the profits from selling them into mindless service.
But it's not mindless anymore, and it's not profitable - hasn't been for years, after all. Kamski parted ways with CyberLife quite some time ago. ]
An official request for help, from the ex-deviant hunter himself. How could I not be moved?
I don't expect you to believe me until you see it, of course. Fortunately the media is always salivating for quotes from me.
I suspect you'll be able to watch my commentary on the six o'clock news.
[Whatever Connor might've been expecting in response, it isn't that, even if it's a good kind of surprise. This is good. This could really help them.
But he can't help but feel like there's some sort of catch, and he should ask about it, but if there is... Surely Kamski will bring it up on his own. So for now, he'll take what he's got and not question it.]
Thank you. This will really help us.
[Or at least he hopes it will, but it's better to act certain of it.]
Is that your own natural optimism speaking, Connor, or are you just putting on a brave face while you're in front of someone you don't really consider an ally?
[ Kamski's just calling it as he sees it, Connor. ]
[He feels like arguing about the idea of being naturally optimistic, but he had been hoping that he might get out of this conversation without questioning now that he's gotten what he wants, so maybe he really is naturally optimistic. The little 'unlikely events' speech to Markus probably supports that too.
Either way, he'd rather go with that explanation than admit that Kamski's indeed seen through the fake confidence, so--]
I wouldn't have asked for your support if I didn't believe it would be meaningful.
[And that part's very true, he just hadn't expected Kamski to agree.]