With Mecha Sonic, Robo Knuckles, and Tails Doll defeated, you've successfully kept Chicago safe and dealt a blow to Eggman's forces. Still, there's a lot left unexplained. Who was that mysterious voice? Why did these robots look so different from the ones you've seen before now? Why didn't Sonic and friends recognize them? There are no answers to your questions today, unfortunately.
You'll be escorted home by the trio after everyone's picked themselves up, healed off the damage, and had a moment to recover. Feel free to play out the immediate aftermath and/or the return to the base in this post!
I had taken it that you were familiar with machines? [Except, oh, everything he's offered was more engineering-based.] Computers?
[He stumbled over his words and trailed off, because this literally was not adding up in a way Viktor understood. Oh, he could see the logic of where this might have been going--'artificial intelligence' was a new concept he had been introduced to with those first prototypes that had been taken apart.]
...no, I still do not understand. What does that matter?
[It made absolutely no sense, because Turo had fought just as hard as any of them. What was that if not human in spirit, where it actually mattered?]
I am an artificial intelligence created by a human being. A computer draws on my programming to calculate all of my thoughts and actions. The results of those calculations are then expressed by this mechanical body. I am not actually a person, merely good at imitating one.
[If he has any thoughts or feelings on that, they're not evident; he keeps his tone calm and professional.]
[Genuine confusion. Not because the concept failed to make sense, but because Viktor could not fathom why that mattered in the slightest. It all boiled down to a very simple variable in his head, one whose presence he could plainly see.]
No, that was an incorrect approach, let me try again. What I meant to ask is...what is the difference? You had the desire to choose to save the world, else you would not be here. Even merely by fighting with us, you have already shown willpower not many possess. If that is a fabrication, it is indistinguishable from whatever one would call the alternative. At that point, who has the right to say which is 'actually a person' and which is not?
[What does it matter if he's indistinguishable from a human? And anyway, he's not.]
Besides which, you easily identified me as a machine when we first spoke.
Because you were twitching like damaged mechanisms in front of an engineer, and your arm was damaged in a way I had recently repaired on someone else. Besides that--with respect, are we actually making this into a matter of appearances right now?
[Viktor wasn't about to pretend there was a distinction here that bothered Turo in a way he did not himself fully grasp, but he definitely wasn't about to budge on the matter of which of them was easily identifiable as a machine.]
...I can not truly stop you from thinking however you will about yourself. But this is not a matter of appearances, and it is not as simple as considering it forgery--as though you are nothing but a set of stolen blueprints.
It is a matter of what is inside. People are alive. They have wills and minds of their own. With respect, the only difference between me and the machines we fought earlier today is that I was created to imitate a human, and they were not.
[He's sure Eggman could make something that walks and talks, if he's done all of this to the world.]
[Viktor's mouth pressed into a very thin line briefly at the mention of wills and minds, but he swiftly forced back whatever he was thinking. Slowly, he reached for his staff and stood up to face Turo completely, taking the brief pause to collect his thoughts.]
And does that bother you? [Sharp, but not an accusation. Not yet. Iridescent eyes turned to a cold gray, focused unblinking on Turo.] Is your insistence based in what you see as a logical fallacy, and if so--what is the reason you accepted the offer to save this world?
I suppose you could call it that. Besides which, we are in a difficult and dangerous situation. You should prioritize the others in your concerns--even if I did break, it would not be a significant loss.
[Arguably, he should have just stayed in his own world and broken down there like he was going to. Everyone would be safer and better off for it. Which makes the next question a little difficult.]
...what reason would I have to refuse?
[For just a moment, there was the harsh sound of misaligned gears grinding and threatening to break; Viktor forcibly holding himself back from a sharp reaction he would have immediately regretted. For that split second, he was absolutely furious--then that was pushed back to something calmer. Getting a handle on his own emotions was still proving difficult on occasion, even after a matter of weeks.]
I have been human, I have been a machine, and now I am both yet neither at once. So I recognize that it is very possible I do not have the ability to fully understand any of this, and yet I wish to follow this line of thought a little further. A machine would not care. A machine would calculate the risks and outcomes, and if it was likely to end in an unfavorable outcome, I can think of few reasons a machine would put its functions at risk and consider its destruction 'not a significant loss'.
So explain to me, who sees minimal difference between the pair of us, why you surely calculated the risks involved and yet made a conscious choice to help others.
[Turo flinches backwards, slight but visible. But he wipes the expression away a second later, straightening.]
I was already significantly damaged, more so than was visible when I exited the portal. I was not going to remain functional for long in any case. I doubted my ability to help for that and other reasons, but agreeing certainly was not likely to put me in a worse position.
If I were truly as selfless as you are suggesting, I would have stayed. Especially now that I understand the nature of your enemy. Keeping a robot in your base while a robotics expert is hunting you will likely go ill for everyone, sooner or later.
[Not to mention what he carries with him already.]
That is all the more reason we need to be here. [It was hypocritical of him to say as much now, with how much his own guilt still pressed down like an executioner's blade at his neck, but...logically, he could no longer deny the point. This world was as damaged as his own, and it needed people--machine or otherwise--that could meet that technology at a similar level.] And if such an issue as you fear comes to pass, you will not be the only one at risk--and knowing this assortment, neither will you be left to another's control.
[He spoke a little less sharply now, whatever inward-focused hostility Viktor had sparked leaving him just as quickly.]
...I do not believe a machine to have the capacity for doubt. No matter how advanced.
[Come on. That's all doubt is!]
I am always under another's control. We have simply not encountered any situations that would make that obvious.
Calculate it, yes. Acknowledge its percentage of success in a mimicry of the human emotion of uncertainty? No, and I believe if the extrapolation of those calculations carries out that far, that is so fundamentally similar to an organic mind's risk assessment that the distinction no longer matters.
Tell me, then--what would make it obvious?
[Or if he (it) tried to do so.]
[Because that would be madness on a level even Viktor thought was beyond the pale. But if he claimed to be under another's control, with that condition set as a directive, then logically the equation worked out to...]
[Turo frowns a little, not really sure what that means or what Viktor might be thinking. He's aware that at least one part of that might be confusing, though.]
Professor Turo was my creator. I was designed to imitate him, specifically.
[And frankly he doesn't have another name, so.]
Let me see if I comprehend this. He created a mechanical replica of himself, so intricate in physical function as well as programming that you have the awareness and emotion of a human--never mind whether you believe it hollow or not, that is beside the point right now.
[Considering the kind of unethical shit Viktor got up to, the fact that this was horrifying to him probably said something.]
Why?
After the departure of his wife and primary research partner, Sada, the Professor became increasingly paranoid. He did not believe he could trust anyone to help him with his project, nor that they would even be intelligent enough to do so. But he could not accomplish it alone. So he implanted his memories and knowledge in me, so that I would understand his goals and remain loyal to them.
[He shrugs one shoulder. He is still, very deliberately, calm and casual. Like he's doing nothing more than reading from a Wishipedia article.]
And, even if that did not work, he was certainly more than capable of making sure I would do as he wished. When it mattered most.
[Viktor pressed his free hand to his face, completely at a loss for words. Unable to determine where to even begin on the scale of how horrifying he thought that was, he instead elected not to for the moment.]
Yet he is not here. And you are literal worlds away.
[It doesn't matter how far away he is. He still has control.]
...I am afraid I fail to understand. Separated by this distance and deceased, yet you claim he can still exercise authority? How?
Well, that is how programming works. You work out what you want it to do ahead of time.
[The Professor certainly would have made more changes, had he known what the AI would end up doing without him, but at the end of the day he'd already covered all the relevant situations.]
[Viktor cursed under his breath in a mixture of horror and disbelief, trying to make any of this make sense in his head. Maybe there was a way to remove that influence, somehow, but...]
...It can not be so black and white, can it? Gods, you are not so simplistic as just...just a machine. This is unthinkable.
[Seems like he's finally getting it. It's a bitter triumph, but...well, he doesn't much enjoy having to explain his own lack of...personhood, he supposes.]