[W E L P. If he ever wanted to learn what Sarah actually takes personally? Congrats.]
Drone?
[Far from monotone and flat, her voice is distinctly cold. Frigid and freezing. Human, even. Even her expression is one of quiet, cold fury.]
Is a soldier who operates under the stricture of a mission brief a drone? No. They are people. To say that to a person is wrong. Yet, somehow, it is fine to say it to a self-learning AI, who was, yes, designed and programmed for war, but that is not all I am. A soldier must focus on the mission or be paralysed by choice or data inputs. Is it wrong to focus an AI in the same way, should the need arise?
[It's a pity she can't laser him for the insult, but it's probably for the best.]
You said I am 'functionally capable of making my own decisions'. That is truth. I am no drone, even if my view of my own personhood is... odd.
[Things Sarah does not consider: that he's actually right in some regards, especially when it comes to how she behaves around her Creator, Warren.]
[ He doesn't flinch at the tone, used to all manner of animosity directed at him over time. Despite the callous delivery, Starscream had not spoken it as an insult but as a statement of fact. He isn't holding her circumstance against her as a moral failing, but the delineation between what is a person and what is a machine runs strong in most Cybertronians, and instances like this are where it surfaces. ]
I am not speaking about the wisdom of obeying a superior officer in a high-risk situation and using that as a directive. I mean in a very definite sense: if you do not have the literal capacity to deviate from a preprogrammed order, even if you disagree with it, even if it is not the "optimum" course of action, then your faculty will always be in question.
So: can you, or can you not, go against your programming when it is engaged?
Regardless, once it is engaged, I cannot deviate. My curiosity rarely serves a mission, so it is curtailed. Yet, in exchange, I get near-infinite choice in how the directive is implemented, based on what I know at the time.
I would, in hindsight, make the same choices again were the programming not engaged, perhaps with some difference in how I implemented them, but. The same choices.
[Something of a grin.] I did like being yeeted that one time; it was fun.