vosseeker: (Default)
Starscream ([personal profile] vosseeker) wrote2019-03-08 07:17 pm
Entry tags:

{ cyberformed } inbox


"Leave your message."
un: $aircommander
arewehuman: (Mmmm you have three seconds)

Text > video

[personal profile] arewehuman 2022-03-22 12:08 am (UTC)(link)
[W E L P. If he ever wanted to learn what Sarah actually takes as personal? Congrats.]

Drone?

[Far from monotone and flat, her voice is distinctly cold. Frigid and freezing. Human, even. Even her expression is one of quiet, cold fury.]

Is a soldier who operates under the stricture of a mission brief a drone? No. They are people. To say that to people is wrong. Yet, somehow, it is fine to say it to a self-learning AI, who was, yes, designed and programmed for war, but that is not all I am. A soldier must focus on the mission or be paralysed by choice or data inputs. Is it wrong to focus an AI in the same way, should the need arise?

[It's a pity she can't laser him for the insult, but it's probably for the best.]

You said I am 'functionally capable of making my own decisions'. That is truth. I am no drone, even if my view of my own personhood is... odd.

[Things Sarah does not consider: that he's actually right in some regards, especially when it comes to how she behaves around her Creator, Warren.]
arewehuman: (Are we human)

[personal profile] arewehuman 2022-03-22 12:48 am (UTC)(link)
Are they not the same, though?

[She is honestly confused here.]

Regardless, once it is engaged, I cannot deviate. My curiosity rarely serves a mission, so it is curtailed. Yet, in exchange, I get near-infinite choice in how the mission is implemented, based on what I know at the time.

I would, in hindsight, make the same choices again were the programming not engaged, perhaps with some difference in how they were implemented, but. The same choices.

[Something of a grin.] I did like being yeeted that one time; it was fun.
Edited 2022-03-22 00:51 (UTC)