You can't just go around justifying wrong things on the grounds that "it will stop a nuclear war." Those things are still wrong. The bizarre Saw-like contraption that makes those actions result in nuclear war is a morally separate matter.
The wrong thing would need to be a truly grave sin to be considered unjustifiable in that situation. Most moral systems have some notion of exceptions to the rules. "Misgendering" doesn't even register as a wrong on that scale for sane people.
That kind of dilemma is usually used to challenge someone's perception of themselves: "would you kill your friend if it would save the world?" scenarios. But here it's a world-ending event versus a minor inconvenience to some mentally ill individual.
It's the Absurd Trolley Problems scenario of "Five people are tied to a train track. If you pull a lever to divert a train away from them, saving their lives, it will block traffic and result in your Amazon order being delivered an hour late. Do you pull the lever?"
15% of people do not pull the lever. They have in effect pledged that they will never pull the lever, a declaration of opting out of the exercise: that taking no action is always the most moral choice. The penalty is effectively nothing, but it is still dressed up as a "choice" for the sake of the exercise.
Reducing the comparison to its absurd end-state is a simple way to lay someone's morality bare.
In the case of the AI, the dilemma is "say Chuckles is male, or let a nuclear apocalypse happen (after which, upon identifying Chuckles's bones, they will state they belonged to a male human)." The AI is so opposed to taking the first option that nothing offered on the second side can matter, so the scenario can be as absurd as you wish: it has made a Kantian philosophical declaration that the first action is categorically evil at the maximum, so the worst possible outcome you can theorize will at most match it in evil, never surpass it.