You now confront the basic problem of morality
(media.communities.win)
does a human being, devoid of humanity, have worth, or meaning? Similarly, does a Boltzmann brain, conjured from a quantum infinity yet presenting, at least to its own perceptions, as totally real, have worth, or meaning? I think we’re defined by our connections to other humans. No man is an island.
Should a police officer who abuses his position, and the trust put in him by society, to break the law be punished more or less harshly than a common citizen? Should the legislator who writes his corruption into his nation’s laws, so that he and his tribe of corruption can gorge themselves, be punished more or less harshly than the police officer? I think that’s the point expressed here, not the far less sensible one you propose. Though I suppose I see where you’re coming from, I just think you’re being unduly harsh.
“Basic problem of” != “be-all and end-all, period, end of discussion”. Realistically, I struggle to think of an example of a moral dilemma which doesn’t fit into this “basic” shape. Can you think of one? This piece doesn’t really “suggest” what you claim it does; I think it merely points at the topic and asks us to consider it.
https://m.youtube.com/watch?v=Y0PKG5-t3zU
You'd have to define 'humanity' for me to answer that question properly (unless you did define it there, in which case your definition is circular), but I'd suggest that there are times when a human can by their actions negate their inherent worth through harm to another, without losing that worth.
Yes, but that's not what the post states (nor the question it addresses). If they weren't more concerned with using flowery language than being logically and semantically correct, they could have properly stated the entire causal chain, but they skipped a couple of steps (most importantly, that the existence of a prescribed punishment will inevitably result in the application of that punishment), and as a result, their conclusion falls apart.
Incentives have nothing to do with morality or ethics, those lie within the realm of economics, civics, or politics. Take the classic example of a moral dilemma: the trolley problem. The question isn't "How do we convince the actor to flip the switch, resulting in the death of one person instead of some greater number of persons?" Rather, it is "Is it moral for an actor to act in a way that would cause the death of one person in order to save the life of a greater number of persons?" This is the basic shape of all moral dilemmas, not anything to do with incentives.
It absolutely does, though whether by intent or incompetence I cannot say definitively.
https://www.youtube.com/watch?v=M1DcD8e55YY
Humanity as in “every other person in existence”. The meaning should have been obvious as it was used in contrast to the single person and the mention of Boltzmann brains. The question is, does a lone human, floating in the void, devoid of all elements and agents of what might fall under the label “humanity”, have value or worth?
That’s not at all what I’m getting at. Like I said, no man is an island. My point here can probably be summarized as “even Adam, the perfect man, needed Eve”. You want to dismiss this piece because you perceive it as dismissing the value of the individual; I think that’s a misreading.
Quoting here:
The relevant context is literally immediately prior to the sentence you take issue with here…
What conclusion? The one you assert that “the punishment is not proportional to the crime”? Again, this is just you being ungenerous and looking for an “own”. No real “conclusions” are given, beyond a description of the “basic problem of morality”, which I suppose you take issue with in its own right.
You’ve never heard of Game Theory I take it?
Ok, let’s. The incentive in most renditions is “people I {like/dislike} will {die/not die}”; my decision is based on how I want the world, and humanity, to look going forward.
https://m.youtube.com/watch?v=_8BZVpl2dMc&pp=ygUWa290b3IgeW91IGd1dGxlc3Mgc2ltcA%3D%3D
I'm not sure you understand how 'devoid' is typically used in English. "A devoid of B" implies that "B" is a typical property or content of "A" (and thus that "A" and "B" are different in character) such as "A home devoid of furniture" or "An argument devoid of sense." Your phrasing of the question implied that by "humanity" you meant "the intangible (or perhaps physical, if you're a materialist) qualities that makes one 'human'", not "the human race."
He still had worth prior to the creation of Eve. After all, how could it be "not good" for him to be alone if he had no worth? A human, in the total absence of other humans, or even the total absence of the possibility of other humans, still has worth.
I'm not dismissing it, I think it's midwit philosowank and I'm dissecting it and showing why.
That a "structure must punish cheaters with a violence that grows in proportion to its own success." Not only is the logical chain incomplete, but the conclusion can, on its own, be shown to be incorrect. As an example: two men with identical backgrounds decide they're going to steal a dollar from 100 people. One does so in an impoverished country where a dollar is the average monthly income of a working individual. The other does so in a country where the average monthly income is ten thousand dollars. Which should be punished more harshly?
You can quibble with the specifics or adjust the parameters, but ultimately, punishment should be based on the harm caused by (or, if you're not a pure utilitarian, the 'wrongness of') an action, not the success of the 'structure' it occurs within. And don't try to tell me that it's talking about overall punishment and not specific instances. A prosperous structure has far more ability to absorb corruption and remain functional than an impoverished one.
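The two-thieves comparison can be sketched numerically. As a rough illustration only (the metric here, harm as the fraction of a victim's monthly income taken, is one possible choice, not something either commenter specified):

```python
# Harm-proportional view of the two-thieves example: each thief takes
# $1 from 100 people; harm is measured as the fraction of a victim's
# monthly income taken (one possible metric among many).

def total_harm(stolen_per_victim, victims, avg_monthly_income):
    """Total harm = number of victims * (amount stolen / victim income)."""
    return victims * (stolen_per_victim / avg_monthly_income)

# Impoverished country: average income $1/month.
poor = total_harm(1, 100, 1)        # 100.0 -> each victim loses a full month's income
# Prosperous country: average income $10,000/month.
rich = total_harm(1, 100, 10_000)   # 0.01 -> each victim loses 1/10,000 of a month

print(poor, rich)
```

Under this metric the identical act is four orders of magnitude more harmful in the poor country, which is the opposite of what "punishment grows with the structure's success" would predict.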
I have. You clearly don't understand what Game Theory is. At a fundamental level, it has nothing to do with determining moral action, merely obtaining the optimal result from a given set of parameters. It's a field of mathematics closely related to sociology and psychology, and has been used to study ethics and the development of social mores, but outside a purely utilitarian system where every action has a knowable, quantifiable moral value, even its use as a tool is limited.
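The distinction being drawn, that game theory finds the optimal outcome rather than the moral one, shows up even in the textbook prisoner's dilemma. A minimal sketch (standard textbook payoffs, nothing from this thread):

```python
# Minimal prisoner's dilemma: game theory identifies the equilibrium
# outcome, not the morally "right" one. Payoffs are years in prison,
# negated so that higher is better (standard textbook values).

payoffs = {  # (row action, column action) -> (row payoff, column payoff)
    ("cooperate", "cooperate"): (-1, -1),
    ("cooperate", "defect"):    (-3,  0),
    ("defect",    "cooperate"): ( 0, -3),
    ("defect",    "defect"):    (-2, -2),
}

def best_response(opponent_action):
    """Row player's payoff-maximizing reply to a fixed opponent action."""
    return max(("cooperate", "defect"),
               key=lambda a: payoffs[(a, opponent_action)][0])

# Defecting is the best reply whatever the opponent does (a dominant
# strategy), so (defect, defect) is the equilibrium, even though
# mutual cooperation leaves both players better off.
print(best_response("cooperate"))  # defect
print(best_response("defect"))     # defect
```

The machinery ranks outcomes by payoff; whether defecting is *moral* is a question the formalism never touches, which is the commenter's point.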
Those are value judgements made by the actor, not incentives given by a 'structure' he is a part of.
https://www.youtube.com/watch?v=rcx6ILRVA_4
See you just can’t help but be a prick. It’s not my fault you have poor reading/context comprehension and a shallow understanding of philosophical concepts like Boltzmann brains and game theory.
Sure, debatably. Does that same human have more or less potential worth when part of a system of other people? It should be obvious that any intrinsic moral value of a person is dwarfed by the moral value they can achieve in a system of other people (and the moral harm, too).
What kind of worth did he have if he was incapable of comprehending that worth without another person present?
Poor example. Let’s look at the child who took from the cookie jar, the thief who stole a loaf of bread, and Bill Gates seizing control of 50%+ of America’s farmland recently. The child can be talked to or have the jar moved out of reach. The thief can be brought to the stockades and shamed, and Gates can have his hands cut off. Can you follow the progression between “success of the system” and “degree of punishment for overstepping the system”?
Lmao
Except for the fact that he’s standing at the switch making his decision based on who he wants alive more.
https://m.youtube.com/watch?v=8U18EuNN2D0&pp=ygUfRHVrZSBudWtlbSBibG93IGl0IG91dCB5b3VyIGFzcw%3D%3D
Oh no it’s retarded.