This is all very easily made trivial if you just stop considering consciousness as some special property that has to be imparted on things.
If it's just the output from parsing an incredibly complicated series of biological logic gates, then the materialists have nothing more to prove than that the biological system simply exists.
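(As an aside on what "biological logic gates" could mean in practice: the classic toy model is the McCulloch-Pitts neuron, where a unit fires if its weighted inputs cross a threshold, and units wired that way can reproduce ordinary AND/OR/NOT gates. The sketch below is purely illustrative Python, not a claim about how real neurons actually behave.)

    # Minimal sketch of a neuron as a "logic gate": the McCulloch-Pitts model.
    # A unit fires (outputs 1) if the weighted sum of its inputs reaches a
    # threshold; with suitable weights it behaves like AND, OR, or NOT.

    def mp_neuron(inputs, weights, threshold):
        """Fire (return 1) if the weighted input sum reaches the threshold."""
        total = sum(i * w for i, w in zip(inputs, weights))
        return 1 if total >= threshold else 0

    def AND(a, b):
        return mp_neuron([a, b], weights=[1, 1], threshold=2)

    def OR(a, b):
        return mp_neuron([a, b], weights=[1, 1], threshold=1)

    def NOT(a):
        return mp_neuron([a], weights=[-1], threshold=0)

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))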
If that was all it was, we would have created artificial consciousness decades ago.
I should have spent more time setting the stage and gone into the details of the Hard Problem of consciousness. It doesn’t seem like 95% of the people commenting have heard of it, but I assumed most would have at least a vague understanding that modern science doesn’t have the first clue about the roots of consciousness.
If there is no special property of consciousness to impart, then the threshold for something comparable to human consciousness is system complexity comparable to the human brain.
We're still a long way off creating anything on that level ourselves, and a very, very long way off doing it in a way that isn't more or less just copying evolution's homework on neural networks. We've poured billions of dollars and a sizeable portion of top-level human talent into AI and LLM projects recently, and even at their peak they pale in comparison to just the complexity we currently understand about how the brain functions, never mind all the parts we haven't even figured out yet.
So no, we couldn't have created artificial consciousness decades ago, and we may not be able to for many decades hence.
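For a sense of the scale gap, here's a crude back-of-envelope comparison using commonly quoted ballpark figures (roughly 86 billion neurons and on the order of 100 trillion synapses for the brain, ~175 billion parameters for a GPT-3-class model), treating one parameter as comparable to one synapse, which is already very generous to the model:

    # Crude back-of-envelope scale comparison; all figures are commonly quoted
    # ballpark estimates, and "parameters vs synapses" is a very rough proxy.
    llm_parameters = 1.75e11   # ~175 billion parameters (GPT-3-class model)
    brain_neurons  = 8.6e10    # ~86 billion neurons (commonly cited estimate)
    brain_synapses = 1e14      # ~100 trillion synapses (commonly cited estimate)

    print(f"neurons per parameter:  {brain_neurons / llm_parameters:.2f}")
    print(f"synapses per parameter: {brain_synapses / llm_parameters:.0f}")
    # -> roughly half a neuron and ~570 synapses per parameter, before even
    #    considering that a single synapse is far richer than a single float.

Even on that crude metric the gap is a couple of orders of magnitude, which is the point about the brain's complexity above.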
As to the "hard problem", I disagree with its foundational assertion.
Proponents of the hard problem argue that it is categorically different from the easy problems since no mechanistic or behavioral explanation could explain the character of an experience, not even in principle.
Making that your fundamental axiom stems from a misunderstanding of the vast unplumbed depths of the mechanical complexity of the human brain. The stupid urban myth that we only use 10% of our brain has an unexpected kernel of truth to it, in that what we currently understand about how the brain functions could easily turn out to be only 10% of the total system complexity.
Did you listen to any of the arguments made? The materialist paradigm is fundamentally unable to explain consciousness arising in unconscious matter. Even if, after decades of research, materialists totally mapped all the “complex series of biological logic gates” of the brain down to the atom, you would still fundamentally be incapable of explaining the source of the conscious experience. It’s not a “material” thing. It falls outside the wheelhouse of “materialism”. This should be obvious if you understood the total lack of progress in consciousness studies for going on 100 years now.
It's not that I don't understand the arguments, I just understand them well enough to see how incredibly flawed they are.
There is no special consciousness field. It's like the "ether", an invisible universal force crudely conjured by people too impatient to do the harder work of isolating and understanding the hundreds of individually observable mechanisms and their interrelations that lead to the same result.
Even if, after decades of research, materialists totally mapped all the “complex series of biological logic gates” of the brain down to the atom, you would still fundamentally be incapable of explaining the source of the conscious experience.
No, at that point it is entirely feasible that we could point to the exact mechanical process that produces the experience of pain, or fear. You're declaring something impossible when you've not even tried, for a rationale no deeper than "well, duh".
Cancer is an ostensibly far simpler biological problem, and after over a hundred years of far more extensive research than consciousness has ever received, we're still puzzled by as many unknowns as knowns about its mechanisms. That doesn't mean I'm about to declare that molecular biology is a red herring and we should be doing more research into "bad cancer vibes" or some shit. We're not omniscient super beings; just because the comprehensive answer will very likely not be known in my lifetime doesn't mean it's unknowable. If the observable mechanism already exists and just needs calculating on a larger scale, then it's up to you to prove some specific reason it won't work when you scale it up.