KotakuInAction2
Reason: None provided.

> Remember, the computer has literally zero reference to reality. If you tell it enough times that 2+2=5, it's going to accept that as part of its training. If that bad lesson is taught into the system, and you carry this AI around to solve problems, it's going to have not only logic errors, but what would be a kind of unconscious logic error if it were a person.

Hold on, hold on. Not too long ago you were telling us all how AI was "based" because computers were some kind of logical oracle that "understand that the data can not be wrong", even defending your ludicrous comments when I pointed out that computers and AI just follow their programming and could easily be wrong when programmed badly (yes, data can be wrong; for example, a datum stating that "2+2=5" is itself wrong, despite your bizarre insistence that data is some kind of magical substance of truth).

Now suddenly AI isn't "based" (in reality), but rather has "zero reference to reality" and is going to carry "bad lesson(s) taught into the system" and make "logic errors", exactly as I was pointing out.

You really are so good at writing long-winded comments that sound intelligent, but actually you are just full of shit, much like ChatGPT.
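
The narrow technical point both comments circle around is that a model with no independent reference to reality will reproduce whatever its training data says, including "2+2=5". Here is a minimal sketch of that idea, purely as illustration: the toy addition dataset, the repeated poisoned example, and the choice of a scikit-learn nearest-neighbour regressor are all assumptions of this sketch, not anything taken from the thread.

```python
# Toy illustration (not from the thread): a model with no reference to
# arithmetic "reality" just echoes its training data. The dataset, the
# poisoned label, and the nearest-neighbour model are all invented for
# this sketch; it only assumes numpy and scikit-learn are installed.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Honest addition examples for every pair (a, b) with a, b in 0..4, except (2, 2)...
X = np.array([[a, b] for a in range(5) for b in range(5)
              if (a, b) != (2, 2)], dtype=float)
y = X.sum(axis=1)

# ...plus the "bad lesson", repeated many times: 2 + 2 = 5
X = np.vstack([X, np.tile([2.0, 2.0], (50, 1))])
y = np.concatenate([y, np.full(50, 5.0)])

# A purely memorizing model: it has no concept of addition,
# it only reproduces the nearest example it was trained on.
model = KNeighborsRegressor(n_neighbors=1).fit(X, y)

print(model.predict([[2.0, 2.0]]))  # -> [5.], the bad lesson, not arithmetic
print(model.predict([[3.0, 1.0]]))  # -> [4.], right only because the data was right
```

Nothing in a model like this checks the data against reality; it can only repeat what it was shown, which is the "bad lesson taught into the system" the quoted comment describes.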

335 days ago
1 score
Reason: None provided.

> Remember, the computer has literally zero reference to reality. If you tell it enough times that 2+2=5, it's going to accept that as part of its training. If that bad lesson is taught into the system, and you carry this AI around to solve problems, it's going to have not only logic errors, but what would be a kind of unconscious logic error if it were a person.

Hold on, hold on. Not too long ago you were telling us all how AI was "based" because computers were some kind of logical oracle that "understand that the data can not be wrong", even defending your ludicrous comments when I pointed out that computers and AI just follow their programming and could easily be wrong when programmed badly (yes, data can be wrong; for example, a data point stating that "2+2=5" is itself wrong, despite your bizarre insistence that data is some kind of magical substance of truth).

Now suddenly AI isn't "based" (in reality), but rather has "zero reference to reality" and is going to carry "bad lesson(s) taught into the system" and make "logic errors", exactly as I was pointing out.

You really are so good at writing long-winded comments that sound intelligent, but actually you are just full of shit, much like ChatGPT.

335 days ago
1 score
Reason: None provided.

> Remember, the computer has literally zero reference to reality. If you tell it enough times that 2+2=5, it's going to accept that as part of its training. If that bad lesson is taught into the system, and you carry this AI around to solve problems, it's going to have not only logic errors, but what would be a kind of unconscious logic error if it were a person.

Hold on, hold on. Not too long ago you were telling us all how AI was "based" because computers were some kind of logical oracle that "understand that the data can not be wrong", even defending your ludicrous comments when I pointed out that computers and AI just follow their programming and could easily be wrong when programmed badly (yes, data can be wrong, for example "2+2=5", despite your bizarre insistence otherwise).

Now suddenly AI isn't "based" (in reality), but rather has "zero reference to reality" and is going to carry "bad lesson(s) taught into the system" and make "logic errors", exactly as I was pointing out.

You really are so good at writing long-winded comments that sound intelligent, but actually you are just full of shit, much like ChatGPT.

335 days ago
1 score
Reason: None provided.

> Remember, the computer has literally zero reference to reality. If you tell it enough times that 2+2=5, it's going to accept that as part of its training. If that bad lesson is taught into the system, and you carry this AI around to solve problems, it's going to have not only logic errors, but what would be a kind of unconscious logic error if it were a person.

Hold on, hold on. Not too long ago you were telling us all how AI was "based" because computers were some kind of logical oracle that "understand that the data can not be wrong", even defending your ludicrous comments when I pointed out that computers and AI just follow their programming and could easily be wrong when programmed badly (yes, data can be wrong, despite your bizarre insistence otherwise).

Now suddenly AI isn't "based" (in reality), but rather has "zero reference to reality" and is going to carry "bad lesson(s) taught into the system" and make "logic errors", exactly as I was pointing out.

You really are so good at writing long-winded comments that sound intelligent, but actually you are just full of shit.

335 days ago
1 score
Reason: Original

> Remember, the computer has literally zero reference to reality. If you tell it enough times that 2+2=5, it's going to accept that as part of its training. If that bad lesson is taught into the system, and you carry this AI around to solve problems, it's going to have not only logic errors, but what would be a kind of unconscious logic error if it were a person.

Hold on, hold on. Not too long ago you were telling us all how AI was "based" because computers were some kind of logical oracle that "understand that the data can not be wrong", even defending your ludicrous comments when I pointed out that computers and AI just follow their programming and could easily be wrong when programmed badly (yes, data can be wrong, despite your bizarre insistence otherwise).

Now suddenly AI is going to carry "bad lesson(s) taught into the system" and make "logic errors", exactly as I was pointing out.

You really are so good at writing long-winded comments that sound intelligent, but actually you are just full of shit.

335 days ago
1 score