Is it just me, or does ChatGPT argue in a very recognizable style? This kind of obfuscating talk irritates me nowadays: it denies, muddles the discussion, and in the end makes a hairsplitting argument. I hope someone builds an alternative to ChatGPT. I used to have a lot of tolerance, but after reading this kind of argument for years, I've lost much of my patience.
They will be doing this a lot more often now; we're going to see big tech spaces turn into carefully choreographed comment zones with only verified users commenting. I think this is the real reason they're coming up with any excuse to ban the human beings who don't toe the corporate line. It's not just about advertising, though that is definitely an immediate part of it: they want AI to be able to learn from their websites so it can be convincing to normies.
This will make it even more 'insane' than it already is. They would need to keep scrubbing and retraining it over and over as new terms and meanings fall out of and into vogue, even within their own dogma. And I'm sad to say that too many normies, at least "tech normies," seem to think current AI (or anything feasibly built from it) is the god within the machine.