This "generative AI" thing is going to get interesting. If you've seen any of those "day in the life of [a woman] in tech" videos it's pretty clear that these jobs could be done by literal monkeys. These AI will be better at marketing type jobs than people currently working them. Same go for journalists. Here's the thing, these AI are going to replace the high paying jobs that the typical progressive yearns for. Learning to code won't even save them then. What then? It will be funny to see these guys advocating for legal protections for their jobs after decades of supporting the offshoring or outright destruction of blue collar work.
As someone who codes for a living and has played with GitHub Copilot a lot, I can say AI is not replacing coding jobs any time soon. Right now it's nothing more than glorified auto-suggestions, and for it to write usable code for an actual piece of software, it would need actual sentience. Basically, by the point AI could replace programmers, it'll be able to replace everything.
Not only that, but about 90% of programming is defining requirements. You still have to tell the computer/AI what you want it to do, even if the AI is writing the code, and you have to give it instructions detailed enough that it can actually accomplish the task you want.
Even if AI writes the code, producing those detailed instructions won't be significantly different from what programming is today. You'll just be shielded from (some of) the internal workings, like memory management.
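To make that concrete, here's a rough sketch in Python. Everything in it (the requirement, the function name, the field names) is invented for illustration, not taken from any real tool; the point is that the requirement sounds like one sentence, but the edge-case decisions you'd have to spell out for the AI are exactly what programmers already spend their time on.

# Hypothetical requirement handed to a code generator:
#   "Return each customer's most recent order."
# The generator can't decide these for you; you still have to:
#   - break timestamp ties (here: keep the higher order_id)
#   - handle malformed timestamps (here: skip that order, don't crash)
#   - decide whether customers with no valid orders appear at all (here: no)
from datetime import datetime

def most_recent_orders(orders):
    """orders: list of dicts with 'customer', 'order_id', and an ISO 8601 'timestamp'."""
    latest = {}
    for order in orders:
        try:
            ts = datetime.fromisoformat(order["timestamp"])
        except ValueError:
            continue  # malformed timestamp: skip this order, per the spec above
        key = order["customer"]
        current = latest.get(key)
        # timestamp ties are broken by the higher order_id, per the spec above
        if current is None or (ts, order["order_id"]) > current[0]:
            latest[key] = ((ts, order["order_id"]), order)
    return {customer: order for customer, (_, order) in latest.items()}

The function body is almost incidental; the comment block spelling out the rules is the actual work, and that part doesn't go away just because a model types the code.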
The problem with that is that you're going to end up with shit code. Just look at the output of any markup or code generation tool: it does the job, but it's always an unreadable mess. Of course, hardware has gotten so fast, and people have gotten so used to shit software by now, that maybe it won't ultimately matter.
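To give a flavor of what I mean (this is a made-up comparison, not the output of any specific tool), here are two Python functions that do the same thing, squaring the numbers in a list and keeping the even results:

# The style typical of machine-generated code: throwaway names,
# redundant temporaries, and a hand-rolled loop for everything
def fn_0042(p_list_input_1):
    l_result_out = []
    i_idx_counter = 0
    while i_idx_counter < len(p_list_input_1):
        v_tmp_val_1 = p_list_input_1[i_idx_counter]
        v_tmp_val_2 = v_tmp_val_1 * v_tmp_val_1
        if v_tmp_val_2 % 2 == 0:
            l_result_out.append(v_tmp_val_2)
        i_idx_counter = i_idx_counter + 1
    return l_result_out

# The same job, written by hand
def even_squares(numbers):
    return [n * n for n in numbers if (n * n) % 2 == 0]

Both return the same result; only one of them is something you'd want to maintain.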
I was afraid that including that bit about learning to code would distract from my (poor) attempt at pointing out that AI will probably soon be better at some jobs in tech/writing/art than people are. I think the issue is using the term AI to begin with; it's too broad, and people always jump to human-level capability (and, more importantly, awareness). The technologies discussed in OP's link are not intelligent; others in the thread are more correct in calling it machine learning. That having been said, we all assume the internet is filled with bots, so the idea of bots that are more convincingly human and can produce marketing work or a news article of sufficient quality to replace human workers hardly seems far-fetched at this point.
You don't have to replace everyone for the technology to apocalyptically upend an industry. Probably for the best.
This "generative AI" thing is going to get interesting. If you've seen any of those "day in the life of [a woman] in tech" videos it's pretty clear that these jobs could be done by literal monkeys. These AI will be better at marketing type jobs than people currently working them. Same go for journalists. Here's the thing, these AI are going to replace the high paying jobs that the typical progressive yearns for. Learning to code won't even save them then. What then? It will be funny to see these guys advocating for legal protections for their jobs after decades of supporting the offshoring or outright destruction of blue collar work.
As someone who codes for a living and has played with github copilot a lot, AI is not replacing coding jobs any time soon. It is nothing more than glorified auto suggestions right now, and for it to be able to write usuable code to create an actual piece of software, it wpuld need actual sentience. Basically at the point where AI could replace programmers, it'll be able to replace everything.
Not only that, but about 90% of programming is defining requirements. You still have to tell the computer/AI what you want it to do, even if the AI is writing the code. And you have to give the computer detailed enough instructions that it will be able to accomplish the task you want.
Even if AI writes the code, generating those detailed instructions isn't going to be significantly different than what programming is today. You'll just be shielded from (some of) the internal workings like memory management.
The problem with that is, you're going to end up with shit code. Just look at the output of any markup or code generation tool; they do the job, but it's always an unreadable mess. Of course, hardware has gotten so fast and people have gotten so used to shit software by now that maybe it won't even ultimately matter.
I was afraid including that bit about learning to code would distract from my (poor) attempt at pointing out that AI will probably soon be better at some of the jobs in tech/writing/art than people are. I think the issue is using the term AI to begin with. It's too broad and people always jump to human level capability (and more importantly awareness). The technologies in this discussed in OP's link are not Inteligent; others in the thread are more correct in calling it machine learning. That having been said, we all assume the internet is filled with bots, so the idea of bots that can be more convincingly human and produce marketing work or a news article of sufficient quality to replace human workers hardly seems farfetched at this point.
You don't have to replace everyone for the technology to apocalyptically upend and industry. Probably for the best.