I started working as a data scientist in 2019, and by 2021 I had realized that while the field was large, it was also largely fraudulent. Most of the leaders that I was working with clearly had not gotten as far as reading about it for thirty minutes despite insisting that things like, I dunno, the next five years of a ten thousand person non-tech organization should be entirely AI focused. The number of companies launching AI initiatives far outstripped the number of actual use cases. Most of the market was simply grifters and incompetents (sometimes both!) leveraging the hype to inflate their headcount so they could get promoted, or be seen as thought leaders.
And then some absolute son of a bitch created ChatGPT, and now look at us. Look at us, resplendent in our pauper's robes, stitched from corpulent greed and breathless credulity, spending half of the planet's engineering efforts to add chatbot support to every application under the sun when half of the industry hasn't worked out how to test database backups regularly. This is why I have to visit untold violence upon the next moron to propose that AI is the future of the business - not because this is impossible in principle, but because they are now indistinguishable from a hundred million willful fucking idiots.
I like this guy. :D
When I read it, my reaction was: I have no idea. AI seems promising but very early. Hype happens around every new technology, and v1 often has little utility. The market is there to sort out the winners and losers, to the extent that it works.
The analogy to natural language software might be a good one. People take it for granted today that a computer can basically write down what you're saying. The first versions of that sucked, but people acted like it was going to be your primary way of talking to the computer. In the end, like I said, we use it on our Fire TVs, and dictation is quite useful for certain professions; the job of typist has gone out of style. It hasn't replaced typing, but it is an immensely useful tool now, whereas the first versions were only really usable for repeatedly filling out forms and charts. Oh, and then of course the other direction works so well that we have deepfake audio.
I've seen people try to build code with AI as if it were natural language software. Because the AI still doesn't understand what it's actually supposed to do, they've ended up spending about as much time fixing the result as they would have spent just building it correctly in the first place.
Yeah, that's my experience with the chatbots. They just regurgitate the input; what they know how to do is rephrase things. I haven't seen one solve an actual problem.
That said, a robot that just does what you tell it to is good. I dunno how intelligent I need these things to be. Chatbots are not really useful to me. But I'm just saying AI applied to, I dunno, cleaning up my kitchen is good. The robot can clean it according to the instructions on YouTube.
Robots are the definition of a "Capital Investment":
"Here is the procedure. Do it forever."
"OK"
If there is any variance anywhere in that situation, the robot will produce poor results.
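To make that concrete, here's a toy sketch (all the names and steps are made up, not from anything above) of what "here is the procedure, do it forever" looks like: the procedure is fixed, so anything the environment throws at it that isn't in the procedure is simply never handled.

```python
# Toy sketch (hypothetical names): the robot as a capital investment.
# The procedure is fixed and repeated as-is; anything outside it is variance
# the robot has no concept of, so it just gets ignored.

FIXED_PROCEDURE = ("wipe counters", "load dishwasher", "sweep floor")

def clean_kitchen(actual_mess: set[str]) -> None:
    for step in FIXED_PROCEDURE:
        print(f"robot: {step}")           # executed whether it's needed or not
    for surprise in actual_mess - set(FIXED_PROCEDURE):
        print(f"unhandled: {surprise}")   # variance the procedure never covered

clean_kitchen({"wipe counters", "grease spill on the stove"})
# robot: wipe counters
# robot: load dishwasher
# robot: sweep floor
# unhandled: grease spill on the stove
```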