I'm not making anything up, this is what you said:
are incapable of making art without stealing from online sources (this is why they need to stay connected to the internet)
Please enlighten me on what this is supposed to mean, if not you saying it "steals" images from the internet when you make a prompt. If you were really just saying it has to be trained on images, then 1. that's obvious because every type of AI has to be trained so why even mention it, and 2. what does "stay connected to the internet" mean if it can be run offline?
I explained this already: the machine learning algorithm has been caught lifting material from specific artists, and people noticed it depends on the keywords the user inputs to give the art context. It doesn't always work, I find; I've experimented with it myself, so I do know what I'm writing about.
If the programmers who created this 'offline' AI were being ethical, they'll have trained their generation models on images from some online source that isn't copyrighted or owned by an artist. I wouldn't be surprised, though, if they just nabbed the sources online and then gave you a bunch of models to download so you can carry on the generation process offline.
What happens with image generation especially is that, depending on what sort of keywords you're using, the algorithm will eventually run out of different images to give you once enough images have been generated, and it quickly loses its "zomg thinking AI" mystique.
I've seen this happen frequently with text examples. Since programming is an extremely niche subject, it's very easy to bugger up the algorithm and make it spew gibberish, because it's searching online sources for a correct answer to a programming problem.
There was an experiment on a Godot forum where some muppet tried hooking ChatGPT up to it. Sometimes it couldn't generate any code at all, or it would copy-paste irrelevant posts as answers to the question. It was an absolute disaster because it was confusing noobs who were genuinely trying to find things out.
TLDR: You downloaded the models for the offline machine learning and plugged them in, and those models were almost certainly built from online sources. Yes, it isn't connected to the internet, but it's likely using data that was grabbed from online sources at a specific point in time; that's how machine learning works, because it needs previous material to train itself on. The model then carries on generating offline, but every generation draws on all of that previously collected data.
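The TLDR above can be illustrated with a toy sketch (nothing to do with any real image model; the corpus, filename, and function names here are all made up): a "model" is built once from a fixed snapshot of data, written to disk, and from then on every generation draws only on that frozen snapshot, with no network access at all.

```python
import json
import random

# "Training": build a word-level bigram table from a fixed snapshot of text.
# In a real system this snapshot would be the scraped dataset; here it is a
# made-up corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()
table = {}
for prev, nxt in zip(corpus, corpus[1:]):
    table.setdefault(prev, []).append(nxt)

# Save the "model" to disk, much like downloading pretrained weights.
with open("model.json", "w") as f:
    json.dump(table, f)

# Later, fully offline: reload and generate. Nothing new is learned here;
# every output word already existed in the frozen snapshot.
with open("model.json") as f:
    model = json.load(f)

def generate(start, length, seed=0):
    """Walk the bigram table from `start`, picking stored successors."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return " ".join(out)

print(generate("the", 5))
```

The point of the sketch: the generation step never touches the internet, but its entire vocabulary is whatever was grabbed at training time.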
By the way, I confirmed this with other programmers as well: image generation and these chatbots really aren't that impressive.
I probably misinterpreted your original post, but it was written exactly like how many anti-AI lefties make their arguments. For example, here again you said
because it's searching online sources for a correct answer
Even if you don't literally mean "the AI program is using Google when you enter a prompt", anti-AI people on social media genuinely do think that's how it works.
But I don't see how AI is "stealing" anything. Not any more than any human artist would be "stealing" by taking inspiration from other people's art, which is what every single artist does.
I do agree with you that people overhype it a lot. I've seen way too many people using ChatGPT as a source, as if it's a fount of knowledge and not just an advanced Markov chain.
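The "advanced Markov chain" comparison can be made concrete with a minimal sketch (a real LLM is vastly more sophisticated, but the statistical flavour is similar): count which word follows which in some text, then read next-word probabilities straight off the counts. The sample text here is made up.

```python
from collections import Counter, defaultdict

# Count word -> next-word transitions in a tiny made-up sample.
text = "i like tea i like coffee i drink tea".split()
transitions = defaultdict(Counter)
for prev, nxt in zip(text, text[1:]):
    transitions[prev][nxt] += 1

def next_word_probs(word):
    """Probability of each word following `word`, from raw counts."""
    counts = transitions[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# After "i", the chain has seen "like" twice and "drink" once,
# so "like" comes out around 0.67 and "drink" around 0.33.
print(next_word_probs("i"))
```

A chain like this only ever reproduces statistics of its input text; whether that deserves the word "thinking" is exactly the disagreement in this thread.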
An AI doesn't think. This is something people hugely misunderstand about machine learning algorithms and how they work: it is not sentient AI. I partly blame the marketers, and the people trying to explain how AI works, because as it turns out you have to be extremely specific for people to get it right.
This is precisely why I say the AI is stealing; let me put it in terms of the human equivalent of what the AI is doing. A human being takes reference images and, as you rightly point out, takes inspiration from things they see around them or from other artists, to create something original. That is not what the current machine learning we know does: it does the human equivalent of tracing over an already existing image, editing it, and then claiming it's an original piece.
Even if you can make the legal argument that it's not 'theft', it's almost certainly plagiarism, and that's what many artists and programmers have noticed when tinkering with this stuff in detail. Mind you, I shouldn't be stopping this, I should be accelerating it, because then I'll be the only vaguely competent competition around while everybody else endlessly generates AI trash for their work.