People can't even look at the files to determine what sources have influenced the work. The files are utterly unreadable by humans without profound, transformative interpretation by tools.
Profound, transformative interpretation? All my AI outputs are obviously direct copies of some artistic input. The model will even interpret modifiers differently for one particular character, because human artists draw that character in a certain way.
Also, whatever whistle that OpenAI murder victim was about to blow might have to do with the file sourcing.
You can't look at the coded guts of a Convolutional Neural Network and tell me which artist's work was used to train the AI.
Nor can you look at that code and tell me what image it will produce.
The only thing you can do is wait until the black box spits out art in a style and then guess.
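Here's a minimal sketch of what those "coded guts" actually look like, assuming PyTorch; the tiny CNN below is hypothetical and only stands in for a real image model, which differs mainly in scale. Everything the network stores is tensors of floats; there is no list of source images or artists anywhere in it.

```python
# Minimal sketch, assuming PyTorch. Hypothetical toy CNN, not a real image model.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # takes a 3-channel image
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 10),
)

# The entire "guts": a count of stored numbers, plus the numbers themselves.
total = sum(p.numel() for p in model.parameters())
print(f"parameters stored: {total}")
print(model[0].weight[0, 0])   # one 3x3 filter: a grid of small floats
```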
FYI artistic style is not covered by copyright. Specific drawings or paintings can be covered by copyright, but not a style.
Since the Neural Network code looks nothing like art, you would have to be a drooling smooth-brain to say that the art was not transformed.
As for sources: So what? When you bought that poster, did you sign a contract that said you would not use it to train an NN? What law was broken? Who was harmed?
Just to be clear that I understand the point you are making:
You are saying that transforming a series of images into a very abstract, probability-weighted neural network matrix that cannot be read by humans is, in fact, unaltered, untransformed art?
The process:
A bunch of art ---> Unreadable probability code ---> A new image that is different from the source material
Yet there is no transformation?
Is this a correct interpretation of your argument?
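For what that three-step arrow looks like in practice, here is a hedged toy sketch, assuming PyTorch; the one-layer linear decoder below is a stand-in for a real generative model, not any actual product. A few stand-in "artworks" get fitted into a weight matrix, the only thing kept is that matrix of floats, and decoding a fresh random latent produces an image that is none of the inputs.

```python
# Toy sketch, assuming PyTorch. A linear decoder stands in for a real generator.
import torch
import torch.nn as nn

torch.manual_seed(0)

art = torch.rand(4, 64)                 # "a bunch of art": four 8x8 images, flattened
decoder = nn.Linear(2, 64)              # maps a 2-number latent to an 8x8 image
latents = torch.randn(4, 2, requires_grad=True)
opt = torch.optim.Adam(list(decoder.parameters()) + [latents], lr=0.05)

for _ in range(200):                    # "training": fit weights to reproduce inputs
    opt.zero_grad()
    loss = ((decoder(latents) - art) ** 2).mean()
    loss.backward()
    opt.step()

# What is actually stored: a matrix of floats, not pictures.
print(decoder.weight.shape)             # torch.Size([64, 2])

# "A new image": decode a latent that corresponds to no training input.
new_image = decoder(torch.randn(1, 2)).reshape(8, 8)
print(new_image.shape)                  # torch.Size([8, 8])
```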
> Profound, transformative interpretation? All my AI outputs are obviously direct copies of some artistic input. The model will even interpret modifiers differently for one particular character, because human artists draw that character in a certain way.
> Also, whatever whistle that OpenAI murder victim was about to blow might have to do with the file sourcing.
Whoosh.
> You can't look at the coded guts of a Convolutional Neural Network and tell me which artist's work was used to train the AI.
> Nor can you look at that code and tell me what image it will produce.
> The only thing you can do is wait until the black box spits out art in a style and then guess.
> FYI artistic style is not covered by copyright. Specific drawings or paintings can be covered by copyright, but not a style.
> Since the Neural Network code looks nothing like art, you would have to be a drooling smooth-brain to say that the art was not transformed.
> As for sources: So what? When you bought that poster, did you sign a contract that said you would not use it to train an NN? What law was broken? Who was harmed?
You're arguing that I can't prove the exact inputs the AI used, and that the output meets the legal definition of transformation.
I'm objecting to the idea that the art is "profoundly" transformed in the artistic sense. We are not talking about the same thing.
> Just to be clear that I understand the point you are making:
> You are saying that transforming a series of images into a very abstract, probability-weighted neural network matrix that cannot be read by humans is, in fact, unaltered, untransformed art?
> The process:
> A bunch of art ---> Unreadable probability code ---> A new image that is different from the source material
> Yet there is no transformation?
> Is this a correct interpretation of your argument?
No. The art is transformed to some arbitrary degree. But profoundly transformed, by most people's understanding of the word "profoundly"? No.