It's an AI generated profile pic. I saw something like this on Gab the other day.
I woke up for work around 3am and saw the top post was some chick saying she got fired from her job and that it was very liberating. About a paragraph long. Her profile had just been made and she had 2 followers, so I assumed she got to the top with bots. I reported her and it got taken down pretty quickly. I don't get the motive behind pushing that post to the top.
From what I learned from https://www.thispersondoesnotexist.com/ look at eye position. Place your cursor on an eye and refresh (hit F5). Head direction, age, etc. don't matter; your cursor will always land on an eye.
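A rough sketch of that "eyes are always in the same place" check, just to make the idea concrete: grab a few images from the site and print where a landmark detector thinks the eyes are. Using the face_recognition library and assuming the site returns a raw JPEG at that URL are my own assumptions, not anything the site documents; swap in any landmark detector or saved images if either differs.

```python
# Sketch: check whether generated faces always have eyes at the same coordinates.
# Assumptions: thispersondoesnotexist.com returns a JPEG, and the
# face_recognition library is installed (pip install face_recognition).
import io
import requests
import numpy as np
import face_recognition

URL = "https://www.thispersondoesnotexist.com/"

for i in range(5):
    img_bytes = requests.get(URL, headers={"User-Agent": "eye-check"}).content
    image = face_recognition.load_image_file(io.BytesIO(img_bytes))
    landmarks = face_recognition.face_landmarks(image)
    if not landmarks:
        print(f"image {i}: no face found")
        continue
    left = np.mean(landmarks[0]["left_eye"], axis=0)    # mean (x, y) of left-eye points
    right = np.mean(landmarks[0]["right_eye"], axis=0)  # mean (x, y) of right-eye points
    print(f"image {i}: left eye ~{left.round()}, right eye ~{right.round()}")

# If the training data was aligned (as with StyleGAN/FFHQ faces), these
# coordinates barely move between refreshes -- that's the tell described above.
```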
Edit (I'm bored): Apart from that... you can't do shit. Yes, some images have artifacts, especially when jewelry or other people are present.
But eventually (took me like 30-40 refreshes; if I used the code directly I could probably just tune the parameters) you end up with one like this. Sure, if we already know it's an AI generated image, we can point out that her hair at the top left looks slightly strange... and that's at this resolution, not a Twitter avatar.
And algorithms probably exist that can detect AI generated images. The whole idea behind generating images like this is a GAN: basically two networks fighting each other, one trying to create a "real"-looking image and one trying to detect the fakes.
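For anyone who hasn't seen one, here's a minimal sketch of that generator-vs-discriminator setup. It's a toy: the "real" data is just a 1-D Gaussian rather than face images, and the network sizes and learning rates are arbitrary choices of mine, but the structure (D learns real vs. fake, G learns to fool D) is the same idea.

```python
# Toy GAN: generator G tries to produce samples that look like the "real"
# data, discriminator D tries to tell real from generated.
import torch
import torch.nn as nn

real_data = lambda n: torch.randn(n, 1) * 1.5 + 4.0   # "real" samples ~ N(4, 1.5)
noise = lambda n: torch.randn(n, 8)                    # latent input for the generator

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # generator
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # discriminator

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(5000):
    # Train D: push real samples toward label 1, generated samples toward 0.
    real = real_data(64)
    fake = G(noise(64)).detach()  # detach so only D updates in this step
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Train G: try to make D output 1 on generated samples.
    fake = G(noise(64))
    loss_g = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

samples = G(noise(1000))
print("generated mean/std:", samples.mean().item(), samples.std().item())  # should approach ~4 / ~1.5
```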
Lesson learned: you cannot (as a human) detect AI images of people.
Also, that site is old by now. You could easily train newer models that don't have the restrictions you mention (like the fixed eye position). And I'm sure with some effort you could create multiple images of the same "person" in different environments, etc.
Some more interesting digging on this "person" in this thread
Something very fishy going on, it's almost time for the feds to put up another false flag.
Oh that's good. Maybe I'm just used to the feds trying to frame us for crimes, so I'm too suspicious.