Adobe FREAK OUTS! Addresses TOS Backlash Dumpster Fire!
(www.youtube.com)
Comments (22)
TL;DR: Adobe is spinning out corporate PR in which they promise this is intended only for generative AI features and moderation.
Everything in the cloud is for moderation, or so they claim, but hey, nothing improves automated moderation like customer data.
Although they claim all data rights are still owned by the customers... Haha
Also, OP, it would be kind of you to mark the relevant section. Most of these speakers ramble too much; this could be cut down into a much tighter presentation, since the documents they go through aren't that long.
Bullet-point summary of the YouTube video, generated by youtubesummarizer:
- Adobe's new terms of service allow the company to use users' artwork and photos to train its AI
- A prominent YouTuber called out the company on this issue, leading to many customers cancelling their Adobe licenses
- Adobe has responded defensively, with a blog post and a senior executive responding on Twitter
- The move towards online subscriptions has been contentious for some customers who preferred to own the software outright
- Concerns remain about rogue employees accessing users' private information, despite Adobe's claims that it will only access files when legally required to do so
- Similar privacy issues have arisen at other tech companies such as Google and Microsoft
- Some experts predict that users may migrate towards alternative software options or even the Linux operating system in response to such risks.
Didn't reddit claim they had rights to a guy's book because an early draft of it was written and shared on reddit?
This will be used the same way
Gee, sure is interesting why people pirate their software huh?
I'm still confused about Adobe's reasoning for monitoring customer content. If they need to monitor content uploaded to their servers or generated using their generative AI features, then I could give them the benefit of the doubt. But if I'm understanding the terms of service correctly, they're asking for access to all content created with or imported into their software. Aren't their tools supposed to work without an internet connection?
"We may access, view, or listen to your Content"
'"Content" means any text, information'...'images, that you upload, import into, embed for use by, or create using the Services and Software'
No, I think the last version to work offline is the last version to be cracked. CS6, I think it was?
Oh really? I thought people were still cracking it.
I think that's the version I have. It used to work, but it didn't when I went back to it a while ago.
I'll be as clear as possible.
They need image data to feed new, enormous training sets for neural networks. They need as much data as possible, and they need it right now. There's a data gold rush that's been going on for about 10 years now (the major breakthroughs in neural nets came around 2015). People realize this, and so they are walling off the data sources they control and/or charging money for access.
Adobe is integrating neural-network tools like content-aware fill into their products, and these tools require a MASSIVE amount of data for their training sets.
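To make the "training sets need data" point concrete, here's a toy sketch of why user images are the scarce resource for an inpainting-style ("fill in the hole") model. This is purely illustrative and assumes nothing about Adobe's actual pipeline; the model, shapes, and training loop are hypothetical, written in plain PyTorch.

```python
# Toy sketch (hypothetical, NOT Adobe's pipeline): training a tiny inpainting model.
import torch
import torch.nn as nn

class TinyInpainter(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),   # 3 RGB channels + 1 mask channel
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),              # predict the missing RGB pixels
        )

    def forward(self, masked_img, mask):
        return self.net(torch.cat([masked_img, mask], dim=1))

model = TinyInpainter()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

# Stand-in for a huge corpus of user images: random 64x64 RGB crops.
for step in range(100):
    imgs = torch.rand(8, 3, 64, 64)                       # the "customer content"
    mask = (torch.rand(8, 1, 64, 64) > 0.25).float()      # 1 = keep pixel, 0 = hole to fill
    masked = imgs * mask
    pred = model(masked, mask)
    loss = loss_fn(pred * (1 - mask), imgs * (1 - mask))  # only score the filled-in region
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The only irreplaceable ingredient in a loop like this is `imgs`: the architecture and loss are commodity, but a large, diverse, rights-cleared image corpus is not, which is why the TOS language around customer content matters so much.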
I did think it was interesting that Adobe clarified that they don't train the Firefly AI model on customer content. My first thought when reading that was, "so you're training other AI models?"
I did hear there was also a concern that Adobe could lock you out of your own work if you don't agree to their terms and conditions, and that it could assert a right to claim royalty fees on money-making content created with its software.
Even if you pay, you're the product.
Seems like Adobe had it all with Flash. Every single website in the Web 1.0 days ran on some flavor of Flash, be it an animated section, a game, or the whole site built with it.
Now they have no idea how to generate revenue unless it's incredibly consumer-unfriendly.
Flash was a security nightmare in its prime. How do I know this? I got ransomware on my dad's laptop because of a banner ad on YouTube. I never clicked it; simply having it loaded in the background was enough to deliver its payload. This would have been around 2010.
Thankfully, the ransomware was nowhere near as advanced as today's, so we were able to remove it by toying around with Windows' shutdown process.
I've seen so many articles about Adobe proclaiming ethical AI, and then they do this. It's very obvious they want control of the data, not the ethics they proclaim.
If it's not just Photoshop but stuff like Acrobat too, that should raise major HIPAA issues. Any medical information put in a PDF will be a liability.
I'm curious: what happens with other people's stuff that you've licensed for your work?
Things like assets, for example. Usually the license says you're allowed to use the assets in derivative work, but you're not allowed to redistribute them as-is, which is what this sounds like. Wouldn't that effectively make it illegal to use Adobe for anyone who uses other people's licensed assets?
You'd be hard pressed to prove that any one piece of art was used as training data in their neural network crap.
But would you have to prove that? If you're the person licensing it out under whatever limited license, would it be enough to say "hey, you agreed not to redistribute this, but then you ran it through Adobe, who says they will peek at and use anything you run through their software"?
"May" use. They might not actually use it. Any complaint about use would have to be proven, which would be extremely hard.
By agreeing to Adobe's terms and using their software, you have redistributed works in a way that likely violates your license.
Movie companies don't have to prove that you actually watched a movie, or that whoever downloaded it from you did. The mere act of redistributing it violates copyright law.
🤷 Maybe you have a point.
It would have to be challenged in court.