A makeup influencer I follow noticed that YouTube and Instagram are automatically applying filters to his face in his videos, without permission. If his content was about lip makeup, they made his lips enormous; if it was about eye makeup, the filters made his eyes gigantic. They seem to have AI detecting the type of content and automatically applying a matching filter.
https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
The screenshots/videos of them doing it are pretty wild, and it's insane that they're editing creators' uploads without consent!
I can hear the ballpoint pens now…
This is going to be a huge legal fight, as the terms of service you agree to on these platforms amount to "they get to do whatever they want" (IANAL). Watch them try to spin this as a "user preference" that everyone just got opted into.
That's the rude awakening creators get on these platforms. If you're a writer or an artist or a musician, you own your work by default. But if you upload it to these platforms, they more or less own it. It's there in the terms of service.
This is ridiculous
The AI filter applied server-side to YouTube Shorts (and only Shorts, not regular videos) is horrible, and it feels like a case of deliberately boiling the frog. If everyone gets used to overly smooth skin, weirdly pronounced wrinkles, waxy hair, and strange ringing around moving objects, then AI-generated content will stand out less when they start injecting it into the feed. At first I thought this must be some client-side upscaling filter, but tragically it is not. There's no data savings at all, and there's no way for uploaders or viewers to turn it off. I guess I wasn't cynical enough.
"Making AI edits to videos" strikes me as a bit of an exaggeration; it might lead you to think they're actually editing videos rather than simply... post-processing them[1].
That being said, I don't believe they should be doing anything like this without the creator's explicit consent. I do personally think there's probably a good use case for machine learning / neural network tech applied to cleaning up low-quality sources (for better transcoding that doesn't accumulate errors and therefore waste bitrate), in the same way that RTX Video Super Resolution can do some impressive deblocking & upscaling magic[2] on Windows. But clearly they completely missed the mark with whatever experiment they were running here.
[1] https://www.ynetnews.com/tech-and-digital/article/bj1qbwcklg
[2] compare https://i.imgur.com/U6vzssS.png & https://i.imgur.com/x63o8WQ.jpeg (upscaled 360p)
Are these AI filters, or just heavy compression / recompression with new algorithms (which can look like smoothed-out detail)?
It's filters; I posted an example of it below. Here is a link: https://www.instagram.com/reel/DO9MwTHCoR_/?igsh=MTZybml2NDB...
The time of giving these corps the benefit of the doubt is over.
I really hate all the AI filters in videos. It makes everyone look like fake humans. I find it hard to believe that anyone would actually prefer this.
I learned to ignore the AI summaries after the first time I saw one that described the exact OPPOSITE conclusion/stance of the video it purported to summarize.
Every YT short looks AI-ified and creepy now
The citation chain for these Mastodon reposts resolves to the Gamers Nexus piece on YouTube: https://www.youtube.com/watch?v=MrwJgDHJJoE