

Yes this isn’t limited to billionaires. It protects everyone who owns private jets…
The article says that they would just lie too.
Like, they claimed one guy (18) was there to meet a minor (17). When the reporter reviewed the logs it was clear that he was there to meet an 18 year old.
Getting views is more important than catching a bad guy
I don’t get my news from tante.cc
But the fact that I don’t use them for my news doesn’t mean they’re not lying (“editorializing”) for profit. That’s a bad thing for everyone who cares about not being misinformed, since people who do read trash like this use this kind of ‘news’ as the basis of their opinions.
Diffusion and language models are both autoregressive in a loose sense: the output is fed back in as input until the task is complete.
With diffusion models this means the model is fed an image that is 100% noise; it removes some small percentage of the noise, then the partially denoised image is fed back in and another small percentage is removed. This is repeated until a defined stopping point (usually a set number of passes).
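The loop described above can be sketched in a few lines. This is a minimal illustration only: `denoise_step` here just shrinks the values a little each pass, standing in for a real trained denoiser network, and all the names and parameters are made up for the example.

```python
import numpy as np

def denoise_step(image, total_steps):
    # Placeholder for a real denoiser network: in an actual
    # diffusion model this would predict and subtract noise.
    # Here we just shrink the values a little each pass.
    return image * (1.0 - 1.0 / total_steps)

def generate(shape=(8, 8), total_steps=10, seed=0):
    # Start from an image that is 100% noise.
    image = np.random.default_rng(seed).standard_normal(shape)
    # Feed the partially denoised image back in as input
    # until the defined stopping point (a set number of passes).
    for _ in range(total_steps):
        image = denoise_step(image, total_steps)
    return image

img = generate()
```

The key point is just the shape of the loop: output goes back in as input, a fixed number of times.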
Combining images and using one image to control the generation of another has been available for quite a while. Controlnet and IPAdapters let you do exactly that: ‘Put this coat on this person’ or ‘Take this picture and do it in this style’. Here’s an 11 month old YouTube video explaining how to do this using open source models and software: https://www.youtube.com/watch?v=gmwZGC8UVHE
It’s nice for non-technical people that OpenAI will sell you a subscription to access an agent that can perform these kinds of image generation tasks, but it’s not doing anything new in terms of image generation.
Either way this isn’t a news article, it’s a blog post. Who cares if it’s editorialized?
People who would rather hear the truth and not fancy lies that appeal to the masses.
Is it really a ‘move to allow’ style prompts? They’re just no longer preventing people from doing that.
It’s weird that people who profess to be staunch defenders of art don’t understand that stealing styles is fundamental to art. If enough people steal a specific style then art history just labels it a ‘movement’. Look on this page: https://magazine.artland.com/art-movements-and-styles/ and you can see that the thing they’re describing is a lot of people copying the same style.
Drum and Bass, a music genre, was essentially built on a “stolen” clip from The Winstons’ song “Amen, Brother.” The Amen break (you’ve certainly heard it even if you don’t know the name) has been copied over and over and over.
This is just the latest social media trend trying to shoehorn issues into the ‘AI-bad’ meme. Stealing styles is not unusual or even immoral. It is literally the foundation of art.
This is just outrage farming, because 1. People are familiar with this style and 2. The primary artist who made the style popular is against AI.
They’re really playing with fire here.
So many MAGA supporters are seniors who are entirely dependent on OASDI. If Trump’s minions break this, we’re going to see torches and pitchforks strapped to electric scooters and golf carts coming out of Florida retirement communities in droves.
I don’t know who needs to hear this (everyone, apparently) but it isn’t a virtue to make people angry.
Way too many people abandon any attempt at conversation and immediately move to trying to “own” the other side.
You didn’t win because you pissed someone off; a child can do that.
Child Sexual Abuse Material is abhorrent because children were literally abused to create it.
AI generated content, though disgusting, is not even remotely on the same level.
The moral panic around AI that leads to implying that these things are the same thing is absurd.
Go after the people filming themselves literally gang raping toddlers, not the people typing forbidden words into an image generator.
Don’t dilute the horror of the production of CSAM by equating it to fake pictures.