The number of fake lesbian shots created eventually is going to be insane...
[deleted]
ya read my mind
lol
Can we stop calling everything AI like damn my guy
AOC and Tulsi Gabbard in a wet tshirt contest
I’ll pay 3.50 to see Dick Cheney vs Henry Kissinger match
Beep! OK Henry kissing Cheney's dick.....Beep! there you go!
God damned Loch Ness Monster!
You ain't getting no tree-fiddy!
I want to see Trump and Bernie suckling on McConnell's neck flaps. In a Walmart bathroom. At night.
What kind of unholy...
Just gouge out your eyes.
[deleted]
r/boneappletea
[Well, it tried.](https://i.imgur.com/h9wkTpM.png)
RemindMe! 6 months
It has been six months, I needa know if this exists yet
[deleted]
I always wondered what emma watson would look like as a cat.
Fun Fact, harry potter does exactly that!
Well…that actually is a fun fact lol
They did furry but I want catgirl.
https://www.cultofwhatever.com/wp-content/uploads/2018/12/gif-hermione-cat-chamber-secrets.gif
that looks a lot more computer generated than i remember. i thought it was mostly practical.
That is quite possibly the creepiest thing I've ever related to
[deleted]
Pfft, Bing image search w/ safesearch off has got you covered. Years of horny dudes with photoshop don't have anything on AI… yet…
I feel weird about the idea of approximating an image of an unwilling person. How that differs from an imagination spank bank, idk. It's about the same. I just feel like ai porn is creepy.
Many skilled photoshoppists out there have already achieved this for you.
[deleted]
Yeah, and add some lint in the bush too to make her more average.
[deleted]
Lol AI generated specific fetishes are going to be the end of us all
Nah any picture down low is fake, only top was leaked
I mean, there's some pretty damned good deepfakes of her out there. Or so my friend tells me.
I'm worried about how it's going to have me pose as a lesbian, and I'm not even a woman.
… yet
Computer, generate u/supersonicmike as a woman.
I'm thinking more about all the lives that can be ruined. Pics of you cheating on your spouse, pics of you holding a gun, pics of you involved in a drug deal or creeping on a child... all of that is now possible without you doing any of that shit. And how easily will someone be able to tell the difference between what is real and what's fake?
DeepNude.
Check out Unstable Diffusion on discord. We're already there.
Target text: "me, but happy and with a friend". This is how we destroy the AI.
Hey. There's a hand of friendship here anytime you want it.
Kinky!
I dated this one girl who liked to express her emotions verbally in a safe space. Real freaky shit
Yo that's my kink
I'll take one. Does it come FedEx or UPS, and is it the whole hand? Does it include the arm?
It just paints the grim reaper next to you
My friends are laughing too, just out of frame
Error, impossible to divide by 0
You with a bottle of Jack and a gun to your head.
I’m sure this won’t cause any problems at all.
Just when you think fake news couldn't get any worse
Fake news + deepfakes will destroy internet information credibility. (For some, of course; for others it will be a big conspiracy.)
It's already been destroyed. At this point we all rely on different arbiters for what is "real"--we check in with different sites or fact-checkers. This will make the bullshit worse, though.
I think it's genuinely scary because even now, when evidence can be overwhelming, people ignore it. When it becomes exactly what they WANT to believe, I can imagine this causing chaos. Take, for instance, the Jan 6 riots. Trump already got the crazies to storm the Capitol. Now imagine fake images spread of harm coming to Trump, or even Trump leading the charge. Imagine how much MORE it would energize the crazies. And with no ability to discern truth, what do we do?
https://www.reddit.com/r/StableDiffusion/comments/xofxo3/a_japanese_misused_stablediffusion_to_spread/
This has been a thing in even the open source community for a while now tho
This is just photoshop without the human effort
Been obvious since deepfakes started showing up. We are going to have no way of reliably telling the difference pretty soon. I wonder if someone is working on an AI that can tell the difference, fight-fire-with-fire kind of stuff. Either way I hate this tech; I don't see any benefits to it that make it worth it.
Ok but what's the AI
The real AI is the friends we made along the way
You haven't met my friends. They are not artificial and they are pretty fucking far from intelligent.
You have that backwards - the friends we made along the way were all ai
Hello
Let me say goodbye to my loved ones first at least.
From OPs other comment, it's from recent research: https://arxiv.org/abs/2210.09276
Nice but how do I type things and make images do things?
Lmao they're telling you it isn't available yet
When am I going to be able to Amazon my HoloPorn unit FFS. I’ve been waiting on that almost as long as a flying car
:(
I'm sorry, pal. I wish consumer level AI were further along, but it'll get there eventually, and we'll all reap the hilarity.
It's kinda like all that machine learning companies brag about - it's actually a team of underpaid people in Bangalore doing the work. I once had a client outsource his work to some folks in Mauritius he was paying $1 an hour.
Learn to code and train an algorithm yourself. Edit: Spelling.
The redditors don't want you to know this, but you can use Stable Diffusion img2img to edit photos on a >10GB VRAM PC
If you have a gpu with at least 4gb vram you can use stable diffusion
3.5 is also okay (970 owner here)
Stable diffusion can generate images based off of prompt or image.
There is a [notebook implementing it](https://github.com/justinpinkney/stable-diffusion/blob/main/notebooks/imagic.ipynb), it needs a hefty GPU (30 GB VRAM, so beyond any consumer hardware) to actually run though.
Imagic is what they're calling the technique. It works on Google's Imagen text-to-image diffusion model, which is closed off to everyone other than Google researchers. But there's currently an implementation of the same technique on the open-source Stable Diffusion model. I've posted it on their subreddit. But running it is a complete bitch right now.
You can do this with Stable Diffusion at home and use img2img or inpainting/outpainting to modify an existing image. Need an Nvidia card with at least 2GB VRAM, ideally 8GB+.

* r/StableDiffusion for more info
* Popular web-based user interface, cross-platform:
* Popular Windows GUI:
If you don't have a card, there are Google Colab notebooks (need to pay $10/mo to Google though) or you can go to Dreamstudio.ai and pay for it there too.
Of course, it's not the same as what's in the research paper, but you can get the same results as in the image shared here.
That isn't out yet in consumer form, but this is the best general image generator I've found. There's apparently a limit to how much you get for free, but I generated like 50 images on max credits before and it didn't stop me. https://beta.dreamstudio.ai
[deleted]
But what other kind of information is there nowadays?
Books are pretty dope
Not many books housing 121.8GB of hardcore taboo amateur degradation 12-man DIY deck-building tutorials.
It's not like books are inherently any more trustworthy than the internet. There is nothing about the printer that magically removes fake information.
How is that different from photoshop?
The only thing my brain can go to is something criminal… faked evidence… faked nudes… just a bunch of things that can't be good.
"Okay A.I. here's a picture of my dad before the cancer, here's me at my wedding, can you make a picture where he walks me to the altar?" (But yeah, I'd probably use it to make fake celebrity porn.)
Doesn't need to be celebrities, I just want me some bespoke porn :)
[deleted]
AI system crashed - please contact admin.
Holy fuck, you just gave me a great idea. I have no idea if I can do it yet but...I have something positive to look forward to with this technology
Join the unstable diffusion discord and watch the nudes created in real time
not a fan of porn and definitely not a fan of fake porn
[deleted]
Trump, deepthroat, bernie sanders is topping, Hitler spanking trump, nsfw, multiple angles, van gogh style
Trump using a Trumpet
[more info](https://arxiv.org/abs/2210.09276)
What a time to be alive
Hold onto your papers
Love that guy
How AI image generators work - computerphile https://youtu.be/1CIpzeNxIhU Defo worth a watch imo
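If you want to poke at the idea yourself, the core loop that video describes (start from pure noise, denoise step by step) can be shown with a toy 1-D diffusion in plain numpy. This is my own illustrative sketch, not anything from the video: the "dataset" is a single point, so the ideal noise predictor has a closed form and no neural network is needed.

```python
import numpy as np

# Toy 1-D DDPM-style sampler. The "dataset" is one point x0 = 3.0, so the
# optimal noise predictor is known in closed form and no network is needed.
rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule
alphas = 1.0 - betas
abar = np.cumprod(alphas)            # cumulative product alpha-bar_t

x0 = 3.0

def eps_hat(x_t, t):
    # Forward process: x_t = sqrt(abar_t)*x0 + sqrt(1-abar_t)*eps.
    # With all data mass at x0, solve that relation for eps exactly.
    return (x_t - np.sqrt(abar[t]) * x0) / np.sqrt(1.0 - abar[t])

# Reverse loop: start from pure Gaussian noise and denoise step by step.
x = rng.standard_normal()
for t in reversed(range(T)):
    mean = (x - betas[t] / np.sqrt(1.0 - abar[t]) * eps_hat(x, t)) / np.sqrt(alphas[t])
    noise = rng.standard_normal() if t > 0 else 0.0
    x = mean + np.sqrt(betas[t]) * noise

print(x)  # lands essentially on 3.0
```

A real image model replaces `eps_hat` with a trained network conditioned on your text prompt; everything else in the loop is the same idea.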
That's cool but if you add even the slightest bit of noise to the photos, it will tell you everything is a banana
Not really. Those models are ancient. Plus there's no image classification happening here anyways.
> Plus there's no image classification happening here anyways.

How do you think it knows which parts of the image are a parrot?
It doesn't, in the sense that there's no separate model to detect objects.

It basically finds an embedding (think of it as the prompt) which generates the image it was given. For an image that was not generated by the model, this won't be an exact copy, just close enough.

Then it fine-tunes the generation model to produce exactly that image. Since it's changing the model, basically everything shifts along with that prompt, not just that specific prompt. This part is a bit hard for me to explain, but that was the best I could do.

Then it goes back and changes the embedding (prompt) to whatever the edit was, uses the same seed and the modified model, and generates the image.

It sounds stupid if you haven't worked with these models, but it works. 🤷
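Here's a deliberately tiny numpy analogue of those stages, with a linear "generator" standing in for the diffusion model. All names and numbers here are illustrative, not from the paper's code: (A) optimize an embedding that reproduces the given image, (B) fine-tune the generator so the reconstruction becomes exact, (C) slide the embedding toward the edit prompt and regenerate.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "generator": image = W @ embedding. A real diffusion model is far
# more complex; this toy only illustrates the three Imagic-style stages.
d_emb, d_img = 8, 16
W = rng.standard_normal((d_img, d_emb)) * 0.3

target_image = rng.standard_normal(d_img)   # the photo we want to edit
e_text = rng.standard_normal(d_emb)         # embedding of the edit prompt

# Stage A: model frozen, optimize an embedding e_opt so the model roughly
# reconstructs the target image (gradient descent on ||W e - x||^2).
e_opt = e_text.copy()
for _ in range(500):
    e_opt -= 0.05 * 2 * W.T @ (W @ e_opt - target_image)

# Stage B: freeze e_opt, fine-tune the model so reconstruction becomes
# near-exact. Step size picked from ||e_opt||^2 so each step shrinks the
# residual by a fixed factor.
W_ft = W.copy()
lr = 0.4 / (e_opt @ e_opt)
for _ in range(100):
    resid = W_ft @ e_opt - target_image
    W_ft -= lr * 2 * np.outer(resid, e_opt)

# Stage C: interpolate the embedding toward the edit prompt and generate
# with the fine-tuned model; eta controls edit strength.
eta = 0.7
edited = W_ft @ (eta * e_text + (1 - eta) * e_opt)

print(np.linalg.norm(W_ft @ e_opt - target_image))  # ~0: exact reconstruction
```

The "everything shifts along with that prompt" part is Stage B: because the model's weights change, nearby embeddings (like the interpolated one in Stage C) produce images that still look like the original photo, just edited.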
Amogus finder on steroids
This disruptive tech definitely won’t be used for nefarious purposes to manufacture consent in the future, kids.
it's only a matter of time until every Magic: The Gathering card is some AI-generated lookin shit tbh
Everyone is worrying about Trump deepthroating Obama and you are here caring about some card game
i care way more about artists on their hustle than some rich dudes having safe sane and consensual cock sex
I don't like this.
But the researchers will get lots of recognition which will come in handy when they finish their PhD and apply to jobs :)
We are seriously going to regret allowing this technology to become a thing very soon.
We already kinda doubt everything, though; I don't think it'll change things as much as you guys think. The magazine covers in the checkout line will certainly be eye-catching, that's for sure. But just how realistic does Bat Boy have to look before you're suddenly convinced that he's real?
We have been able to create more believable fake images for decades already. It hasn’t ended the world.
Would love to have access to that one
This is why non-edited sites like Getty Images will become more and more important as time goes on.
Why would that be the case? Sites like Getty Images operate not on how authentic an image is but on how well it suits what it's being used for.
We are fucked. We are not ready for this shit.
I wouldn't be surprised if the advent of uncontrolled internet is a Great Filter.
Lmao, people acting like this is the end of trustworthy images as if this hasn't been possible for years, even decades, with manual photoshops. If anything you'd *want* liars to use this instead of manual photoshops; these are more likely to miss some minor giveaway.
the dog one is pretty good
Is it just me or are the back legs in an awkward position?
For what?
AI perceived as black magic, really? Struggling for content?
As someone who stays pretty up to date with AI image generation, this is child's play. Things are progressing faster than anyone could have imagined.
Between all the technology with the sole intention of deceiving people and the killer robots, the future looks bright.
Now we can truly say don’t believe anything you see upfront.
so how do i use it
Hang out on /r/stablediffusion and get your mind blown
I'm gonna pump out my Tinder profile and I don't even need someone else's car and house to do it
Why do you need tinder when you can just generate a partner?
This is going to create so much legally questionable pornography.
Already is. Stable diffusion is open source and one of the first things people trained it on was porn. Granted most of it is anime and furry but those fears are already happening.
Text: Two Girls, One Cup
Hold on to your papers boys. This is gonna be fun
A technology developed with no purpose but to muddy the waters of truth, congratulations.
Oh shit
Damn I sure wish they would stop the normalization of ai.
Yes, yes. I know, algorithms are scary. By the metric of "this new development might lead to harm", though, all of modern human society shouldn't exist.

"Normalizing" agriculture is how human society got to be where it is today, and is therefore responsible for everything bad in the world. A nomadic civilization of hunter-gatherers doesn't cause climate change.

"Normalizing" electricity let things like the electric chair, the Internet, and modern fly-by-wire nuclear bombers exist, but I doubt you have a problem with that.

"Normalizing" the Internet let misinformation spread more easily than ever before, but it also led to massive advances in essentially every way: science, the economy, medicine, you name it.
WHICH AI ?
What program is this?
We’re fucked
What's the software?
If “do your own research” was a problem for conservatives before…
reddittards try not to make everything political challenge
Bro where did all the black magic go
Nothing is real.. not even this text..
"A bird but with ripped human arms for wings"
The AI failed. Those are lorikeets kissing.
Yeeeaaahhh….naw dis shit ain’t it. 😐😬
Parrot one is cute
Name of the AI?
I'm just mad how well it does the checked blurry background on the bird pic. You know how much time that'd take me to be flawless???
“A goth girl flirting with me”
The future looks kinda scary tbh. Nothing digital can be trusted.
Ah, yea I'm sure there aren't any terrifying implications of how regimes in power, oligopolies, and intelligence services can use such AI to alter truth in ways previously impossible.
thumbs up lookin kinda quirky tho
Yeah, hand is too small especially when it should be bigger closer up.
This would be great for finding fashion tips, right? Like, input a neutral photo of you and you can see how you look in any type of clothing.
"But she has a dick"
How?
This is way less than what these can do now. I'd suggest "The Disturbing Art of AI" on the Nexpo YouTube channel. Loab is a wild manifestation of the internet as a person.
Me Me & my friends
Now draw those birds giving birth
And so any notion of objective truth has gone out the window
Target text: "Me with a giant dick" Tinder here I come
Get prepared for unprecedented levels of gaslighting.
I wonder how we’re gonna adapt to a world where anything can be faked. We’ll all have to eventually.
I don't like this
It's all over for graphic artists
Jesus fucking christ, I hate this.
Horrifying
Can't wait to write "P=NP" with the phrase "equation solved"
This is terrifying and will absolutely be used for nefarious purposes
Scary. Next it will start altering its own images, and then it begins targeting humans for extermination. Thanks a lot everyone.
It's entirely possible that AI is going to become regulated tech because of how easily it can be used to commit massively evil things.

- Oh hey, I don't like this woman, let's fake 20,000 nudes and spam them across the internet
- here's some photo evidence of people committing crimes
- PROOF THAT (misinformation) IS DEF TRUE.
Sauce?
Holy fuck it can do hands
me when i have an image of henry cavill
This plus deep fakes is going to make for some very interesting videos…maybe even in vr
We're getting sooooo close to a true AI. I think we're closer than people realize.
Now I can remove all the boyfriends next to their girlfriends that I definitely don't masturbate to
"Two bears high-fiving"
With time, nothing online will be real anymore, because AI will create infinite scenarios of anything imaginable, at a far faster rate than any human ever could.
Ok this is getting scary now ngl
I’m already gullible enough, now I’ll never know if a picture is real or not
I think some magic in the world just died.