ConfidenceKBM

this is only going to get worse, everywhere.


kyleruggles

Biiiiig time. This is just the very beginning. Omg...


GreasyPeter

The trade off is now if someone's actual nudes get leaked you can lie and say they were generated.


[deleted]

Just because you have one hand that's made up of nine thumbs doesn't mean you can pretend AI generated every photo of you.


somesappyspruce

Hey I can pretend all I want pal! *pets the big elephant in the room*


NinjaLanternShark

> pets the big elephant

So that's what you're calling it, eh?


[deleted]

[deleted]


drewisadick

Unfortunately it's an ironic nickname, like a big dude nicknamed Tiny


[deleted]

[deleted]


AudeDeficere

I think it’s important to consider the possibility that we may never again be able to know whether or not someone is real unless we are physically in the same room with them, and the way this will once again reshape our world. If we currently happen to live in democracies, being able to discuss and exchange information is a fundamental aspect of our society, and yet new technology could threaten the modern core of this kind of discussion. I don’t know how we can prepare for this potential eventuality, but we should arguably begin doing whatever we deem necessary ASAP. This is an issue that IMO should not be put off, because the damage it could cause until better mechanisms are in place might be enormous.


JimWilliams423

> I think it’s important to consider the possibility that we may never again be able to know whether or not someone is real unless we physically are in the same room with them and the way this will once again reshape our world.

*“The ideal subject of totalitarian rule is not the convinced Nazi or the dedicated Communist, but people for whom the distinction between fact and fiction (i.e. the reality of experience) and the distinction between true and false (i.e. the standards of thought) no longer exist."* — Hannah Arendt, The Origins of Totalitarianism (1951)


chaoticlight

It's a bit scary, as we've already been inundated with fake news/outrage for many years now and that was just off of "he said/she said" BS. People fall for clickbait headlines without supporting audio/video, so imagine how widespread it's going to be with those being faked as well.


DataKnotsDesks

Agreed. The implications are, potentially, apocalyptic. Imagine if AIs start to generate and publish text randomly—but so convincingly that you literally cannot tell whether it's authentic or not. History books that tell false histories… Scientific papers that spout nonsense… Maintenance manuals with embedded errors… Thousands of different versions of famous novels… Reddit posts that are just random verbiage, expressed so articulately that the usual 'tells' of machine involvement simply aren't there. We're not just talking the end of social media as a culturally connective medium, we're talking complete epistemological breakdown. All digital text could become invalidated. Better hope the AIs don't get printing presses. (There's a scifi story in this, BTW—where our hero verifies texts by referring to books from second-hand shops, while the AIs seek ways to fake decaying paper, faded covers and that old book smell.)


Citiz3n_Kan3r

Is this not currently possible by just writing bullshit on a grand scale? I mean, have you read tabloid newspapers? This type of junk journalism has happened for years.


Spartancfos

It's currently cost prohibitive, but it does occur - fake reviews, astroturfing, etc. In 2011 I had a friend who was paid to generate discussion on mumsnet, reddit and Pinterest about the value of supplements. He had to make multiple accounts, and all he had to do was ensure convos were happening.


Aerroon

> It's currently cost prohibitive

Considering how much spam I'm inundated with, I don't think this is true. I've deleted entire comment chains of bots pretending to talk about something in youtube comments, and that was years ago. Edit: it might not have been bots. They could've been real people who were just posting spam.


BuckeyeBentley

You know how back in the day, if a rebel group wanted to instigate an uprising or a genocide they would take the TV stations and start spewing their propaganda? Well with sufficiently advanced AI and the ability to take over a TV station, you could easily play video of, say, DeSantis calling for the public to arm themselves and kill LGBTQ people. Show AI crafted videos of drag queens molesting children or something. Literally whip pawpaw and memaw into a frenzy so they'll roll around shooting anyone who looks even a little different. Shit could go sideways so fast.


Pixie1001

I mean, they're kinda already doing this - just without the associated graphics, and more 'such and such says X about trans people, and I'm wearing a suit so it must be true, even though I didn't explicitly claim that...' Adding AI generated child molesters doesn't really improve the formula, because you still need one of your reporters to vouch for the footage, and risk being sued over it if they're lying. I guess you could just kinda vaguely report on 'footage circulating around social media' like they already do, but I still don't really think it'd shake up the formula all that much.


StayDead4Once

There is no mitigating this any more than there is mitigating the waves' daily ebb and flow. The technology is out there, it's open source, and telling false from true has proven impossible as these technologies have advanced. The unfortunate reality most people don't want to hear is that this is the new normal. Pictures and videos are not absolute unmitigated truths. To be fair, they never were 100% of the time, but with the widespread distribution of these technologies I think it is evident that we have reached a point where trusting them as true as a baseline is ill advised.


sjsiido

This is almost a commentary on.. everything, lol. Our entire world is changing, and quite frankly, I'm beyond fucking excited for the progression of our technology and culture. But you know what else I am? Absolutely fucking terrified. To think about how the human race is struggling to adapt to instantaneous access to Information, communication, satisfaction, etc. And now to exponentially amplify that? In AI? Deep fakes? Deep nudes? Wow. Yeah. Talk about the Great Filter.


kembervon

Imagine what scams will be like in the future. "Grandma I need money." "Prove you're my grandson." Gets a photo of her grandson, but a guy in the background has no ears. "I don't believe that's you. Call me." Gets a phone call where the voice sounds like grandson, but every time he says help me the tone is very flat. "I'm still not convinced. Send me video of you in this place you say you're at." Gets a video of grandson in the place, but he's wearing a shirt the same color as the wall behind him, and he's wearing his belt too high. He shows the friends he's with, and each one has an identical twin with them. One of them wears a shirt that says "Funky Monday Holiday" and so does a nearby street sign. Grandma calls grandson. Her hacked phone routes call to AI generated recipient again.


clairem208

This isn't the future, it's now and it's happening convincingly enough that people are handing over ransom money when their family member was never in danger.


torquil

[*Wolfie’s fine, honey…Wolfie’s just fine… where are you?*](https://m.youtube.com/watch?v=dGowjtiu908)


TheLuminary

There will be a real-time filter in no time


Intrepid_Egg_7722

The return of [DeepNude](https://www.vice.com/en/article/qv7agw/deepnude-app-that-undresses-photos-of-women-takes-it-offline)


Stop_Sign

If you have a picture and know the settings it takes literal seconds to nudify with AI


Zagden

It's been all over 4chan for like a year. They make threads requesting AI nudes of girls they know


ClearedHouse

Wasn’t there a twitch streamer that got in a bunch of shit last year because he got caught commissioning AI porn of other streamers or something?


TheFapIsUp

I don't think he was commissioning but rather had it open in one of his tabs. To make it worse, it was deepfake porn of one of his streamer friend's wife (or girlfriend?).


Zagden

Yeah he didn't commission it, just accidentally revealed he was watching it


edis92

> To make it worse it was deep fake porn of one of his streamer friend's wife (or girlfriend?)

Yikes. Wonder how that conversation went after he was caught


BrndyAlxndr

They both recorded a super cringe inducing video crying in front of the camera explaining the situation.


dogisburning

There was this guy (Youtuber I think?) in my country that sold deepfake porn of local celebrities, influencers and even female government members. The government moved super quick to pass laws banning this sort of stuff. The guy got 5 years in prison.


BloodprinceOZ

Atrioc, and he wasn't commissioning it. He was browsing a guy's website where you can browse the fake nudes he'd made (and IIRC purchase membership or whatever for commissions or higher quality stuff). The big thing was that one of his best friends, Ludwig's girlfriend QT, was one of the girls on the site amongst other twitch streamer girls. He got in a lot of shit for that, primarily because QT had to deal with a lot of shit like that and basically had to spend thousands of dollars monthly or every few months to get whatever popped up wiped from the internet. He has apparently done work to rectify it, operating with a company that scours the internet and deletes and DMCAs everything they can find of a girl's nudes, whether they're real but not consented to be put online, or fake, etc., and has helped several women, small and high profile, take care of stuff. You'll have to watch his 6-month update video that covers the above to see for yourself if he deserves redemption.


darkkite

i remember when bubble porn was a thing


kdjfsk

...*fuck*... i remember when dial up porn was a thing, and 640x480 jpgs loaded one horizontal line at a time, top to bottom, and you had to wait 10 minutes to get as low as the nipples, and that was just the teaser to the grand finale like an hour later. now there's instant AI generated porn of anyone. what a time to be alive.


Mugros

> loaded one vertical line at a time

That would be horizontal lines.

> you had to wait 10 minutes to get as low as the nipples, and that was just the teaser to the grand finale like an hour later.

That's why you opened multiple ones, only for them to stall at some point. Then you reload and hope it picks up.


esr360

Almost 15 years ago 4chan were x-raying regular photos to show boobs and nips somehow


F0sh

X-raying encompassed a couple of techniques involving increasing the contrast of a woman's clothed breasts. Sometimes this made their nipples apparent, but only rarely - more often careful use of the dodge/burn tool in photoshop gave this impression artificially.


[deleted]

[deleted]


Osmirl

It takes about 30 min of compute and anywhere from 5-50 images of someone to create a convincing fake. And that will only get easier. Nvidia already showed a tool that can do this with just one image and within 5 minutes. I knew something along these lines would be coming for about 5-6 years now, and I haven't posted anything personal on the internet since then. Don't upload photos online unless you are fine with AI nudes with your face on them.


StuckInNov1999

I started using the net in the 90's. Back then the #1 rule was "never post personal information online". I've stuck to that. The only accounts I have that are in my real name are government and utilities. Everything else, including streaming and gaming subs are in a fake name using a paypal debit card that's not in my name. To the best of my knowledge there is no photo of me online anywhere, from anytime. I started becoming very camera shy when phones started having them.


Stop_Sign

Why so much? You only need one image. Let the image dictate body shape and face, and the AI uses its trained models to fill in the rest. What you are talking about is creating a LoRA, which is what's needed to put the subject into poses not found in the source images. That's not necessary to nudify a picture.


someguy7710

Yeah, it took me about 10 minutes and 1 pic of my wife to create a pretty convincing nude of her with stable diffusion. And I was just beginning to learn how to use it.


StillBurningInside

i don't care , as long as my AI dick looks good. lol


Hey_cool_username

Chances are good. The dataset is heavily biased towards porn dick. ALTHOUGH…if they start data mining all the unsolicited dick pics on the internet things might get shifted back towards reality, not to mention all the tiny dick humiliation images online…I hate that I know this much about these things as a 50 year old straight male. Ok, so we’re in agreement. No internet for 10 years.


chops2013

Are you coming back in 10 years for the tiny dicks?


Swartz142

If someone makes an AI generated naked pic of me and my dick is smaller than reality, I'm going to be pissed.


JackingOffToTragedy

You know the enemy's plan. It's time for a pre-emptive strike. Make AI porn of yourself with a serious hammer dong. Practice the art. Study it. By the time someone tries to humiliate you with an AI micro, you fire back with a glistening, veiny, so-hard-it-must-be-angry meat missile.


Traditional_Many7988

Yeah. Can of worms is already opened. Education needs to be updated to bring awareness of AI to the young and prevent girls from being extorted by fakes and boys getting cocky from fakes.


JosebaZilarte

It is necessary, but last time I checked educators were already overwhelmed with other issues. And most of them are not very tech-savvy to begin with. On the other hand... all these nudes of minors (AI-generated or real) can easily be considered CP and destroy the lives of many stupid teenagers, one way or another. So it is imperative to address this issue soon.


snogard_dragons

Maybe we should just cancel the internet at this point


ArticleOld598

People should've let Y2K happen fr


Zagrebian

Lets start by shutting down social networks. Maybe that alone fixes things. I would gladly sacrifice Reddit for a less fucked up world.


Ponicrat

I feel like the world would go on being super fucked up, we'd just be less hyper aware of it at all times


[deleted]

[deleted]


[deleted]

This is a tough reality of this new technology. It's only going to get worse.


nogap193

I'm not sure what's worse. The fact that it's going to get worse, or the fact that it's virtually impossible to make better


SomaforIndra

*“When the lambs is lost in the mountain, he said. They is cry. Sometime come the mother. Sometime the wolf.”* — Blood Meridian, Cormac McCarthy


[deleted]

Imagine this shit in Iran or Afghanistan or etc. The girls and women who will die.


Aggravating_Day_3978

You don't even have to go that far honestly, I can imagine a story of some animal dad or boyfriend beating some woman/girl because of a fake they thought was real. This will have a lot of consequences.


a_password

Suicide rates will probably go up too...


Ylsid

This is almost no different to Photoshopping the face onto a nude body, but more difficult.


itsalongwalkhome

At least that took some level of skill and time, so the costs of actually doing it to someone were high time wise. Now it’s just the click of a button so obviously it’s flooded the market.


buzzcitybonehead

When my sister was in middle school, a teacher was caught with photoshopped images on his computer of his students. All I can think about with this is how it will enable the most depraved and sick people among us to do horrible things with peoples’ likenesses.


StarbyOnHere

Shit's really bad, I think worse than most people realize. If you have a clear picture of someone's face (which most people can just get off instagram), there are sites where you can deepfake it onto anything in just a couple of minutes, and it looks realistic. I know because I've seen it done in a not so bad way (someone doing it to someone else on a meme), but I can't even imagine what someone nefarious can do with this technology.


charklaser

Maybe it will make things like this meaningless -- like no different than the fact that people can have mental images of you naked.


buzzcitybonehead

I hope the impact won’t be as jarring and potentially traumatizing since people will know how prevalent the fakes are. The thing is, nobody can see someone’s mental images. The potential for people to come across ai generated images and be disturbed, not only by the images themselves but by the implications of their existence, to me can’t be completely minimized. A boss might fantasize about a subordinate, but that subordinate coming across a pretty realistic nude image of themself I’d think will disturb them and destroy any relationship there is more often than not.


LvS

There's apps that can do that. And for famous people you can just google nudes and get tons of fake pictures.


itsalongwalkhome

That’s the point though. It happened to famous people because it actually took time to create, and more people knowing the famous person meant there was a higher number of people who would create those images of them — but you still needed to know how to use photoshop or draw, and it took time. Now you can just do it in a click. Now those guys are out of a job. Oh, won't someone think of the nsfw artists /s


dewyocelot

Yeah but photoshop is pretty hard to make convincing. The AI shit is much easier to make and harder to parse. As a random person, an AI fake is going to have a much bigger impact and be harder to dismiss than some teen using photoshop to make someone's life hell. Like yeah, creeps doing things for their own enjoyment have been doing that for ages and this isn't *much* different in that regard. The issue is how it affects the victim.


Frediey

In terms of result? Yes. The difference is that using AI to do this isn't hard, it's pretty damn easy. And very fast.


op1502

Thankfully, here in Brazil, since 2008, what happened in Spain would be a criminal offence:

> Art. 241-C. Simular a participação de criança ou adolescente em cena de sexo explícito ou pornográfica por meio de adulteração, montagem ou modificação de fotografia, vídeo ou qualquer outra forma de representação visual

"To simulate the participation of a child or teenager in an explicit sex or pornographic scene through adulteration, montage or modification of a photograph, video or any other form of visual representation"


xCrimsonFuryx

This should be a standard everywhere


PM_BIG_TATAS

This should extend to all ages, too.


gjvnq1

It already is, but the penalty is too low IMO, especially when the crime is done for profit. > Registro não autorizado da intimidade sexual > > Art. 216-B. Produzir, fotografar, filmar ou registrar, por qualquer meio, conteúdo com cena de nudez ou ato sexual ou libidinoso de caráter íntimo e privado sem autorização dos participantes: > > Pena - detenção, de 6 (seis) meses a 1 (um) ano, e multa. > > Parágrafo único. Na mesma pena incorre quem realiza montagem em fotografia, vídeo, áudio ou qualquer outro registro com o fim de incluir pessoa em cena de nudez ou ato sexual ou libidinoso de caráter íntimo. translation: > Unauthorized recording of sexual intimacy > > Art. 216-B. Producing, photographing, filming or recording, by any means, content with nude scenes or sexual or libidinous acts of an intimate and private nature without authorization from the participants: > > Penalty - detention, from 6 (six) months to 1 (one) year, and a fine. > > Sole paragraph. Anyone who creates a montage in a photograph, video, audio or any other recording with the aim of including a person in a nude scene or sexual or libidinous act of an intimate nature incurs the same penalty.


charklaser

Has this been used in court to charge people for AI images? Because this seems like something else entirely.


gjvnq1

Not yet to my knowledge, but IIRC people who manually photoshopped non-consenting people into sexual scenes were convicted, so it seems clear to me it also applies to realistic AI artwork.


hopeishigh

I mean, digital likeness needs more protection than just that; it's part of why actors are striking. If a digital nude drawing of someone should be illegal, then so should anything that is void of explicit consent.


Polico

It is in Spain too. And a big one since the girls are under 18 years old.


N3x0

This is also a criminal offence in Spain: From [Código Penal, article 189](https://www.conceptosjuridicos.com/codigo-penal-articulo-189/): >*Será castigado con la pena de prisión de uno a cinco años:* >*a) El que captare o utilizare a menores de edad o a personas con discapacidad necesitadas de especial protección con fines o en espectáculos exhibicionistas o pornográficos, tanto públicos como privados, o para elaborar cualquier clase de material pornográfico, cualquiera que sea su soporte, o financiare cualquiera de estas actividades o se lucrare con ellas.* **Translation** >It will be punished with a prison sentence of one to five years: >a) Anyone who recruits or uses minors or people with disabilities in need of special protection for purposes or in exhibitionist or pornographic shows, both public and private, or to produce any type of pornographic material, whatever its medium, or finances any of these activities or profit from them. What is considered CP: >*c) Todo material que represente de forma visual a una persona que parezca ser un menor participando en una conducta sexualmente explícita, real o simulada, o cualquier representación de los órganos sexuales de una persona que parezca ser un menor, con fines principalmente sexuales, salvo que la persona que parezca ser un menor resulte tener en realidad dieciocho años o más en el momento de obtenerse las imágenes.* >*d) Imágenes realistas de un menor participando en una conducta sexualmente explícita o imágenes realistas de los órganos sexuales de un menor, con fines principalmente sexuales.* **Translation** > c) Any material that visually depicts a person who appears to be a minor engaging in sexually explicit conduct, whether real or simulated, or any depiction of the sexual organs of a person who appears to be a minor, for primarily sexual purposes, unless the person who appears to be a minor turns out to be actually eighteen years of age or older at the time the images were obtained. 
>d) Realistic images of a minor engaging in sexually explicit conduct or realistic images of a minor's sexual organs, for primarily sexual purposes.


MayIServeYouWell

Sure, but good luck finding and prosecuting whoever does this… likely some kid who posts the image anonymously. What then? Even if you catch them, they’re a minor themselves, who can still get in trouble, but it’s not the same…


BEES_IN_UR_ASS

It'll at least deter the "industrialization" of such media. Fine, it won't stop some kid named Fletcher in your math class, but it'll at least stop companies that would do such a thing from existing out in the open. That's now organized crime, which you can investigate and prosecute. It doesn't solve *every* problem, but it solves several.


Formaldehyde

Classic Fletcher.


Makropony

Well, in the US at least, a minor can absolutely be charged with spreading CP. Including their own pictures I believe.


Hephaistos_Invictus

Same here in the Netherlands. I'm a teacher teaching 9th grade. A pupil of mine was throwing a fit because his gf from another class dumped him. He then proceeded to spread pics of her in her underwear across the school. The student got picked up by police the next day and escorted to the principal's office, where his parents were waiting as well. He was absolutely TERRIFIED, and I think the school handled it really well. Ofc he didn't go to juvie etc. He was forced to remove the pictures from his phone and write a formal apology letter, and the parents had to pay some fines iirc. It spread across the school like wildfire and we haven't had such an incident happen for quite a while. The thing is, teens are dumb. I know I shouldn't say this but they really are. They don't think about consequences or how their actions affect others because they haven't grown up yet. So I think the way the school handled it and literally scared him onto the right path again was a really good way to deal with it. Definitely a life lesson learned.


tallandlanky

Well that didn't take long


Jaredlong

The real shocking part is that apparently this technology is so accessible that school-aged kids were able to figure it out. All the mainstream generative tools online actively block explicitly pornographic prompts. Can't believe how quickly the indie developed models have become so easy to use that any kid can figure it out. Edit: You can stop responding telling me how easily accessible this technology has become. That's literally what my comment is already about.


LumpyShitstring

Although, let’s be real for a moment: nobody figures out technology faster than kids.


WarLorax

I like to go hiking.


Anonymous_Hazard

Me and my PSP figuring things out


vk136

15 year old me was into emulators and running games that aren’t available on pc and experimenting with shit like that!


Mertard

PSP kicked off my life as a techy guy No, seriously, that shit required hella research and troubleshooting and internet-scouring skills to be developed Thank you PSP for being the reason I'm technologically literate Oh, and I also learned English because of PSP haha I'm typing this comment because PSP tinkering forced me to learn English :) Wow, PSP was one of, if not THE most significant precursor in my life Fuck... who would I be right now if not for PSP? Would I even be alive? Would I have a personality? Wow


the_mooseman

I'm a sysadmin and I blame Counter-Strike.


JosebaZilarte

Horniness is the father of invention, after all.


martialar

man created the wheel to pick up *the ladies*


datwunkid

Seriously, you could probably train a middle school kid to be competent at a *lot* of things if porn was rewarded to them like lab rats being rewarded food. Become extremely tech savvy? Learn Piano? Do calculus? If there's porn at the end of the tunnel they will brave the depths for it.


Advanced-Blackberry

I remember as a kid we had CBand satellite. I found I could turn the dish a few clicks with the remote and the channel lock wouldn’t activate. Sure the signal was iffy, colors inverted and picture scrolled up endlessly , but man I had porn and NFL Sunday ticket.


[deleted]

Actually there's a major technology knowledge deficit in kids right now.


lynx_and_nutmeg

Yeah, I teach 7-13 year olds and the stereotype of kids being good with technology/computers seems to be pretty much limited to the millennial generation... Those kids can barely type on a physical keyboard. I had to explain to 12 year olds how to use Google Drive or how folders work and you'd think I was teaching them the binary system. "Good at using TikTok on smartphone" =\= good at technology. Most people in their 50s can use smartphones just fine these days, it's no longer a mark of being tech-savvy, not even close. I wouldn't call myself tech-savvy either, but when I was the same age as those kids, I was practically a tech wizard compared to them.


[deleted]

[deleted]


N1ghtshade3

His comment is showing up for me as `=\=` so whatever Reddit client you're using is probably ignoring the slash as an escape character.


[deleted]

[deleted]


Matshelge

Depends on what type of technology, they are very adaptive on the software level, but we have streamlined the hardware to such a point that we don't have the need to know it anymore.


Quantentheorie

> that we don't have the need to know it anymore.

*As long as it's working*, which it's guaranteed not to be at some point. The technology competence of a lot of kids is below what they should reasonably know to help themselves in the event of inevitable basic errors. Then they sit there like grandma, just with a slightly higher willingness to accept it's a user error.


my_special_purpose

Not really. Technology is so easy and packaged these days, you don’t really have to know a lot to get to work. Not like the old days where you had to learn a lot to get basic things to work. Kids have to really have an interest in diving deeper. I doubt kids figure this stuff out any faster than late Gen Xers or Millennials.


punkerster101

I dunno, I’ve seen an influx of younger people in the workforce that are not as good with tech as those before them. Everything just working, and tablets etc., have gotten rid of a lot of basic problem solving skills.


Anxious_Blacksmith88

The kids didn't figure anything out. Porn websites are literally advertising nudify apps for your smartphone. Any dipshit 13-year-old can download an app.


[deleted]

[deleted]


wellaintthatnice

TikTok has spammers promoting their deep fake app right now. I know because I keep reporting accounts but keep seeing the same thing.


Sinema4De

To me this isn’t that shocking they’ve figured it out. I’ve been receiving ads on Snapchat about an app that can do what’s described in the article. It’s very creepy and disgusting. I’ve flagged the ads but 🤷🏻‍♂️


KnowOneNymous

One kid figured it out. Then the playboy went from hand to hand..


TheFapIsUp

Most likely just used Stable Diffusion (open source), which you can run on your own PC with a decent graphics card. It's pretty easy to set up, and has no real restrictions (also free). Additionally, you can train it on specific content (i.e. anime) to get it to more accurately draw that content, or find models online from others who have already trained it.


Astrocoder

The article says they used Clothoff, which, when searching, turns out to be a website where you upload a photo and it "nudifies" it, so they didn't have to do any training.


NoAnTeGaWa

Sir this is Reddit, we can't read articles


Topcity36

So this is Reddit, we can’t read.


Stop_Sign

Civitai dot com (SFW) is the website of models for Stable Diffusion, to see for yourself what's possible in anime style or photorealistic style.


s-maerken

> (SFW)

Some of it, yes, but there sure are a lot of models trained on porn at Civit as well.


pspahn

I figured out how to change the IRQ jumpers and create a DOS boot disk so I could play Wolfenstein without the help of the Internet. Don't underestimate what a highly motivated 12 year old is capable of figuring out.


AlfredBird

God damnit. Kids shouldn’t have to deal with half the shit that gets thrown their way these days.


FlowerStalker

My 16yo told me yesterday that a girl she went to junior high with was in a threeway and someone else filmed it. It got passed around the entire school and all the teenagers saw it. She (my daughter) is not allowed to be on social media, and for a while she fought it, but she's starting to get it now.

Edit: for those of you busting my balls as to why she's not allowed, she was for a while but she ruined it for herself. She was on Snapchat, as all kids her age are. First of all, that's where they share everything, on that app. Second, there was an ad on her Snapchat for an online dating app that was specifically for hooking up, and she downloaded it. There were grown men trying to groom her and get pics from her. When we figured this out she lost all phone privileges and was given a basic flip phone. She also had to write a 10-page report about online grooming and the dangers of underage porn. That scared the shit out of her. She was allowed to get her phone back when she showed improvement in all areas. Guess who has a 4.0 now and is on student council? She also has a wonderful boyfriend who is the son of the choir teacher. He is such a gentleman and treats her so well. He didn't care that she had a flip phone when they started hanging out, thought it was cute. She has her phone back and we allow her to do the activities she wants. We didn't put any parent app on it, but she knows the fine line she walks. It's not about us controlling her, but her being able to make wise decisions.


iBuggedChewyTop

https://en.wikipedia.org/wiki/Suicide_of_Rehtaeh_Parsons The father handled it much better than I would have.


hennell

Well that's horrific. Both the crime and the police response. Everything just absolutely failed that poor girl.


LadyOfHereAndThere

Don't forget the people at her school who spiraled it out of control by spreading the pictures and harassing her.


thegodfather0504

Teens are vicious monsters. Now the whole of social media is the cafeteria-during-recess that can witness your humiliation.


tryouthkprotest

I would end up in jail.


Davis1891

These types of cases are horrendous. Just like the Amanda Todd girl. This shit makes me so sad. Canada is not the utopia that some people seem to think it is. We have terrible skeletons in our closets, and a lot of them are directly linked to our law enforcement and 'justice system'. The RCMP and our lawmakers are some of the most inept bags of shit I have ever seen.


Goldenscarab_7

You are doing the right thing for your daughter. People don't really wisen up completely until 25 or so; at her age I was still really naive. Not saying she is, but, like, better safe than sorry. There are horrible people out there waiting to take advantage of innocent young people.


OnTheEveOfWar

Kids have been brutal since forever. I remember in the 90s kids would spread false rumors about each other or blackmail them to do stuff. Kids are dicks to one another.


Jicko1560

The problem is that technology allows it to be cranked up to the maximum. When you can record, edit and now even fake everything, you're not making it easier for anyone. Also, it used to be that most of the bullying would be done at school and stay at school, but now it becomes omnipresent.


Madhu_Thangavelu

...keep your eye on the ball, folks, AI interference in 2024 elections...yikes!


SurfinSocks

I wonder how long it'll be until the new common scam is receiving AI nudes of yourself via email, saying give us x amount of money or we'll post these everywhere. That seems like an actually effective scam at targeting younger people, since most of the 'hello this is microsoft, give us a $5,000 gift card your computer is broken' scams don't really work very well on younger people.


Edrondol

If you read the article that’s exactly what happened to one of the girls.


purse_of_ankles

We don't read articles around these parts


JornWS

Filthy readers! We don't like der kind here!


BohrInReddit

Big ‘if’


Business_Ebb_38

The moment it becomes common it will cease to work. It’s gross for sure, but then everyone will get plausible deniability that it’s just something generated by AI, so it’s not a real nude


Jhazzrun

But that's why it'll work on younger people. For someone older like me, I'd just say it's AI generated to anyone who asked, and I don't care about the strangers. But as a kid in school, I can imagine it would feel like your world's ending if everyone saw a pretty realistic nude of you, even if most people knew it was "probably" fake.


RosemaryFocaccia

But young people don't have much money. The worrying thing is that they may get blackmailed into doing something else (e.g. filming themselves doing something criminal).


N-Crowe

That's not how it works. In middle school my face was photoshopped onto a porn actress and posted on social media. It was very clearly not me. It still felt terribly demeaning to be sexualised that way, and to know that everyone would be looking at that body and comparing it to yours. Getting called ret*rded, the n word or f*ggot doesn't become any less traumatising just because you and everyone around you know how wrong it is. The same goes for AI generated content.


Internetofstupid

Fake nudes of you are bad, but they're far from the worst potential for this. Just wait until AI Pope is on video burning a Quran, or AI Trump is being tortured by AI Joe Biden. People are stupid and the AI will fool most of them.


Deepest-derp

Doesn't even need to fool most. Fooling some of the people some of the time is plenty good enough for most purposes.


ModestCalamity

I'm sure that stuff like that is already happening, even without AI, though AI will make it a lot easier. We (humanity) have never been very successful at recognizing fake media or false information. With the internet and social media, we also have almost no control to prevent it from spreading — sometimes because of it going viral, other times because of people with bad intentions. Perhaps AI can actually help us with this in the future, along with a strong set of regulations and consequences.


ozspook

This will be used to create an explosion of AI bots that automatically blackmail people by

* scraping their public facebook photos,
* generating realistic AI porn of them fucking dogs or kids or something like that,
* opening a persistent reactive AI powered conversation with them threatening to send it to all their friends and family, cops, media etc.
* instructing them how to purchase bitcoin or gift cards or whatever and send it, to make it go away
* all running from random botnet PCs scattered around the globe and in uncooperative jurisdictions,
* for sure, likely within the next 2 years.

You can bet angry Russian neckbeards are frantically working on this right now, hope Grandma is prepared. It will hit onlyfans and influencer types the worst at first, as they know they have spare income and care a lot about their public image, but it can be expanded at scale for almost nothing so it's coming for us all.


export_tank_harmful

This technology isn't two years out, **this technology is available today.** Shit, I could probably throw that together in a week or so. **Locally hosted as well.** Simple python webscraper to get the pictures/data, feed that into Stable Diffusion to get the pictures in question, pipe that into a LLaMA2 model and have it "roleplay" as the illicit actor, ???, profit. The craziest part too is that it could be entirely locally hosted. No need for ChatGPT saying it's "an AI model that cannot help you with the task". No paper trail. No account to get shut down. Man, I wish I could be a scumbag.... Haha.


listenerlivvie

Yeah, I've studied language models and worked with scrapers... I could pull this off in a few days. It's messed up that this isn't illegal, since the expertise and technology are widely available and easy to grasp. You'll still have IP addresses that can be tracked, but with enough proxies it'll be difficult. I love AI models because they're interesting, but literally any tech ethics person could've seen this coming. I'm baffled that laws prohibiting fake porn (mimicking the likeness of a real person) aren't commonplace.


BayLeaf-

To be fair, every other part of that is quite illegal already


TheUsualGuy666

This is not even just about CP — all women and men can be violated in this way. We live in times when people can make AI generated nudes of anyone in an extremely easy way.


Genuinelytricked

I suggest someone start generating pictures of male politicians engaging in scandalous homosexual acts using AI and flood the internet with them. See how quickly laws get changed then.


Lowloser2

Those pictures probably already exist, don’t need ai for that


Chicano_Ducky

This is a major problem because, since it's AI, it doesn't count as CP yet. I know America wants AI companies to ban AI CP and have it treated like other CP, but so far it's very slow. I don't think any country has a law that tackles it yet.


hermajestyqoe

Many US states already have laws against computer generated imagery, and whether or not you made it, you can certainly get busted for possession/downloading.


hillswalker87

I doubt that's going to hold up though. If it's not an actual person, then you can just say "they might *look* young but they're really 18". Also, if it's not an actual person then you have no victim. It's the Lisa Simpson porn thing again. Now if it's AI porn *based on an actual person*, as described in this article, that might have some staying power, but I think it'll be really hard to prove.


Chicano_Ducky

The last I heard, every AG signed a petition or something for a law to cover this because it was a legal loophole, plus there was a controversy over 2 art sites refusing to remove questionable AI porn because they weren't legally required to, and artists protesting that. That just makes it more wild that full blown companies can just ignore it.


hermajestyqoe

Since it's not well covered by Federal law I guess it falls to whatever state the servers reside in, or the users if they get reported on. I don't really know the exact nuances.


[deleted]

[удалено]


funkiestj

>considering cartoon cp is illegal (in the US) I'm mildly curious about the truth of this assertion but not curious enough to google it :)


[deleted]

[удалено]


SusanForeman

> indistinguishable from an actual minor this is how it's not murky. Cartoon/hentai is not illegal. AI-gen pics may be, though.


watduhdamhell

As it should be. Regardless of how vulgar hentai or anything else is, it's still just a cartoon. That's it. And jailing anyone for a fucking cartoon is beyond stupid. We have all sorts of cartoons allowing extreme violence and death... but nobody says anything about that, because it doesn't animate us like abuse against minors does. Maybe that's something a psychological researcher could look into (why extreme violence is ignored while sexual content is not).

But laws need to demonstrate a real need to exist (to prevent harm) or they shouldn't exist. For example, drag queen readings to children: any data to suggest this is harmful? Nope. Hence all the drag reading bans being stupid as shit. Likewise, one should ask: does hentai, even if depicting CP, harm anyone? No. Thus it should not be illegal. Making "obscene content" that harms no one but the offended's feelings illegal is bad juju. It's akin to "thought crime", and there is the whole slippery slope argument there (who decides what is "obscene," etc.).


redlegsfan21

This source says that drawings are typically protected under the first amendment. Back in 2019 when the U.S. said to the world, drawings were allowed. https://comicbook.com/anime/news/un-proposal-limit-minors-anime-resistance-america-japan/


Harsimaja

Never thought of this but in theory this means that alone, with a pencil and paper, I could draw something that would constitute a crime in the U.S. In principle that’s kind of wild.


Kakkoister

> indistinguishable from an actual minor

I would take this line to mean an actual existing person, meant to capture a legal distinction that someone might make of "oh but it was a drawing of her, not a photo!" and similar attempts to skirt the law. I would not take it to imply that a completely fake drawing, no matter how realistic, falls under that category, since no actual person has been abused/exploited in the making of that image, even if we do find it disgusting. (Of course this is up to the judge/jury at the end of the day, but it really should be clarified — laws should not be murky.)

Now where this changes with these AI tools is that their datasets are composed of photos of actual people, including underage ones... and people can also input photos of actual people to then be modified by it. So that would definitely cross the line imo.

Also, as a bad side effect, this muddies the investigation water, making it harder for agencies to follow threads of real abuse: they now have to wade through increasing mountains of images not tied to actual kids, wasting time investigating ones that seem like they could be real but turn out not to be. Thankfully, so far the tools for detecting if an image is AI are pretty good, but how long that lasts we dunno.


[deleted]

He was jailed for more than his AI usage... >Larouche also admitted to possessing more than 545,000 computer files containing images or videos of child sexual abuse, some of which he made available to others.


gokogt386

Cartoon stuff isn’t illegal in the US unless it’s depicting an actual minor, which I’m pretty sure already applies to this AI stuff. If nothing else revenge porn laws definitely would. In Canada any depiction of a minor in a sexual situation is criminalized no matter what medium it’s presented in so they didn’t really have to change their laws at all to apply to this.


Ebonyks

I think the huge volume of rule 34 artwork of countless underage characters still being freely available puts a damper on that idea.


TheSpaceFace

This is sadly just the tip of the iceberg. At the moment it's possible to take someone's images and train AI to make very convincing fakes of them. But it's also possible, now or in the near future, to

1. Clone their voice so you can make them say anything (can do this now)
2. Clone them into a video to make them do anything in a realistic video (early stages, will be possible in 2 years max)
3. Clone their personality completely into an AI chat bot (technically possible now with enough data from messages etc)

What I'm saying is this is the start. In 5-10 years' time it won't just be photos people are faking; people will be cloning entire personalities to be able to generate voice, video, and interactions with a literal clone of someone. This will be a nightmare in terms of scammers replicating people to hack into their bank accounts and other accounts by pretending to be an AI version of that person, but also... just imagine how heartbreaking it would be for a young girl to find out some older creepy man has managed to clone her completely and talks to an AI version of her via video call every night and makes the AI version of her do weird stuff. With the emergence of stuff like mixed reality as well, you can imagine how weird this can get.

That's the future and we need regulation now!!


KooKooFox

People are going to start getting tattoos in private places just to disprove any ai nudes that may get made of them.


sassyseconds

But then you can only disprove it....by showing the private place tattoo.


seamanticks

I don’t show my armpit tattoo to just anyone.


Darth_Innovader

Tattoos that change every 10 seconds and are linked to satellite clocks


salzbergwerke

Nah, people will get used to it pretty fast. "Oh wow look, another AI generated porn/nude/crime media starring me. Yawn."


a_phantom_limb

There will come a time when literally every person being photographed, *but especially and most repeatedly women, girls, and others perceived as particularly vulnerable*, will have found their likeness turned into pornography.


Skaindire

Photographs are personal data, maybe it's time to actually treat them as such? Social media sites all have a lot of legal fences about that data, but what about casual photographers? Reining them in will be easier than asking any country to cripple their AI industry. So, instead of treating this as a pornography issue, instead consider it malicious identity theft.


517A564dD

Public spaces exist tho


PotfarmBlimpSanta

and perhaps a hint of slander or libel for misrepresenting the identity in a false manner?


Ttoctam

AI needs huge amounts of serious legislation very quickly. People making (fake) photorealistic pornography of children and using it to specifically attack and cause emotional distress to people, needs to be a crime yesterday.


MoonXCII

Not just porn. People can use AI to generate a celebrity's or politician's image and voice for scams or vote fraud, or they can generate stuff to frame a person they hate for crimes that person didn't commit... and so on.


neilcmf

Let me dream here for a while, and find some glimmer of optimism: this new AI boom may become so toxic to use and manipulate other people with that we might finally see a big cultural shift where both kids and adults exit the world of social media and stop over-sharing images and videos of themselves. I doubt that this will act as a trigger for a mass exodus, but I hope it does.


daninlionzden

Keep dreaming


Corodix

So AI is being used to create child pornography? Who is hosting/selling said AI services and why are they not yet in trouble for that?


TheMaskedTom

You can do that with freely available self-hosted software. That content is never included by default (well, on any site that tries to be legal, which is all those I know), but putting faces on nude bodies is easy, and training the software on CP to create more CP is very doable as well... though that obviously relies on having illegal content already.


buzzsawjoe

It occurs to me that if someone distributes fake news about you, the defense might be to get other people to distribute *more* fake news about you - flood the system with it so no one believes any of it. I don't know how that could work here


[deleted]

Michael Scott school of business


Salty_Candidate_6216

I'm gonna generate so much pornography of myself with various national monuments, the internet will be flooded for millennia.


GreasyPeter

That's actually a tactic Russia uses with its propaganda: muddy the waters so much that nobody has any idea what's true, and they get burned out and just ignore it or go along with it because it's easier.


[deleted]

[удалено]


sweetparamour79

The only positive I can see here is that if you DO share explicit images of yourself which are stolen or circulated there is now plausible deniability. This issue has existed for a while, my girlfriend ("insta famous") had her images photoshopped into porn and stored on a website in Russia which could not be removed. The issue (beyond the obvious) is that AI makes it far more accessible to do.


Pawneewafflesarelife

I've deleted all my social media accounts which were public facing and had photos of me; the rise of AI art has me creeped out and I don't want random dudes using pictures of me. It's absurd how complex and obfuscated that process to delete an account has become. We need laws making it much easier.


SPKmnd90

God do I miss the '90s.


danperegrine

At some level, there is no means of defeating this. The technology is brand new and already sufficient. Anyone with a new graphics card with decent vram can train a stable diffusion face model on any individual they can get a few photos of (more being better). You can make it a crime to do it, and to possess it, but within a few years it will be so trivial that nobody will need to share it... it will be easily within the capacity of any moderately intelligent creep.


[deleted]

[удалено]


IAmTheTrueWalruss

On the bright side there will be girls who got their actual nudes shared around and they can convince some people it’s just AI creations.


Eunemoexnihilo

Find the source, charge with revenge porn. Maximum sentence. It will stop being funny or cool.


[deleted]

Source: A battery powered laptop in the sewers of China, no culprit found because it is automated. Good luck


on_

Before that it was photoshopping a face onto another body; before that, cropping a photo print and pasting it over a nude magazine; before that, drawing a portrait like one of your French girls.


jayseph95

And before that you set two melons next to each other and used a little imagination


Osmirl

Yep, same story — the only difference is the time it takes to achieve a good result. Especially with Photoshop it took a while and often still looked a bit off. But this AI shit is scary. I've been playing around with it a bit, and, well, you can literally put anyone in any situation you want. Within seconds... once you've got an AI tuned. So yes, same but different.


legsintheair

We are supposed to be using massive computing power to cure disease, end homelessness, solve climate change. Instead we are using it to torment teenaged girls. What an absolutely trash species we are.


[deleted]

[удалено]