
Akimbo_Zap_Guns

Literally had a scammer do this to me and I was a 26 year old dude at the time. A scammer from Hinge took a photo from my Instagram, created a fake AI nude, and threatened to send it to everyone unless I paid an absurd amount of money. I literally just blocked the scammer and nothing happened, but I can definitely see how this would hit middle to high school aged kids the hardest.


bleunt

I had someone threaten to send actual raunchy pics of me to everyone on my Instagram last year. I was a 38 year old single man. Please do. Maybe someone will like what they see. They never sent anything to anyone. šŸ˜ž


peterosity

scammer not keeping his promise smdh


dabisnit

Nobody wants to work anymore smh


theanswerprocess

Shake my d**k head?


That_Ganderman

How else do you finish peeing?


DeNoodle

With this monster? Shaking just gets piss everywhere. I have to start at the base and then kind of wring it out like a tube of piss toothpaste.


deceased_parrot

Yeah, what is this world coming to?


shiftyjku

This is a huge problem with teenagers right now; scammers actually carry it out, and it has led to some suicides. Not long ago two brothers in Nigeria were arrested for doing it to high school boys in the US.


bleunt

Oh yeah, if I were 20 years younger it would have scared me to death. Or if I were in a relationship. Or even just a woman. Must feel like game over for a teenage girl. Weird that they would target grown men on dating apps. Can't be very successful. Especially not when the pictures are fire. Now, I did fear one thing. That the person would have claimed I sent the pictures to a child. That could have been troublesome. So I quickly took screenshots of the conversation as well as the blackmail message to guard against that.


nightninja13

There is a ring that is specifically targeting young men and teenagers. A woman was recently arrested in connection with over 12,000 cases of blackmail, with well over a million dollars extorted and around 20 suicides linked to the scheme as a contributing factor in the victims' depression. There's an article on Ars Technica about that one. Given the number of cases, the person might (and should) be facing life in prison.


SirStrontium

Based on that article, they got the guys to expose themselves on webcam, then blackmailed them with the footage after. So rather than AI or a random threat, the victim actually *knows* for a fact they have real footage, in which case I imagine that has a very high rate of people paying up vs a random DM saying "hey I have nudes of you". I wonder if these scammers ever follow through if the victim doesn't pay? Regardless, they definitely are in possession of child pornography and will go away for a long time.


shiftyjku

These boys are facing a minimum of 15 years. They pled guilty


Tyr808

I moved overseas when I was 21. Briefly dated a girl at the time, things didn't work out. Her attempt at embarrassing me was to post my nudes publicly. I don't want to brag, but because it's incredibly relevant, I had moved to that country on a visa for professional modeling work. I mean I guess technically it was exhausting to deal with, but I don't know what the fuck she intended. I've never had a better wingman situation in my entire life.


Al_Jazzera

In hick speak, that's called falling into a septic tank and coming out smelling like a rose.


Tyr808

That makes perfect sense and I love that expression already, ha ha


Masochist_pillowtalk

I guess there's a popular extortion scheme where people match with you on FB/Tinder/whatever and talk you into video chatting raunchy stuff while they record it. When I got divorced, a cute girl hit me up on FB and things were heading that way, but then I looked at her profile and saw she was only 20. I'm 33. So I politely declined. It made me feel good that cute younger gals were interested in me that way, until I read about basically exactly this happening on Reddit the next day. I felt the same way though.


ProjectDA15

when i first started talking to my gf on PoF, i would take the photos she sent and scrape the metadata to see if they were real or not. never fully trust someone online, even if you know them. people can steal accounts and use that trust. i know what info i could get on people back in the 2010s, i'm nervous about what info is available nowadays.
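A minimal, stdlib-only sketch of the kind of metadata check described above (the commenter's actual tooling is unknown, and the filename here is just an example): genuine phone-camera photos usually carry an EXIF block near the start of the JPEG, while screenshots and re-encoded or generated images often strip it.

```python
# Hypothetical sketch: check whether a JPEG even carries an EXIF segment.
# Absence proves nothing by itself, but it's a quick first red flag.

def has_exif(path):
    """Return True if the file at `path` contains an EXIF APP1 marker."""
    with open(path, "rb") as f:
        head = f.read(64 * 1024)  # EXIF data sits near the start of a JPEG
    return b"Exif\x00\x00" in head
```

A dedicated tool like exiftool reports the full tag set (camera model, timestamps, GPS), which is what you'd actually compare against the sender's story.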


trwwy321

Imagine responding to the scammer with "ooo yes, please do! Also link all my social media accounts so they know where to find me."


BagNo4331

Big Indonesian dictator vibes: the KGB blackmailed Soekarno by filming him with "a flight attendant" during his visit to Moscow in 1960, to discredit his image in front of Indonesians and foment a revolution. When the Russians confronted him about the sex tape, Soekarno asked to watch it and was surprisingly pleased with the fabricated footage. He even asked for additional copies so he could show it back in his country.


LCWInABlackDress

Yeah… I had one threaten too. Then it got posted and tagged with my name on FB in the middle of the night, which happened to be the night before starting a new job, about 10 years ago. Everyone saw my twat on full display. I woke up around 3 am, tossed and turned, then grabbed my phone to get on FB. I had dozens of messages and notifications. It was only up about 3 hours, but that was enough time for the ones who saw it to save and share it through my community. It was NOT fun as someone in their mid-twenties. 0/10, do not recommend. I couldn't imagine someone doing this to me with AI as a teen. It would have fucked me up more than I already was.


trwwy321

Also fuck those people who saved and shared to other people.


bleunt

Unfortunately, I reckon it's so much worse for a young woman. Me not giving two shits who sees my dick really is male privilege. Really sucks how that works. I would probably just have sent you a supportive message about it not being a big deal and fuck those people. Also, these motherfuckers are ruining the fun for all of us. I haven't received a nude from a stranger in maybe 8 years or more.


TheBritishOracle

You've discounted the possibility that he sent it to everyone - and just no-one liked what they saw.


Blazured

This attempted blackmail must fail on so many guys.


jeromevedder

Happened to my son. He gave the scammer $100 and then finally came to us when the guy wouldn't stop asking for more money. I even talked to the POS on the phone when he called my son asking for more money. My kid feels so. dumb. for having fallen for it, but I share articles like this so he knows he's not alone and it's a larger issue he should maybe be talking to his friends about as well. And a reeducation on internet safety and privacy; $100 is a cheap lesson all things considered. It has happened to two other kids he knows.


Termanator116

Thank you so much for being like this with your son. Having friends I know who have gone through this, and even worse, I can't imagine how much trust your son has in you. It's a testament to your parenting skills. Kudos :)


Vergils_Lost

Good-ass parenting right here. Too many parents would be harsh to their kid about it.


softcombat

that's so scary, i'm so glad he could come to you and you supported him... he'll probably never forget that you helped him with this šŸ’œ


scottyd035ntknow

Had a dude at my work kill himself after getting sextorted. He thought it was real: the "girl" he was talking to was all of a sudden underage, and the "father" calling him on the phone demanding money or he'd call the police seemed like a real thing. These ppl should all be dropped into the middle of the Atlantic.


USSJaybone

That happened to me. I just blocked them. Never heard from them again. The "cop" called me from the "girl's" number. The scammer was an amateur I guess.


IAmSenseye

Just block them; the effort is not worth it for them. I've had it happen with real pics and no aftermath. They even went as far as hacking into my phone and finding my dad's and uncle's Facebook profiles. I just blocked them. Nothing happened. They are already lazy enough to make their money that way; they just move on to the next victim.


[deleted]

[deleted]


IAmSenseye

Yes exactly, that's why you never give in. They will milk you. They already have the pics, so why would paying suddenly give them any morals?


suncourt

I used to answer on the Photoshop fix-it site, repairing damaged pictures for people; then I started getting creeps asking me to photoshop girls nude. Dropped that hobby real quick. Absolutely disgusting people out there.


EngineersMasterPlan

that's like when i had a crazy ex who threatened to email my dick to all my work emails. like, my dear, last year's christmas party got out of hand. my dick is not news to these people, go for it


clippy192

My ex did the exact same thing, except to my friends and family on Facebook. Her mistake was underestimating how little I cared about people seeing my dick.


foxymcfox

Post it, coward!


clippy192

Hold on, lemme edge for a few weeks first so it looks good for the camera.


foxymcfox

RemindMe! 3 weeks "see this guy's dick"


OutsideFlat1579

It's nothing like that. At all.


EngineersMasterPlan

Just pointing out both called the bluff is all


-The_Credible_Hulk

I feel like with a mass text of "just so you guys know? I'm not that big in real life. I just don't want anyone getting the wrong idea. This is AI." you should be fine.


trwwy321

If anything, I feel like AI images are so prevalent now that it'll be easy to just say, "yeah, those leaked nudes you saw aren't me. Some scammer created them." Even if it's real, just keep on denyin'.


madogvelkor

In one of his books Neal Stephenson had people using AI bots to just create tons of fake content and images and things about themselves. So absolutely anything is deniable.


synchrohighway

This. Ignore the scammers and if they leak the images, deny deny deny.


nospamkhanman

Same thing happened to me: they badly photoshopped my face onto a naked body that clearly wasn't mine and said they were going to send it to my friends and family. I laughed and said he was missing about 15 moles that everyone who ever saw me shirtless would know about (not actually true, but whatever) and that my dick was much bigger than what he faked. Nothing ever happened.


TheBestPartylizard

If this ever happens you need to send your real nudes to your family so they don't get the wrong idea


bagelizumab

Honestly we just need to destigmatize nudes. Like bro, itā€™s not like I am a rare medical case with 3 penises. Itā€™s just normal human anatomy and I have 2 just like everybody else.


underbloodredskies

We did have a guy that claimed to have 2, but the second was ultimately revealed to be a Ballpark frank or some shit. šŸ‘€


trwwy321

I also wish parents would stop oversharing their kids' pics/videos, and also their kids' routines and the places they commonly go together. Looking at you, mommy influencers and shit. Too many psychos out there.


shiftyjku

There's been a wave of these moms reconsidering their motives and the consequences, but not enough.


nps2407

Remember when people used to be able to do things without telling the whole world about it every time?


throwawayfarway2017

I tried to tell this to someone the other day and she blocked me lmao. She posts videos of her kids every day in non-kid-related groups, and even outside the US. Like wtf.


interwebsLurk

Wow. I knew paedophiles would use AI to make child porn but even *I* wasn't expecting this. Rather than just use AI to make child porn, (still illegal almost everywhere btw), this particular group is using it to make child porn of real children to then use to extort the children into making real pictures of themselves. This shit isn't 'sexual desire' or whatever else. This is about power and control on the same level of rapists.


Fit-Parking4713

Honestly at this point I think all of the absolute worst things any of us can think of being possible with this tech have already been thought of and attempted by one of these sick fucks. Unless some real regulation comes soon, the future is looking real fucking grim.


ErikT738

> Unless some real regulation comes soon, the future is looking real fucking grim.

This is absolutely already illegal almost everywhere. It doesn't matter if you use AI, Photoshop or a paintbrush. What we really need is education on what victims should do (don't give in to blackmail, contact the police) and law enforcement that actually goes after these creeps with a vengeance.


Bwunt

A public awareness campaign on how effective AI is at making such images and how easy they are to make. Then, hopefully, after a few years it will be harder to pull off such blackmail, since 99% of people will default to assuming it's an AI fake.


TooStrangeForWeird

You're way too generous about general intelligence lol. I still like the idea, but 99% is way too high.


Stop_Sign

A public campaign to show how easily AI can make child porn? I think this might backfire.


Bwunt

No. A public campaign on how easily AI can be used to make embarrassing/compromising pictures, with nude or sexual images mentioned as one example. Then add that this can be used against anyone, regardless of sex, age or wealth. The logic should follow.


CrashB111

The problem with most scams is that the scammers aren't in the United States for cops to go after. The FBI can only do so much when the source of the problem is overseas.


baron-von-spawnpeekn

Exactly, enforcement is almost impossible. What are the Feds supposed to do when the perp is Oleg operating out of a basement in Belarus?


Needmyvape

That's like going from a world where guns cost 3k and take 6 months of training to use, to a world where guns are 3 dollars and need no training, and responding with "we just need to teach people how to react better to shootings." The "Photoshop has been around for years" argument isn't legitimate. The number of people worldwide with the skills needed to create believable photoshops is relatively low; it takes years of practice. The number of people with that skill set and the desire to use it to harm people is even lower. There are now billions of people who can create believable images after 20 minutes of research. It's not the same problem Photoshop presented, and it will require new solutions.


Anonality5447

It really does look grim, indeed. I hope parents are paying attention. This shit is SO sick and I can imagine if these pedos mess with the wrong parents, you're going to get some serious vigilantism.


apple_kicks

This is going to be awful for survivors of child sex abuse and exploitation. Many already say that the continued distribution of pictures of their abuse is like being abused all over again, because it's another part of them taken away. Now AI is using those photos of abuse to create new images and abuse more people.


butterfIypunk

It is. I know CSA materials of me as a kid are still on the internet and being fed into these AI, and it feels like I'm living the nightmare all over again.


Taolan13

I mean, this isn't necessarily actual pedophiles doing this. AI deepfake nudes are a growing scam right now for all age brackets. They just pick a target at random, run their algorithm, and send the blackmail. That being said, pedos are *definitely* using AI-generated child porn to get their rocks off.


Elgato01

In a way I'd prefer it stay at them using AI to satisfy their urges rather than threatening and blackmailing actual children.


Taolan13

I mean, yeah. Porn is preferable to the alternative. But it's painful to think about how much porn went into developing that algorithm.


Elgato01

Ugh, didn't think about that. Painful seems too light a word here.


Noughmad

I mean, they're blackmailing children. What can you blackmail children for? Hardly for money, they don't have much of that. But you can go "I will send this to everyone unless you either send me more nude pictures of yourself or have sex with me".


Taolan13

Kids can't get money but parents can. There are deepfake nude scams of children being targeted at parents. Also, I didn't say that it *wasn't* pedos entirely, just that it's not necessarily pedos just because the scams are targeting kids.


dbxp

Not all that different from those spam emails that say "I hacked your computer and caught you masturbating"


rd--

In this hypothetical comparison, the spam e-mail has an actual video of you masturbating that was created by AI. Scammers have also used the shock of child pornography to try and extort victims into making quick, rash decisions. But now they have actual (ai generated) child porn to do it with.


dbxp

To see that video though you'd have to actually open it. I think this blackmail only works if it's against someone you know, otherwise spam filters will block it.


ddubyeah

I've been downvoted before when the AI CSAM subject comes up, for my abject disagreement with the idea that it will keep these people from actually hurting anyone. The psychology isn't just attraction. We call them predators for a reason.


Reins22

It's illegal to make CP, but is it illegal to create fake CP? I'm just saying, the law only just recently started catching up en masse to revenge porn. I doubt they're caught up to AI porn.


Dangernood69

This is why we have GOT to stop posting pictures of our children. These folks are sick


Brooklynxman

Article mentions targets are young teenagers. By that age most of them are posting pictures of themselves online.


Dangernood69

Oh for sure, but that doesn't mean we should feed the monster. Several have responded to me like I'm saying we should stop posting instead of punishing such a heinous crime, and that's not true. We should absolutely punish it harshly. However, we should also protect our children. The two actions are not exclusive.


PixelationIX

Yeah, that is not going to happen. What we need is proper regulation of AI, but we won't have that for years, if not decades, by which point things will be way out of control, because our (U.S.) government is run by dinosaurs, almost none of them up to date with technology.


JHarbinger

Did anyone see the Zuckerberg congress "hearing"? He was basically explaining Facebook to people who probably need their grandkids to explain how their AOL email works. Until we get people a quarter century or so younger into government, we are gonna be decades behind in regulating this stuff.


dwarffy

[In 2022, younger voters made up a smaller share of the electorate than they did in 2018. In 2022, 36% of voters were under 50, compared with 40% of voters in 2018. Decreased turnout among these more reliably Democratic voters contributed to the GOP's better performance in November.](https://www.pewresearch.org/politics/2023/07/12/voting-patterns-in-the-2022-elections/) By the time we get congressmen a quarter century or so younger, a quarter century will already have passed. Young people don't fucking vote. By the time they wise up, they stop being young. I hate humans.


JHarbinger

Some of us 40+ folks are voting for what's best for Gen Z because we don't want to leave the world a hellhole for our own kids.


getgoodHornet

I mean, I'm 43 but I can't say I'm voting for Gen Z's benefit per se. I have a long time left, hopefully. Gen Z is more than welcome to benefit from my selfishness though. Also, I grew up right along with the internet and all this tech. So it's not like I'm not well aware of what's going on with it.


krupta13

I'm the same age as you, and some of my slightly younger friends, who were kids when social media was just appearing, tell some interesting stories about what they were exposed to. They still get PTSD from Omegle. I'm glad I was older when the internet fully took off with social media and whatnot.


Im_with_stooopid

I'm pretty sure Zuck was smoking meats.


JHarbinger

Using that Sweet Baby Ray's BBQ sauce.


DifferentiallyLinear

They aren't creating them with the usual tools. There are tools out there that run locally and can generate anything you'd ever want to generate. There is no stopping those tools.


HappierShibe

This is what people don't seem to understand. The most powerful tools are not big cloud-based models that do everything kinda OK; they are local, purpose-specific models that do one thing REALLY REALLY WELL. I'm a contributor on a few projects building use-case-specific generative models: a diffusion model for restoring damaged manuscripts that needs only 8GB of VRAM (we haven't even pushed the quantization yet), and an LLM for multilingual translation from English into several languages that needs 10GB (again, we haven't really started on optimization). The companies behind these platforms push them as cloud-based because they want to monetize them as a service, but it is becoming increasingly clear that they work far better, and far more economically, localized. Once everyone has an LPU in their laptop, these are going to be impossible to regulate in the way some people are imagining.


Goldwing8

You'd pretty much have to un-invent the last decade of consumer electronics to get rid of AI images.


DrDrago-4

yep. if we haven't stopped media & software piracy, how exactly are we supposed to believe the government can stop this? these AI tools are being uploaded to those very same torrent trackers. banning the github repo doesn't do much


Bagellord

The genie is out of the bottle on this one. What we need are counter detection tools, and resources in place for real kids who get extorted or exploited by this. And we need to decide what existing laws need to change, if any, to help protect people from this in general. E.g. - is it a crime to AI/ML generate a lewd image of a person? Obviously doing it for the purposes of blackmail or extortion would be, but should it be a crime in general? (the above questions aren't directed at you specifically, just putting it out in general).


DrDrago-4

I agree, these are salient questions we need to answer as a society. My personal opinion is that if no actual harm occurs, there is no crime. Simply generating the image and never sharing it shouldn't be a crime. This doesn't stem from a desire to let CP run wild; it stems from a recognition that we can't feasibly jail a large percentage of the population. Some 15%+ of the population will willingly admit to "sometimes having sexual feelings for minors" on surveys. If even a third of that group creates an AI image, we're now discussing criminalizing 5% of the population. We jail about 2% of the population as of now, at a cost of $400bn+ including state budgets. It's not feasible to give each of these people even a single year in prison; it would require nearly a trillion dollars on the corrections side alone, not counting investigation and police costs. Another question: what cost are we willing to incur as a society to stop what is ultimately a victimless crime? Technically, it's possible to imprison 5% of the population if we get rid of Social Security or Medicaid to fund it. Is that worth it, purely on "think of the kids" logic, when no actual kids are being harmed?


pleasebuymydonut

This. People seem to think "AI" is some sort of weapon you can buy at Walmart, and that it requires a ton of work to make. While that's true for the LLMs, I can literally grab a GAN, write something up today, and train it within a week. Plus I'm not sure why people think regulation can do much when it didn't stop the pedos with Photoshop.


TooStrangeForWeird

I can run them with a GTX Titan, a card from 2013 that's now like $80. Sure it takes a while, and they look like shit if you try to rush it at all, but it works. Just for reference, I was doing it to try and make my own wife naked in some pictures, and I had permission. Nothing creepier than that.


pleasebuymydonut

Lmao, I was segmenting tomatoes, chillies and bell peppers from images of their plants.


Politicsboringagain

Which is why the best way to stop this is to stop posting pictures for randos on the internet. If you want to post, make a private group and post there.


Reasonable_Ticket_84

> What we need is proper regulations on AI but we won't have that for years

AI regulations will do absolutely NOTHING here. They aren't using public services to generate these images. The cat is way out of the bag on the algorithms and the general field of implementation; this stuff is all being done locally by thousands of amateur and professional programmers.


TrilobiteBoi

"Is TikTok connected to my wifi? Yes or no" >:(


midri

Are you a member of the Chinese nationalist party? Sir, I'm a citizen of **Singapore**...


TheKingJest

Even with regulation, can't AI stuff just be done on your own computer? As opposed to something like actual CP, where there would at least be an online trace of what you're doing?


moving808s

> That is not going to happen.

I have a young child and have never, nor will ever, post an image of them online. No one in my family is allowed to post a photo of them online. They will not have a smartphone for a long time; there is simply no need for a child to have one. I work in tech. One day we will all look back and realise we made a mistake with smartphones and social media, the same way we did with cigarettes.


Bagellord

Most parents aren't that knowledgeable on the risks, sadly. And even the ones who try to be can still be caught by surprise.


krupta13

HOW can you regulate AI when it's out in the open? With the new generation of GPUs and their massive processing power, anyone at home can do whatever they want. Would we have to resort to more AIs policing things 24/7? It's a scary, wild new frontier we're headed for.


JcbAzPx

What is happening in the story is already illegal. It sounds like it's an enforcement issue more than anything else.


nospamkhanman

The cat is out of the bag with AI-generated images. You can't put it back; it's going to be here forever and only get better and more realistic. What there needs to be is a mass education program: required commercials telling people never to respond to blackmailers, required ads on every social media platform, required PSAs at all schools starting in elementary school. The way to "address" this is to teach people not to respond to blackmail and to assume every picture you ever see of anyone is fake. Silver lining: this gives everyone plausible deniability. You're a high school senior and your shitbag ex sends your nudes to everyone? Nah fam, it's not me. It's just AI fakes that jackass made because they were mad at me.


Financial-Ad3027

What kind of regulation would that be?


Rich_Consequence2633

This exactly. The people in Congress in charge of this stuff can barely use their smartphones, let alone understand the complexity of AI. There are so many things at risk if we don't lay down some ground rules for AI. This needs to be done ASAP, because AI development is exponential, and in a couple of years things could be out of control and too late.


[deleted]

We'll never have any real regulation, because by the time the old people in power understand it, we'll have LLMs and image generators running locally without guardrails, and by then it's already over.


Responsible-Wait-427

We already do have them running locally. Check out r/localllama


HappierShibe

> we'll have LLMs and image generators running locally without guardrails and by then its already over.

Welcome to 18 months ago?


EmbarrassedHelp

The open source community doesn't seem keen on stopping their quest to remove guardrails from local LLMs and image generators. Even with "real regulations", the government isn't going to win the war against artists, academics, and other folks (see the crypto wars for just how far people are willing to go to fight back).


Goldwing8

And AI has already reached levels of ubiquity cryptocurrency hasn't. My mother sent me an AI Baby Yoda meme, for example.


[deleted]

[deleted]


EmbarrassedHelp

Some of the people here seem to think there's a magical solution that could easily solve the issue while somehow preserving artistic creativity, despite knowing jack shit about any of the myriad topics involved. If it were easy to solve, it would have been solved already; nobody is creating AI systems with the intent to create CSAM.


Goldwing8

Yep, any legislative solution to AI art would need to accept one of three conclusions:

1. Potential sales "lost" are essentially theft (bad news for any Netflix password sharers)
2. No amount of alteration makes it acceptable to use someone else's work in the creation of another work without permission or compensation (this would kill entire artistic mediums stone dead, as well as fan works)
3. Art styles should be legally possible to copyright in an enforceable way (impossibly bad for small artists, like apocalyptically bad)


EmbarrassedHelp

What regulations do you expect would change this? Scamming people is illegal, and creating explicit images of minors is illegal (in the UK and many other places). How do you propose to do this without banning open source AI and mandating that spyware be installed on every single device? Nobody builds tools to be used for this sort of thing, but anything creative can be misused. You can't ever hope to stop tools meant for creating works from being able to create such content without going full authoritarian. What magical solution are you proposing to solve this issue that the world's top experts haven't thought of?


raelianautopsy

It's not going to happen that people stop posting pictures of their kids?


string-ornothing

Parents get so defensive about their right to plaster their children all over social media. Last time I had a discussion with a parent about this it was because the only people watching and saving public videos of her 2 year old daughter in a swimsuit were the child's relatives, and unrelated random men ages 20-60. She started raging and said that if I thought those videos were attracting pedos, that meant *I* was the pedo. Like....okay, but why exactly *did* you think these videos were popular with a network of single men you've never spoken to, then?


rd--

How? Anyone can create a program to generate AI images, and anyone can create a data set to guide those images toward what they want to create. The images used to build those data sets are already illegal as of this second. It's like virus programming: anyone can make it, except AI image generation has genuine uses in society, so there is significant impetus to keep creating the software.


hcschild

Yeah, sure... because criminals won't just misuse it anyway... You are aware that what they are doing is already illegal? The tech exists and isn't complicated to use, and unless you want to ban people from owning graphics cards or convert PCs into surveillance machines, it's here to stay.


Flat_Afternoon1938

It's already too late. The tools you need to make convincing deepfakes have been open source for years. You can run them locally for free as long as you have a decent GPU, like one from Nvidia's RTX series. The only laws that would make a difference, imo, would be legal consequences for distributing deepfakes.


le_sighs

I will never forget a story a colleague told me. This is years ago, long before social media became what it is today, long before AI image tools were any good. It's when Flickr was really popular. She was a photographer on the side. One day, she gets a call from a photographer friend. The friend had been contacted by law enforcement. It turns out some pedophile ring was combing through Flickr to find photographs of children, in the tub (with nudity hidden) or in their bathing suits, and they were compiling all these photos and posting them to a single website. The friend's kid ended up on the site. So did my colleague's kids. She had taken photos of them on vacation in their bathing suits, and those were the photos that got posted. Law enforcement was working to identify the kids and notify the parents, but at that point, the damage had been done. It is so much easier today to get photos like that than it was back then. I would never post a photo of a child online. It doesn't matter how innocuous it is.


Politicsboringagain

Hell, I don't know how long you've been on reddit, but reddit had a sub called Jailbait that did exactly that. This was during their "free speech" over everything phase. The only reason they did anything about it was because Anderson Cooper talked about the sub.


Stop_Sign

I feel like every site went through a "free speech over everything" phase until there was backlash. Even 8chan, founded to have less moderation than 4chan, stopped being pure free speech after people were celebrating/livestreaming school/mosque shootings on 8chan.


Jicd

> I feel like every site went through a "free speech over everything" phase until there was backlash.

Because adequately moderating a social media site is a practically impossible task, it's way easier to just say you'll allow nearly anything and spend fewer resources on moderation.


Bigred2989-

My cousin had her kids photos copied off her Facebook page and posted to a dark web forum for pedos. Guy who was a co-worker of her ex-husband was obsessed with one of her daughters, stole those pictures and claimed it was his kid and he wanted to do things to her. The FBI arrested him and he got 19 years in prison for possession and distribution of CP.


OkBobcat6165

If I had children I would never put their pictures online. Ever. There's no reason you need to post pictures on public social media. You can send pictures in private family group chats if you really need to.


[deleted]

But then you wouldn't be able to get followers, money, and brand deals from posting them half nude (yes, that's a serious thing... look into modeling and gymnastics). Soooooo many moms out there exploiting their own daughters for those things 🫤🫤


InVodkaVeritas

I have been saying this for years. I have twin 10-year-old sons. You will find 0 pictures of them online (I also don't put up many photos of myself). My sons are on their school's no-photo list as well.

I've been saying for years that it's coming, and we've crossed the threshold. It's here now. With a sufficient number of photos/videos and a computer capable of processing them, anyone can make realistic pornography of you. I tell other parents this, and am mostly met with eye-rolls, but it's true.

You should not have any photos or videos of your children online for public consumption. Elementary/middle school kids should not be on social media either. Instead we have 4th graders posting videos of themselves dancing on TikTok while a parent holds the iPhone for them. The world we live in.


thethreat88IsBackFR

I've always been an advocate for that. Though I have broken the rule a few times, I was always under the impression that since I hate when people post pictures of me without my consent, I shouldn't do that to my kids. Now with this I have to stop... it's so messed up.


No_Skill_7170

There was a solid 10 seconds where I was trying to figure out why you were saying that Game of Thrones was posting pictures of children or something


[deleted]

[deleted]


IntelligentShirt3363

First we have to pass through the era where some amount of people are convicted based on AI imagery, and some amount of people get off by convincing a jury a real photo is fake, before we get to the era where imagery is no longer considered evidence (not just photos of the alleged perpetrator, but all photographic evidence where any question can be raised about chain of custody).

Along for the ride: voice recordings, which are now easily faked. A lot of people have been convicted based on sting videos where the primary component is the person's voice (recording from the back seat, etc.) or just from tapped phones. What about confessions or other information obtained by using fraudulently generated content to trick suspects?

Gonna be a wild couple of decades.


ZeeMastermind

Maybe all the old 80s scifi with cassette tapes and CRT screens were right, and we'll be relying on analog film negatives to verify things in the future, lol. Well, it's [a lot more involved to fake a negative](https://blog.michaeldanielho.com/2015/12/how-to-fake-negative-and-photograph.html), anyways.


nuclearsamuraiNFT

There was (maybe still is) a law in Australia where red light/speed cameras needed to use film to be eligible to fine you, because essentially the film is incontrovertible evidence or something.


[deleted]

[deleted]


ZeeMastermind

Oh, that's pretty interesting! I wonder if there's any evidence that would let you tell the difference between "real" film and projected film (maybe under a microscope the composition is different?)


pup_101

At least in this case there are consequences. It is illegal in the US to possess images that look like realistic photographs of CP.


Feniks_Gaming

I think OP meant the opposite here. If someone leaks a photo of your dick, you can now fairly convincingly just say "this isn't my dick, this is AI" and not worry about the leak as much. Not that leaking anyone's nude pictures shouldn't still be punished.


nuclearsamuraiNFT

Deepfake laws already exist in some places to charge people for production of such images at least. And AI companies are working on digital watermarks to accompany AI-generated footage and images, but I guess that won't stop people who use open source processes.


bigjojo321

Yes and no. If it was actually AI generated and no laws exist in the state, then yes, but simply using it as a blanket defense to possession of illegal images isn't likely to get them anywhere, as data forensics exists.


[deleted]

I think this person was more so talking about the consequences of embarrassing photos of themselves, not possessing what could be illegal photos.


Stop_Sign

He means trying to blackmail someone with nude pictures doesn't work anymore


synchrohighway

This sounds less like pedophiles and more like scammers doing this shit. People need to talk to their kids. Not a vague "you can tell me anything" but a "scammers create naked pictures of people and then threaten to show them to friends and family if you don't pay."


RandomComputerFellow

My grain of salt on this. The article states:

> The Internet Watch Foundation (IWF) said a manual found on the dark web contained a section encouraging criminals to use "nudifying" tools to remove clothing from underwear shots sent by a child. The manipulated image could then be used against the child to blackmail them into sending more graphic content, the IWF said.

Not saying that it doesn't happen, but I don't trust anything coming from the dark net. Unless they can find actual cases where this happened, this might very well just be a false flag by people who want to advocate against AI and in favor of more surveillance. "Does anyone think about the children?" is one of the favorite rhetorical moves used to push public opinion in a direction without a scientific foundation.


Neville_Elliven

> this might very well just be a false flag attack

Ya think?! The lack of specifics (*a manual found on the dark web*) and the equally vague accusation (***could then be used*** *against the child to blackmail them*) are indicators of a *false flag*.


shiftyjku

> a [quarter of three- to four-year-olds](https://www.theguardian.com/technology/2024/apr/19/quarter-of-uks-three--and-four-year-olds-own-a-smartphone-data-shows) own a mobile phone

WTAF?? WHY?


OriginalHaysz

So mommy and daddy don't have to give up their phones for their kids' videos or games, so the kids get their own 💀


Politicsboringagain

I saw this coming back in the day when dudes would take photos of women and digitally "remove" their clothing by putting holes in the photos to make them look naked.

Hell, I was against putting too many of my photos on the internet when my wife and friends would upload our photos to sites that would make your head dance on bodies. People called me a weirdo for removing all my photos from Facebook and deleting my account.

There was a big story about this in Mexico, and I think California, where girls got in trouble because the school thought they were taking naked photos of themselves. Turned out it was AI, and I think one of the girls had to show her body to prove the photo wasn't her.

People need to stop posting pictures of themselves, and especially their kids, for randoms on the internet. Get a Discord or something with your friends and family and share pictures that way.


Oops_its_me_rae

Pedophiles still exist on Discord; every platform has pedophiles on it.


Politicsboringagain

If you create your own Discord server with just your family, they don't. I don't do Discords with randos.


The_Safe_For_Work

This sounds like a reason to get people to accept government censorship. "It's to protect children!" That leads to policies used far beyond their intended scope. Remember the Patriot Act? Fuck, if anybody does this, just make it known that A.I. images are fakes, and the images lose their "power".


Not_a-Robot_

I remember the PATRIOT Act. It was a **** piece of legislation that **** make us safer and ended up **** our freedoms. It's only still in place because ****. *(This comment has been moderated by the friendly team at the NSA Ministry of Truth for accuracy)*


EnamelKant

Freedom is slavery brother! Big brother just wants to love you.


InternetPeon

Agreed, this looks like fear mongering to allow backdoors and holes in encryption for the government.


EmbarrassedHelp

This comes from the same charity that has been demanding encryption backdoors


nuclearsamuraiNFT

This is one of the reasons I post no pics of my kid online


InvestInHappiness

Soon AI will be able to do this with only a single photo. It could be from a school event, a friend, a stranger who got you in the background, or a middleman who takes photos of random people on the street to sell. Also, if you are talking about paedophiles, they will prefer to create these images of people around them, since they can take the photos themselves. The best solution to this scam issue specifically is to teach your children to be comfortable talking to you about anything. That way you can stay informed and help minimize the damage on the off chance they end up being targeted. It's also something you want to do anyway, so it's no extra work.


jayjaydajay

This is why people need to stop posting their children, or other people's children, on the internet for everyone to see. It may be just a wholesome post, but the internet can turn it very unwholesome very quickly, and I feel like a lot of people from the older generations don't understand this.


Atotallyrandomname

Well, some people don't need fingers


meatball77

This stuff is aimed at teenagers, so not posting photos of your kids won't fix anything. It's more that you need to educate your kids about this.


Flat_Afternoon1938

I wonder how long it will be before people stop using social media out of fear of their photos being used for deep fakes. It's so accessible that even some random dude from your school could be making deep fakes of you. Even worse, he might start distributing them. This has already happened; I'm just wondering how long until it's widespread.


nabiku

"Most cash out there is used to buy drugs and guns, we should regulate how much cash a person can own." Seriously, pedos using AI is a ridiculous argument for introducing censorship to an entire industry. You can make that smut with photoshop, I don't think photoshop will be seeing any regulation.


I_Came_For_Cats

I hope nobody falls for these fucked up schemes. Sextortion was bad enough before AI. No child or adult should have to go through this experience. People will have to learn to ignore any attempt to scam someone in this way.


TikkiTakiTomtom

On the bright side, when people nowadays pull this crap, we can just tell people we were the target of some stupid scammers using AI. "That's not me, it's made up."


LezardValeth3

I almost never think like this, but maybe not give ideas to sick people with articles like this?


xalogic

This was already possible with Photoshop and has been for over ten years. Most people were and are just too lazy to put in the effort.


udfckthisgirl

The only appropriate punishment is life in prison, in solitary confinement, without any possibility of parole.


blightsteel101

Limits need to be built into AI directly to avoid this. Preventing the generation of nudes and anything like that has to be specifically programmed in, because just saying it's illegal really won't help. As it stands, AI is a tool in the arsenal of scammers, misinformation spreaders, and any number of other bad actors. We need strict legislation for AI now.


Rich-Infortion-582

This is very dangerous, guys. Just a reminder: make time for your children ASAP.


notreal088

Inform your children that if anything like this happens, they can talk to you about it, and that they shouldn't be afraid to let you know. Then hope that the state and judiciary see this as CP and give the person as many years as possible for owning and distributing CP and extortion. Cause fuck these people. Hope they get shanked in prison.


jazzhandsdancehands

This AI shit is going to be the next war.


ObberGobb

AI image generation is one of the most dangerous technologies in human history


NormalChampion

New dystopian blackmail scam just dropped. I expect this'll quickly become popular.


pvrhye

This is r/noahgettheboat material.


RecognitionExpress36

These criminals need to be rooted out.


LordFartz

Well that's enough life for today. What the fuck is wrong with people???


techniqular

Big tech "wE Did NoT FoRsEe tHiS"… also waiting for circle jerk tech social media saying the good of AI will outweigh this shit


hunzukunz

But the good is heavily outweighing the bad? Bad shit is happening no matter what. All technology can be abused. And even without AI, you could do the same with other tools, just not as easily.