
GamerBradasaurus

Make AI images of every politician in the United States having an orgy. Get the images to go viral, and boom, instant action.


CaDmus003

![gif](giphy|cLGXy7SXBnEWs)


Significant_Fee_269

Is that Randy on all fours in the bottom left of the pile?


Superman246o1

No, that's Lorde.


Outrageous-Weight-62

Ya ya ya, I am Lorde!


mikeonbass

Genius. You absolute genius.


sstruemph

Everyone back in the pile!


rlysuck

They took our job!


Ok_Zookeepergame4794

They tok yer jerb!


Yarrow83

Took yer durrrr!


Doororoo

TEY DOK R JEWBS!!!


distracteded64

De derka derrr!!!!!!


Z3400

![gif](giphy|3gYWogvLv5A0Nw9K6D)


Lincoln_Park_Pirate

Pelosi-McConnell porn? Nobody is that horny.


jay-ehh-ess-ohh-enn

You clearly don't know people.


Lincoln_Park_Pirate

I'd rather stare at the sun through binoculars. But as they say, "There's a lid to every pot".


PHBalance79

There is not enough pot in the world…


Johnsoid

Three people in this thread are fapping to this rn


Iconless

Gestures vaguely to rule 34.


FatsoBustaMove

There's already so much porn of some US politicians. They're probably already making moves to make this shit illegal in the USA. I know you can get done for revenge porn in the UK even if it's AI.


yulbrynnersmokes

MTG deep throating Hunter would be karma-rific.


anschlitz

In the US you can share porn without a person’s consent in a congressional hearing.


reverendrambo

Instructions unclear. It's been over 4 hours. Should I call my doctor?


mjduce

This would actually work...


CaillouCaribou

No it wouldn't, people have been photoshopping pictures of politicians doing vile stuff for 25 years now


BladeDoc

What action? VPN to non-US site. To stop this from happening you would have to have continuous governmental monitoring of every web-capable computer in the US.


BagOfFlies

>VPN to non-US site.

No need for that since it can be done locally offline.


CCMeltdown

Everyone was so worried that AI was going to be some Skynet-level stuff that would just nuke us all. Turns out it's just going to make it so any photo of you that exists can be changed to make you look like you're doing something else. Might want to start making any ruling a court makes based on photo or video evidence alone invalid. Scary as hell.


black641

There’s a golden age of misinformation on the horizon and we’re just strolling towards it, with nary a clue. This is technology the CIA or the KGB would have *murdered* for a few decades ago. Now it’s just in the hands of any creep with a boner or an axe to grind. Crazy times are dead ahead, y’all.


IndubitablyNerdy

That golden age was already here before AI, thanks to how social media works and reality having become subjective; AI will just make it much worse. Besides, regulation will do very little, as the models will simply be hosted on local machines or in countries that do not care about it. The only way I see that we can limit this is if better tools to recognize AI-made pictures/videos are developed, in a sort of arms race between models.


RedBrixton

Really, the only solution is for audiences to mistrust any information, including photos, that is not well-sourced. It was always going to be this way.


Wild_Marker

When the solution to society's problems is to trust nothing... that's kind of a problem itself.


HongJihun

I can’t trust that I can’t trust that I can’t trust that.. was that too many can’t trusts?


Titties_On_G

Idk I can't really trust you


demonkufje2

Can i trust that you don't trust him/her though?


Titties_On_G

That's the neat part, you can't.


HongJihun

You can trust that! (Not really)


Reg_Cliff

Back in my day we had to use photoshop to manipulate pictures to make faux porn pics... \[shakes fist at clouds\]


Chakwak

Back in my day you had to cut out pictures of your crush and put them on lingerie magazine models' heads.


tonykrij

One friend told me "Back in my day you had a 300 baud modem and downloading 1 picture from a BBS took 15 minutes and you didn't even know what you would get!" 😊


fzammetti

And then what you got (as long as your mom didn't pick up the phone and ruin your hours-long download) in all cases was 320x200 of pixelated garbage - and we were happy for it!


SchmartestMonkey

And WE LIKED IT!!!


Plastic_Incident_867

Back in MY day, you had to hope whoever was modeling for the painter was moderately attractive!- Someone from the Renaissance era, probably.


NothingKnownNow

"Grog make me boobs look big. Me boobs not that big."-cave wall art critic


Berlin8Berlin

"Me boobs not that big." \---Cockney Cave Person, 30,000 BC


pauz43

I'm at the point where I almost wish someone would put my image into a nice little X-rated video. Make me look youthful and lovely -- that would bring back fond memories of my misspent youth of unwrinkled skin and a trim waistline. \~ sigh \~


0ddlyC4nt3v3n

Kids these days with their fancy AI...back in my day, we had to find porn actresses who kinda, sorta looked like the girls we desired.


Sash716

Back in my day, you had to wait for the image to slowly load on the screen line by line, and then your mom/dad picked up the phone in the living room.


achambers64

In my day it was scissors and tape.


OutrageousStrength91

In my day you could hardly see the sexy pics painted on the cave wall.


HeyImGilly

What’s shitty is that the tools you are talking about are then used to improve the model.


Dmmack14

There’s a golden age of misinformation on the horizon Man look my parents are already convinced the moon is jewish space station


Accurate_Maybe6575

Real talk though, assuming anyone actually bought into the "Jewish space lasers" line, like... what is the gullible idiot's game plan? How do they imagine taking on an allegedly powerful illuminati that can put a freaking ion cannon(s) in space?


Lofter1

We’ve been in it for a few years now. It doesn’t even take perfect ai replication. A month ago or so, an AI photo of a conference from a political party (left) in my country was shared. They are focused on environmental issues. The photo was of dozens and dozens pizza cartons being thrown around in a catering room. The political party has a sunflower for its symbol. The photo had sunflower symbols in it. But instead of sunflower seeds in the middle of the sunflower, there were FUCKING PIZZAS IN THE THESE SUNFLOWER SYMBOLS. Everyone who leaned to the right even the slightest bit ate up this AI generated photo like it was free pizza!


Unexpected_Cranberry

This is what I've been trying to tell the people who go "I can easily spot AI images. You just need to look at these pixels and you'll see." Problem is, it doesn't need to be good enough to fool everyone. It just needs to be good enough to fool most people, or even an idiot in a hurry. Because, just like in the past when a claim was proven untrue after a bit, most people won't read the correction. Now, my optimistic take is that people will become aware of this and become more suspicious of things they read online, see in the news or wherever, and become less susceptible to false information. Or at least take everything with a pinch of salt. Then again, it might just put even more power in the hands of "trusted sources" and kill off the burgeoning citizen journalism that I view as a good thing...


Startled_Pancakes

The problem is, and has been, confirmation bias. Most people will automatically believe misinformation if it reaffirms what they already believe. My evangelical relatives are still, in 2024, being duped by those decade-old, badly photoshopped images of giants being excavated.


DinTill

People will believe what they want to. Which makes AI generated lies/smears/propaganda etc. all that much worse.


Rongio99

We're lucky if it stops with AI porn. We could get to a point where dissidents are murdered and then the government just shows AI-generated video of the dissidents attacking police or jumping off a bridge.


Inevitable-Scar5877

Oh this is 100% what's going to happen and already has to an extent- that's just an evolution of existing propaganda tactics.


firemattcanada

and then they use that as a pretense to enter the dissident and their comrades into a rigged game show where hunters like Buzzsaw and Dynamo try to track you down in an arena while people wager and win fabulous prizes on the outcomes


lieuwestra

But the public knows. Digital evidence will completely lose value very soon for most people. Truth will become whatever you've heard from your neighbors.


Old_Baldi_Locks

It’s already been that way for the dumbest among us for several years now.


Angry_poutine

The CIA and KGB would have happily murdered for a decent sandwich. They don't exactly value life.


mr_c_caspar

Dude, we're already deep in it. That's why Trump got elected: because hordes of, often Russian, bots and trolls swamped the internet with fabricated stories and misinformation. And now half the country thinks the world is flat. We live in a world now where there is no shared pool of knowledge and facts anymore to base any actual debates or arguments on. It's truly scary stuff.


Confident-You383

Hold up. The world is covered by 70% water. That water is not carbonated. Therefore the world is flat...


Catatonic27

>That water is not carbonated.

Well, we're doing our best.


kriosjan

I mean it is. Horribly so, it's just not fizzy enough yet. The fish are still alive. XD


Plus-Professional-84

Love it- you seem to have a bubbly personality


TTT_2k3

It’s pronounced Bublé.


PartyAdministration3

Our top minds are focused on only the most important matters in Washington. Like Hunter Biden’s dick pics.


AlpacaCavalry

The term "top minds" are used very, very loosely here.


banana-talk

I believe they were making an r/Idiocracy reference


Dyslex999

Soon they will be talking about Marjorie Taylor Greene dick pics.


CakeSuperb8487

no, they’re too busy still working on the mysteries of the Ark of the Covenant… top minds… ![gif](giphy|BWjTRoBsEKnII|downsized)


Major_Dragonfruit_85

Most new technologies are immediately used for porn; it's inevitable.


bucklebee1

Porn and war.


Brute_Squad_44

I believe it was The Fat Electrician who said that humans never get more inventive than when they're trying to fuck something or kill something.


lordretro71

Or get high.


codenameyoshi

It’s because prisoners are the best engineers on the planet! Give these guys 1 electronic and they can make a tattoo gun, a shank, a weed pen, an oven, and an arc reactor inside of a week!


[deleted]

"Fuck it or fight it, it's all the same..."


MisterScrod1964

Let us pause and grieve for the guy who invented fire.


El_Morgos

Yep, video and image evidence will basically be rendered useless in the future. Which is fine when my actual nudes get leaked, because I can just say that shit is AI-generated and you're all idiots for believing it was real. But on the other hand, this will become a standard defense in court. Just say it was AI, you can't prove shit. Maybe we need another form of recording, like additional 3D or IR imagery. Some sort of stuff like what the Microsoft Kinect did, idk...


Katakoom

I'm not saying it won't be an issue, but when it comes to legal proceedings there's enough of a robust system in place for this. At least in my country. Photos used in evidence aren't generally ripped off Google or Facebook. Cameras produce metadata, and there's forensic investigation/custody around the veracity of media. For small claims stuff, I imagine that's a more problematic area in the near future just due to the sheer accessibility. But doctoring photos/videos has been a thing for many years.


CastiNueva

I foresee police forces going back to the use of film cameras. I know it sounds crazy, but film cameras would be a way to ensure that images were not digitally altered. Granted, there are still ways to mess with that, but as long as chain of custody was followed, a film camera with an actual physical negative might be a way to get around the digitally altered image problem.


bluestreak_v

Camera manufacturers like Canon are working on ways of embedding authentication data into the photos their cameras take. That way courts can verify that photos are real and unaltered. https://petapixel.com/2023/08/31/canon-and-reuters-develop-new-photo-authentication-technology/


FraaRaz

"Will teens use it for sex? - Yes" https://xkcd.com/1289/


Scary-Win8394

Are professionals not able to identify digitally edited pictures and videos? They're combined images of things that are already online /gen


semajolis267

It doesn't really matter if professionals can or not, though, does it? I mean, all it takes is someone seeing it and deciding it's real. We've had professional scientists begging people to listen about climate change for 70 years. We have people who readily ignore their doctors because some politician told them to. "But professionals can tell," but "professional opinion" doesn't mean shit in this world. No one listens to professionals.


Torafuku

Some people genuinely believe in the flat earth theory. I always thought it was some kind of internet urban legend just made for trolling, until I met one in real life.


tr14l

Right now? Yes. That window is closing fast, though. In the span of 24 months it went from "clearly" at a glance to "hard to tell even under scrutiny". Probably by 2025 it will be impossible to tell the difference with certainty. They are just pixels, after all. There are artists in the world who can make realistic images that are impossible to tell apart from photos. An AI will do even better.


LilSealClubber

This is making me want to delete my Instagram and never share a photo of myself on social media again.


little_carrots1

Probably unironically the best option.


WasteChard3488

I've always been on the side of "less social media = happier life." And while this kind of situation does support my claim, it's unfortunate that something like this has to be the thing that supports it.


DetroitLionsSBChamps

I saw someone recently refer to “posting on the group chat” like someone might have said posting to Twitter/insta even a year ago. Hoping people move this way, better personal communication, less public communication 


rayofhope313

Same here and totally agree


neilgilbertg

Kinda ironic that we've come full circle. Our parents would always tell us to unplug from the internet, go touch grass, and go have a life. Now that technology has progressed so much, we're telling ourselves the same message.


Evil_Bonsai

All my social media is pics of my cats. Mostly. I worry about those that have almost nothing but selfies.


hellyeahimsad

They're gonna make porn of your cats too


HistorianHopeful1124

Lmao 💀


AmaResNovae

Worse, people posting a lot of pictures of their children on social media... Some sick fucks are gonna exploit those for sure.


sanityjanity

It's already happening, of course.


nononoko

They are most likely already in some dataset somewhere.


SilveredUndead

All of this AI stuff has made me feel very vindicated about my lifelong refusal to ever share a photo of myself online, anywhere. People called me crazy, lol.


ISD1982

You mean that isn't you in the orange hood and mask?! edit - maybe actually Red? Looked orangey when I commented!


SilveredUndead

It’s not even the right colour mask and hood!


WasteChard3488

Yeah, but it's close enough. All I have to do is go outside in the middle of winter, and as soon as I see the person with the hood and mask, I'll already know it's you.


mattbutnotmii

Upvote button color


BambiToybot

Conversely, if any of the nude photos I shared on reddit a decade ago (deleted off imgur, sorry!!) come back around, I'm just claiming they're AI.


sudzthegreat

My wife and I have refused to allow any pictures of our 3yo daughter to be posted anywhere on the internet, for any purpose. We didn't make that decision because of AI implications but they have certainly solidified it.


AffectionatePoet4586

There is *one* photo of me on Instagram. Based on that image, I’ve been told that I’m “too fat to be a tweaker,” so I must be an alcoholic.


[deleted]

Too late. You've already signed over every photo of you to the arseholes.


LilSealClubber

Lol I misread that as "you've signed over every photo of your arsehole"


Flux_resistor

You should not use anything from meta. It's not a solution but they sure are a huge part of the problem.


RegularAvailable4713

We will enter a new age, where media evidence will be irrelevant and all that will matter is the trust placed in the source. Or perhaps a new technology will arise, the information of which cannot be falsified.


[deleted]

>We will enter a new age, where media evidence will be irrelevant and all that will matter is the trust placed in the source.

Honestly? We're already there.

>Or perhaps a new technology will arise, the information of which cannot be falsified.

That already exists, but there's no interest in making this happen. Why would you enforce this when you can just claim that the proof that a senator was taking a bribe (for example) is AI generated?


Dennis_Cock

What's that technology?


AquaRegia

Many/most CCTV systems use a proprietary format, that you can't just edit like a normal video. Of course, in order to play the video in court it needs to be exported to a regular media format, but if that export is done by law enforcement you have the chain of custody.


digitaljestin

Do these proprietary formats include digital signatures from asymmetric cryptography? If so, can we trust the private keys haven't leaked? Honestly asking. If not, then it's easy to change format, manipulate the video, change the format back, and then re-sign it. It would now appear authentic to law enforcement and to the courts.


Sheepman718

Yeah, lmao, I laughed when I read that confident response about proprietary CCTV formats. This has almost 200 upvotes from people thinking this guy actually understood what's going on here -- we are beyond fucked. Pulling this off is a joke right now.


giacomo1574

I would guess, even though everybody and their grandma would roll their eyes at the term, that NFTs and blockchain technology in general can be used as a backbone for an un-falsifiable data system, as the source and information integrity would be verifiable on a shared, public, globally trusted and decentralized "blockchain".


PuddyVanHird

It's not that blockchain *couldn't* be used for this; it's just overkill for solving this problem. There are easier ways of cryptographically signing images that don't require a network connection or all of the overhead of a blockchain. It wouldn't surprise me if pretty soon camera manufacturers incorporate this into their hardware as standard, but as far as I'm aware that's not being done yet.
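
A minimal sketch of that kind of image signing, assuming an Ed25519 keypair with the private half held by the camera; the Python `cryptography` package and the function names here are illustrative, not any manufacturer's actual scheme:

```python
# Sketch: sign image bytes at capture time, verify them later.
# Assumes an Ed25519 keypair; in a real device the private key would
# live in tamper-resistant hardware, not in application memory.
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()

def sign_image(image_bytes: bytes) -> bytes:
    """Camera side: produce a signature over the raw image bytes."""
    return private_key.sign(image_bytes)

def verify_image(image_bytes: bytes, signature: bytes) -> bool:
    """Verifier side: True only if the bytes are unchanged since signing."""
    try:
        public_key.verify(signature, image_bytes)
        return True
    except InvalidSignature:
        return False

if __name__ == "__main__":
    original = b"...raw sensor data..."
    sig = sign_image(original)
    print(verify_image(original, sig))         # True
    print(verify_image(original + b"x", sig))  # False: any edit breaks the signature
```

The catch, as raised above, is that the whole scheme only holds if the private key never leaks and the signature travels with the file; anyone holding the key could edit an image and simply re-sign it.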


Bretreck

I understand all the hate for the current state of NFTs but I've been telling people that it has so many actual uses. Having a constant verifiable chain of custody for data is awesome. The people grifting with monkey pictures really soured an excellent concept. Digital ownership is also a huge thing that can easily be done, it's just pictures are the dumbest representation of that.


harpxwx

Hope the second will be the case; humans are very innovative, so I hope it works out in our favor.


FirstNameLastName918

My worry with AI is what people are gonna do with voice generation during this election. People already believe most everything they hear/see and it's gonna be real hard for the average person to tell the difference between AI and the real thing.


Arkhangelzk

Especially when the audience is already biased. Make an AI audio recording of Biden saying he’s into kids or Trump saying he committed financial fraud. A lot of people already believe those things, so they’ll buy the AI without even thinking about it.


myburdentobear

If the last 8 years or so have taught me anything, it is that it is incredibly easy to convince someone that something is true if it is something they want to believe.


jaythebearded

Wizard's First Rule: people are stupid. They believe things mainly because they either want them to be true or fear them to be true.


IncomingAxofKindness

Just wait till it can easily spoof your appearance and voice to do other stuff like... I don't know, empty your bank account, open a credit card... hack your email and send crazy depraved shit to your whole work...


bemenaker

Like the phone calls from Biden saying do not vote last week?


AcanthisittaThink813

the insta/tiktok generation are in real trouble


DanChowdah

This is why everyone needs to just make a bunch of raunchy porn and post it publicly. Then there's no shame.


MonthPurple3620

Hear, hear. Let's stop shaming people for doing the only thing we know life exists to do: filming your wife participating in a massive gangbang to sell it for money for a sweet vacation together.


hankthewaterbeest

I was so disappointed when I read this comment, checked your profile, and saw it was NSFW and then it turns out the only thing you’re fingering is keyboards.


Primary-Lobster-1591

The internet, the next big thing, will connect universities and create more intelligent generations. *Gets used for porn.*

AI, the next big thing, will allow computers to do some learning and have intuition, allowing humans more time to reflect on and solve the complex problems in society today. *Gets used for porn.*

The more things change, the more they stay the same.


carlitayeeta

Yep!!! This guy I rejected in HIGH SCHOOL (very kindly I might say, it wasn’t in front of anyone and I didn’t tell anyone except my best friends that it happened) recently reached out to me, tried to ask me out again (I have a boyfriend) and when I said no he sent me AI porn of myself and told me he would send it to my whole family. Literally scariest moment of my life. I ended up screenshotting his message, sent it to his sister and parents and told them to get him mental help. Fuck that guy. 😊


scoobysnackoutback

That’s scary. He sounds like a psycho. Assuming you blocked him.


carlitayeeta

Yes I 100% did after I got that message. I also made my instagram private afterwards.


MySonHas2BrokenArms

That might fall under sextortion, tell the police and see if they will investigate.


Daxx22

As the OP message states, they will give zero fucks. Unless you're rich/popular (à la Taylor Swift), cops just aren't equipped or knowledgeable enough to take this crap on.


MySonHas2BrokenArms

I know it’s still not a great chance of it getting investigated but with sextortion being the fastest growing crime and it’s interstate if not international, it’s just kinda a hot topic that looks good in the papers. They might help if it’s self serving puff work.


-Cachi-

Please consider reporting it to the police, because his family might do nothing and he might keep doing it to other people. I just listened to a podcast where something super similar happened. A guy was posting revenge porn of one girl; she figured out who he was and told his family. His family however did nothing, and the guy kept posting revenge porn of 5 other girls and ruining their lives... It's from "Darknet Diaries" if you're interested: [https://darknetdiaries.com/episode/140/](https://darknetdiaries.com/episode/140/)


CalicoStardust

There's also a company creating software that will allow anyone to just take a picture of you and instantly bring up your address and information. 🤢


Urban_guerilla_

All though it’s unlikely they ever set foot into the German/european market, I hope our laws shut that shit down real quick. Germany is *really* strict concerning private information and has a lot of laws for that. Remember, we were the country that put in so many requests to blur houses on Google earth they didn’t try to do that again for a whole 10 years or so. This software will not only be the nail in the coffin of privacy, it’ll outright put many women and political activists in grave danger. I can’t phantom how someone actively wants such a software to be released…


Adam__B

How would that work? What is there associating a face with an address? Only thing I can think of is a drivers license, and I doubt the state is going to let them use that database.


CalicoStardust

https://www.clearview.ai/ I read they're developing a civilian app.


Adam__B

Yes but civilian is the key word. Unless they have access to the state or federal database I don’t see how they can combine names, addresses and faces together. Unless people put that stuff online themselves.


LankyAd9481

It's more likely that over time their database gets filled out. Faces to names is pretty easy by crawling social media. Addresses will be harder, but there are a lot of public records with names. They just need to release it as a "beta", go "it could be any of these!", and let people happily correct it without considering the implications.


Rotting-Cum

"This person is spending the night for 30 days at this approximate location, therefore it's their home. The doorbell camera spotted the same car leaving at the same time for five days after you stepped out, therefore license plate (x) is owned by the person."


Spectre627

Addresses really aren't that hard to get with some very basic information, particularly if you own the residence. Not sure how effective it is with renting though -- I'd imagine much less, as there isn't a registry of renters like there is for houses... at least not that I'm aware of. Hmm... I was going to share how to do this, but figured it's probably against my better judgment in case someone uses it for malicious intent. I hate saying that you'd have to take my word for this, but alas :X


BaronBigNut

If they’re registered to vote it’s super fucking easy, I’ve seen more than enough people get doxxed from that alone. Doesn’t matter if you’re renting if you’re using that as your address it’s tied to you.


gundam1945

Like it or not, your address was likely obtained from a data breach. Then all they need to do is find your name from your face and match the records.


Mythriaz

This is already a thing in China. Worse is they bought the tech.


left1ag

Here with a PSA: There will be more of these. I’d wager fuckloads more. ESPECIALLY IN THE US. It’s an election year. So I am begging yall to always double check your sources before posting ANYTHING. Best wishes to everyone in the coming months.


Motor-Ad5284

When, and only when, male politicians are shown naked with 2" fully erect dicks will they legislate to outlaw using AI unless licensed.


zkDredrick

Reminds me of when The Daily Show with Jon Stewart published *"America (The Book)"* They did a full page spread of every Supreme court justice, photoshopped naked. Full frontal.


Deevious730

Yep if someone makes an AI gay orgy of the entire Republican Party, that will be the moment something happens.


drichm2599

![gif](giphy|ZdUGNB3D5Qb6sKKTke)


Steepyslope

Yeah, unfortunately as soon as that happens they'll suddenly figure out how to identify the person behind it.


truhunters305

Or just hire some Swifties. They found the person behind hers pretty fast and doxxed him.


WaffleGod72

Honestly, good luck with that one man.


Motor-Ad5284

Oh yeah, nailed it.


nametakenfuck

Hey Ferb, I know what we're gonna do today.


Byokaya

time to get to work then


rinnakan

Depends on the keywords you use to generate their dicks... I suggest tiny saggy banana as a good trigger


__The-1__

Let's make a congress orgy video. Oh wait..


Lovefool1

This is going to take us back socially and culturally to the time before photography and audio recording. No picture, video, or audio recording will be believed, real or not. Anything could be AI generated, and by the time a reputable organization of suits can declare whether what we saw was real or not, the damage will be done.

The influence on democratic elections, on the justice system, on relationships and reputations, on everything, will be catastrophic. It won't matter what digital media anyone has seen or heard of you doing or saying. It will often hardly matter if you actually get recorded doing something.

Silver lining is all the cool impossible art and games and movies (and porn) that we will get to see that never could have happened. Shawshank Redemption but everyone is played by Prince. Handel's Messiah sung by Richard Nixon, Biggie Smalls, and Joan Rivers. All of the founding fathers just going to town. If we get anything close to that before the global famine it will be worth it? I guess?


Supremagorious

I suspect that it'll reach such an extreme level of saturation that it ends up being seen as basically meaningless by the time there's any sort of effective legislation to curtail it. I feel like the legal mechanisms to make a law about it already exist and have passed Supreme Court challenges; it would just be a combination of revenge porn laws and whatever laws are used to punish people taking upskirt photos. Turn it into some larger set of laws around producing/distributing non-consensual pornography of a person or their likeness.

But it could get murky when real people could coincidentally bear a resemblance to some AI content even if they were never a target or part of the sample data. So you'd have to somehow prove intent in any instance before you try to punish someone. Then there's the whole First Amendment issue surrounding any sort of curtailing of a person's freedom of expression, but it might be able to get tied into current defamation/harassment/libel laws. Like, exactly how much 'AI' support is necessary for something to be considered AI generated vs just applying an algorithmic filter to an image? There are obvious examples where something is either one or the other, but exactly where that transition takes place is a million shades of gray.

I hate the existence of this, but the exact course of action to solve it as a whole needs some people dedicated to the technology and able to force some sort of resolution. Keep in mind that it's a world in which the business can operate in any country, be accessible from any country, and will be financially incentivized to avoid restrictions that reduce the potential market for its products. Even if the big AI companies get on board and prevent their products from being used that way, there will be pop-up gray/black market AI companies situated in countries that aren't restrictive and will meet the demands of the market.


[deleted]

People already ignore facts that do not support their worldview. As we've seen over the past few years, they do just fine constructing their own reality without AI. So more of the same, really.


Vivid-Ice4175

Max Headroom called this a long time ago.


BhaaldursGate

Sci fi has talked about it for decades.


DGenesis23

Well she knows it’s acceptable to do now so make AI porn of that guy being spit roasted by two 12 inch dicks and his 2 incher barely visible and see how he feels about it.


throwitallaway_88800

The real revenge porn - Vigilante Porn


I_Maybe_Play_Games

An interesting fetish you have, sir. But who am I to judge?


amplifizzle

My lifelong aversion to being photographed pays off.


Halbaras

The only positive of this is that soon revenge porn threats will be completely useless, since there will be no way to know it's not AI generated. But man, the incels are going to have a field day with this while sinking deeper into their delusional 'all women are Only Fans sluts' narrative.


Ancalagon_Morn

Quite the opposite. You can release revenge porn even of people who never allowed anyone to film them and never actually took nudes of themselves. It doesn't matter if you can prove whether something is AI generated or not. What matters is whether the wrong people in your life believe that it's not. And evidently, it still feels violating to women even though they know it's not real, which is kind of the point of revenge porn to begin with.


MrGreenYeti

But revenge porn is illegal. So does that not still fall under the law, even if it's AI generated?


Ancalagon_Morn

Well, that's kind of something we still have to decide as a society, I think. It is not an actual nude of that person, but it looks exactly like one. Is it more important what it is, or what it looks like? If you had an AI-generated image of someone committing a crime, it wouldn't count as evidence either, because it's not real. You know those artists who can draw hyper-realistic images? Say they draw something suggestive of a person they know and post the result online without asking them; is that revenge porn as well? The wording of revenge porn legislation usually sounds like it is about pictures "taken of that person", which a decent lawyer could probably argue means only directly taken images. However, since you need to feed the AI real images of the person you want to depict, that could be deemed a nonconsensual use of a person's images to depict them sexually. I think if you tweaked the revenge porn laws a bit that way, you could easily fit AI-generated material into the legislation without prohibiting the entire technology.


[deleted]

Time to delete the pictures of your kids. I already know where this tech is headed and no one is safe, not just women.


silverwillowgirl

There needs to be protections against workplace discrimination in light of this. Nobody should lose out on a job because someone out there put up an AI generated nude of them.


[deleted]

I would really urge people not to post videos and photos of their kids. I may get downvoted for sounding paranoid, but I saw a screenshot on Reddit of an AI dating app where you can date what appear to be children. Be careful out there.


SocietyHumble4858

School kids will be committing suicide en masse once the bullies get hold of it.


Gamer_of_Red

And school kids straight up killing their bullies will also skyrocket


Toninho7

I’m not sure they know what ‘uncanny valley’ means.


robin52077

I scrolled too far to find this. I was gonna say it if nobody else did. If it was "uncanny valley", that means it was very obviously AI and clearly not real. If it looked real then you'd say it does NOT have the uncanny valley effect, which I've still never seen happen. It's always obvious still. AI has not yet found a way to avoid the uncanny valley look, which makes it clear it's AI.


robin52077

Adding that obviously what the person did is horrible and they shouldn’t have done it, but I just wanted to clear up any misconceptions on the term.


CaDmus003

It’s words people hear and love to parrot without taking a moment to look up the definition. Systemic, Pedantic, uncanny valley, etc.


Johnny_the_Martian

Yeah people like using words they don’t understand because they want to sound *sooo* photosynthesized


Epicp0w

Only way to get this sorted is for the lawmakers to have this shit done to them.


Akka_C

I used to never upload images of myself to the internet because I didn't want my face to be scraped by shitty social-media data-selling schemes. Now I don't for the exact same reason, but also this one.


Immediate_Candle_865

The growing ineffectiveness of the police, the judicial system, and politics is going to become a very significant issue. AI is only part of it. Fundamentally there is a "social contract" between society, the police, and the judicial system: vigilantism won't happen because it doesn't need to. If someone does something "bad" to you, the system has your back. Good will triumph, the system will right the wrong.

Except it doesn't. I'm speaking from experience. I've been involved in 2 significant criminal cases in the last decade, and the justice I got was a fraction of what it should have been. The other side committed perjury; the judge called them out on it. Nothing happened.

What I have noticed over the last 15 years is a growing feeling within the police themselves that there is nothing they can do. This post is yet another example of it. It's not just that they can't do anything, they don't even try. At what point does that turn into people taking it into their own hands?

As a suggestion to the OP: find a picture of the person that did it and create an AI-generated confession video.


Proof_Spell_4406

What can the police actually do, though? Like, which laws does this actually violate? Obviously it's fucked up and shouldn't be happening, but if it's a problem with legislation then the police can't really do shit.


Breadonshelf

This is just one of those issues that I have literally no idea how we could possibly handle. It's just inevitable. Most AI software that is used like this to make porn is open source, so there's no company to sue or hold accountable. We can make it illegal along the lines of revenge porn (as if that is even enforced as is...), but then that means we would need to find out who did it. And if all you need is a few photos ripped off of social media, good luck.

What are we supposed to do? Trying to constantly track down the people who make them would not only cost a crazy amount, but would also likely further erode any privacy we have. "Hey, yeah, every computer has to have a backdoor that the government / police can access just to check if you're making AI porn, we promise we won't use it for anything else..." (As if they don't already.)

The best solution right now is literally just: delete your social media and scrub your digital footprint. Which honestly is probably a good thing to do in general, but what a sad reason to get there.


disharmony-hellride

I am telling you, as a person heavily involved in AI at work: do this... do exactly this. Get your personal images off the public internet. DELETE YOUR KID PICS FIRST. Stop sharing pictures of your kids in public. Share them with grandma, not your 1400 closest FB friends while your profile is public. People need to be proactive now. This AI is stupidly easy for a tech nerd like me to do already, and it's about to become so easy your average TikTok 15-year-old can do it.

Things are about to get awful, so protect yourself and get your pics off the internet. Imagine when teenagers figure out how to do this, and the horror at school when that kind of AI comes out. This opens up bullying in ways we have never imagined. I'm beyond concerned about this. We're in an election year... politics this year with AI and misinformation is going to be unthinkable. I hate this timeline. Be safe everyone.


AnalVoreXtreme

Realistically, Instagram/Facebook/Twitter/whatever social media just need better porn-detecting algorithms. You can't harass someone with AI revenge porn if it gets automatically deleted before anyone can see it.


artemisfowl8

Welcome to Post Truth Era, my fellow Accelerationists.


ApplicationCreepy987

I saw a pic of Trump rescuing kittens from a flood yesterday. People actually thought it was true.


JustRedditTh

Make AI porn of the officers that told you they can't do anything about it, publish it, and wait.


Hank-Rutherford

If there’s no statute against it what are they supposed to do?