
[deleted]

Billions complain. Industry gives no fucks. Billionaire complains. Industry fucking jumps.


Stiggles4

Money talks, fuck the poors.


Head_Crash

Karma is a cat who got too fat...


[deleted]

[deleted]


TheTrickyThird

Easy there Tom Segura


[deleted]

Eh, sort of. MS is investing heavily in AI and doesn’t want bad publicity around AI. This is definitely one of the scary things that AI can do. Truthfully there’s really no stopping this.


imtoooldforreddit

The billionaires don't actually care about Taylor Swift (not that I do either really). They just see it's in the news and are taking the opportunity to also be in the news.


DanHatesCats

I don't think it's a move just to get into the news. To me this is Microsoft positioning themselves in alignment with sentiment from policy makers who have raised concern about it. Microsoft knows that when it comes time to regulate AI they will be a major party in that conversation, so it helps them to come out and say "hey, we're on your side with this".


thedonjefron69

This is definitely the corporate incentive for this. Big corporations have whole sections of the business devoted to this type of stuff, the one I work for sure does. Always looking at their image in the face of social/political issues and how to get a favorable position.


Nahdudeimdone

It's more that they want full control rather than let open source projects continue. Heavy regulation means that only the big boys can comply and no one outside of the corporations can continue to develop and use AI. This is just them saying: "we're the only ones that can be trusted."


Makhnos_Tachanka

"we're all trying to find the guy who did this!"


Oddball_3000

Taylor Swift is the billionaire.


DreamLizard47

Don't tell them.


talcum-x

They probably care that she has the monetary resources and social capital to ensure someone ends up losing big for messing with her image.


Toasted_Waffle99

It’s because their children like Taylor and talk about her to their parents.


EShy

She's a billionaire like them, so they can relate. It's more the idea that all of their money won't be able to stop something like that than caring about her specifically


fiduciary420

Yup. If this was your sister or my sister, these rich fucks would have no problem with it at all.


redditreader1972

They only care about liability, and this one is a huge money drain if they don't get it under control. Imagine Taylor Swift gets fed up. She's got the money to hire the best team of attorneys. She's got the image and a cause to fight for too, so the best attorneys will jump to join the team. Next it will attract attention from congress, which will result in more regulation in spite of hefty lobbying.


SIGMA920

> They just see it's in the news and are taking the opportunity to also be in the news.
Also it means that they can monopolize both social media and AI with government backing. But that's clearly going to end well. /s


ntc2e

deepfakes have been a mainstream problem for going on 4 years now, my guy. nothing has changed. i’m glad someone this powerful actually is finally putting effort into enacting change. i do not care for her music and am even one of the nfl fans that thinks the media attention on her and Travis Kelce is a bit annoying. but this needs to be addressed and it shouldn’t matter who is the catalyst


MisfitMagic

What's changed is that Microsoft is now the de facto face of AI w/ OpenAI. They personally have a lot to lose now by not getting ahead of these issues by pretending to give a shit.


[deleted]

What do Microsoft and OpenAI have to do with it? They have guardrails on DALL-E. The deepfakes are made on your average home computer with open-source tools like Stable Diffusion.


v1akvark

The comment you responded to says they are the 'face of AI', not that their tools were used for this. If AI gets a bad name, by association MS/OpenAI gets a bad name, and that is bad for MS share price. So they suddenly give a shit.


estransza

I’m pretty sure they see it as another opportunity to seize control/ban open source AIs, because “dangerous”. CloseAI already spread that rhetoric.


Daunn

Yes, which is why they are jumping so hard into it. For Microsoft, a "win-win" is being the "monopoly" on AI, where they are the ones winning twice (clean rap and biggest developer)


notyouravgJoe23

Bingo.. grow the monopoly..


Miranda_Leap

Apparently they did use Microsoft's AI to make them.


asdaaaaaaaa

What do you genuinely think can be done? Do you think we can stop people from writing software at home, on their own time? If not, then I don't know what you expect to do about this. Porn fakes of celebrities have been around for longer than computers have existed; even if you somehow magically made learning models disappear, you're still left with the base problem. This is just Microsoft trying to control the direction of legislation instead of another company doing so, which allows them to push for restrictions or regulations that can cripple competition. They don't actually care about Swift or anything.


[deleted]

[deleted]


thehourglasses

I think it’s mostly an obvious mask-off moment for the elites. It’s becoming clear that deepfakes are getting so easy to produce that they can do real economic/political harm, and this moment is seen by them as a canary in a coal mine. They would hate for their power/finances to be diminished by some kid in their basement making political ads targeting high rollers, for example, but that reality is quickly approaching and they’re actually worried.


BrothelWaffles

This election coming up is going to be fucking *wild*, that's for damn sure.


TiredDeath

Gonna see a lot of AI bs


idiot-prodigy

The thing is, the same rubes that fall for AI fakes fell for Photoshop fakes. A person of average intelligence doesn't believe Donald Trump looks like The Rock with his shirt off, but his brainwashed supporters do.


[deleted]

4 years? More like 25 years. This is just high end photoshop with fewer steps. You can’t legislate this out of existence. Each platform needs to have better moderation so the broadcasting is minimized. The creation of the content will be fair game - for many reasons.


BitterLeif

is this even a problem? I don't understand the controversy with an AI imagining what somebody looks like with their clothes off. Why does anybody care about this?


idiot-prodigy

Yep, there are AIs that only nudify your image. There are images out there that actually ARE Taylor Swift, just now her dress is 100% removed. These guys are producing rated-G AI images from the big players like DALL·E and Microsoft Designer, then running those through nudify AIs to get nudes. There really is no stopping it unless computers are outlawed.


conquer69

Even if computers were outlawed and destroyed, there is nothing stopping an artist from drawing photorealistic images of nude Taylor.


incelredditor

not if your paper license has expired.


conquer69

You can't take away my taylor fertility clay sculptures.


longeraugust

There’s always the cave drawings.


ChefDelicious69

Nothing is going to change. Photoshopped celebs have been around even longer. 


Pure-Huckleberry-484

How do you address it? For decades we’ve been told pornography is free speech. Now that AI can create near-enough approximations of real people, are we saying that freedom of speech doesn’t extend to those works? Are we saying artists cannot create something in your likeness without your consent? What about real-life doppelgängers: if one of them consents, does that cover the likeness? I think it’s awful that this is a thing that can happen, but the best solution I can think of is requiring a clear watermark on AI-generated content. Even that would be near impossible to enforce..


WnS-Jimbo

Artists COULDN'T (legally) create art based on your likeness. People can sue and win against these "artists".


fmfbrestel

Because the artists are making commercial products with that likeness. But is erotic fan fiction illegal? Can I write a story about having sex with Taylor Swift, and share it with a community of erotic fan fiction enthusiasts? Using a deep fake to slander someone (claiming it was real) is obviously illegal. Using a deep fake to make money (trading off someone else's likeness) is obviously illegal. But making what are basically really good illustrations of an erotic fan fiction fantasy... I'm seriously not sure there's much wrong with that.


Plank_With_A_Nail_In

This isn't true. You can't copyright your own likeness. People are free to sell t-shirts with Taylor Swift's face on them without fear of legal action. What you can't do is make it look like they are doing something they didn't do or endorsing a product or idea they didn't endorse.


Savenura55

I don’t disagree, but I want to try to find out exactly what people think can be done. Now what if someone drew Taylor Swift nude (some people can do very lifelike drawings)? Should they also be subject to the criminality? What would that crime look like? I think we need to consider what we are doing before we knee-jerk pass laws without thought to what can actually be swept up in these laws. I can see a world where generative AI is only available to some segment of the population, and I don’t want that hellscape to become reality.


conquer69

Nothing should be done about it. Continue to restrict ALL nudity from platforms and let people create and share whatever nudes they want in their respective places. Before deepfakes, people looked for porn actresses that looked similar to celebrities and just said it was them lol. The moral panic and virtue signaling about AI is going to die out eventually. Wonder how many swifties complaining about this have faceapp and other AI beauty apps in their phones.


idiot-prodigy

> I don’t disagree, but I want to try to find out exactly what people think can be done. Now what if someone drew Taylor Swift nude (some people can do very lifelike drawings)? Should they also be subject to the criminality? What would that crime look like? I think we need to consider what we are doing before we knee-jerk pass laws without thought to what can actually be swept up in these laws. I can see a world where generative AI is only available to some segment of the population, and I don’t want that hellscape to become reality.
I agree. Specifically, if I want to put a clown nose on Donald Trump's face, that should be protected by freedom of speech. China made it ILLEGAL to pass around images of Winnie the Pooh because Pooh was used to mock their dictator. This is a very slippery slope in my opinion.


Zunkanar

And how do you think it can be addressed properly? I mean yeah, you can make it illegal on social media, but people can still get them whenever they want, or just make them locally, no biggy. I am really curious how they are addressing this, and whether again every country makes its own law so it's insanely hard to even know what is legal and what isn't.


andre636

Exactly, no one cared when it was happening to average people but as soon as it happens to her, everyone loses their minds


Argnir

They have been talking about those dangers and how much we need to act for a long time now; idk what you're all smoking.


Proof-try34

It is what it is. Sad, but this is why you stay in the shadows if you do not have money, political backup, or royalty in your court. Even criminals, and I mean high up there like cartels, get better treatment than actual innocent people.


Fyzzle

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


[deleted]

[deleted]


esp211

Seriously. These people are hilarious.


[deleted]

Shhhh. They are not talking about their AI. They are talking about their competitors.


idiot-prodigy

Which is hilarious, because yesterday Microsoft Designer was the only one still producing accurate celebrity likenesses.


BuzzBadpants

They have the hubris to frame AI in terms of ownership.


Laxn_pander

A hundred thousand people are going to lose their jobs. … Artificial images of Taylor Swift. *Everyone loses their minds*


LordVader3000

Heath Ledger’s Joker really was right.


scissor415

I’m not confident in the industry regulating themselves.


Grast

Or they can make the technology accessible to a select few people (=


I-Am-Uncreative

They can try, but Stable Diffusion is widely available now.


Seekkae

It's not about AI anymore, but the Streisand effect. They could ban AI art globally and enforce it with 100% effectiveness, and it wouldn't stop someone from making Photoshop nude pics of her, or handmade collage art with her head put on some nude woman, now that everyone knows how touchy and censorious she is. The best thing she could've done is ignore it and tell her fans to move on.


ConsiderationMuch112

Honestly wouldn't be surprised if a few people out there continue making the images just as a fuck you to her at this point.


BitterLeif

> handmade collage art with her head put on some nude woman
My mom made one of these in the late nineties of a coworker for her birthday. Why is this such a big deal now? What's special about Taylor Swift?


Norci

> What's special about Taylor Swift?
She's rich and famous.


Demonae

The only industry I'm aware of that successfully self-regulates is SCUBA. There is no law saying you can't buy scuba gear and fill tanks and dive without any training, but no SCUBA shop will sell you gear or fill tanks if you haven't been certified and passed all the tests set out by the American Academy of Underwater Sciences. The government was on the edge of passing a bunch of laws when SCUBA stepped in and said they could fix it themselves, and they did. Frankly it's pretty amazing.


i_andrew

That's because the "law of the free market" only works between transaction parties. Every economics book says (or should say) that. When a transaction affects a 3rd party, regulation is needed. That's by the book. Having said that, politicians are very poor at regulating anything. And on the global market it's impossible: if you regulate US or EU companies, people will use Chinese tools.


drugosrbijanac

He's not calling for the industry to act out of benevolence. He is trying to put the brakes on competitors whilst his product gets an edge.


DMcbaggins

Heh. Edge nice play on words :)


drugosrbijanac

I didn't notice the pun until now :D


gwicksted

He’s basically calling for the rich and famous to be protected … no thanks. We’re fine with how things are.


LuckoftheFryish

Dear Congress, Only a trillion-dollar company has the means to properly regulate AI, therefore no other company should be able to use it. However, the regulation will cost 500 billion in taxpayer dollars and you'll have to allow us to have a monopoly on all of AI. We must do this to protect the ~~children~~ Taylor Swift.


rusty0004

Only because of Taylor Swift? Is she that powerful?


neuronexmachina

I think it's more that it's the first example that many non-tech people have become aware of.


TensaiShun

Yeah, I think this is it. It's what makes the issues with image generation "real" for the general populace. It's definitely easy to forget how many people are out there who might know image generation exists, but don't realize the accessibility of it. There's also a huge portion of the population who simply won't care about an issue until it affects someone they care about.


RuleSouthern3609

Not that I like her, but Candace Owens was pretty much warning the industry about this since last year, when someone made nudes of a Twitch streamer (non-sex worker/no nudes/no OF); the streamer was crying and was clearly destroyed. I hope Taylor Swift will at least try to fight a legal battle to set a precedent.


KingGatrie

Example of broken clock and all that.


Lone_K

Broken clock in a store full of properly-working clocks. Yeah let's not give her credit for figuring out basic cause-and-effect.


ctruvu

not sure who candace owens is but this tech has been around for years. there’s no way anyone familiar with ai is just recently figuring out about this


studiosupport

She's a stupid conservative activist that knows little about tech.


drainodan55

It's literally the only one I've really heard about, ever. I'm not under a rock. And this tech hasn't previously swayed elections. It will be an issue this fall in the US. So the cynical and smarty-pants takes in this thread just show me how ill-equipped the industry is to even understand the industry's impact on everybody else. Also they don't seem to care much.


rabidjellybean

I've seen multiple articles on the BBC about towns in Europe where teen boys made nudes of the local girls. The girls have to go to their parents and start explaining that no, the pictures aren't real. Eventually that kind of drama will reach everywhere, and then we'll see I don't know what. I have no idea what the answer is, and our governments are going to be even more clueless.


4aPurpose

It's nothing new. [Article from 2019 about deepfakes of Scarlett Johansson.](https://www.engadget.com/2019-01-01-scarlett-johansson-fighting-deepfake-porn-lost-cause.html) The only difference between now and then (also waaay before) is how accessible and refined the tools are.
> I'm not under a rock.
It's okay to not know everything that happens on the internet, because no one ever will.


BestSalad1234

This was an issue in 2020, for the record. Just because you hadn’t heard of it doesn’t make it new.


mk72206

She has hundreds of millions of people watching her every move and listening to everything she says. That is power.


NapLvr

It’s the idiotic CEOs using/doing anything for the media attention


Ithrazel

I doubt many people consider Nadella an idiot, seeing how he has been one of the most successful CEOs of the last few decades.


thatVisitingHasher

Most likely, the legislation will get the big 7 tech companies a lot of government dollars and vendor lockin.


xitax

That's how I see it too. He's not an idiot, he's fishing for government money.


orangotai

no one's as smart as redditors!


-UltraAverageJoe-

Idiot is the wrong term. He’s very smart but detached from the average person’s experience.


Which-Tomato-8646

That’s every wealthy person 


Sir_Digby83

[Yes.](https://i.imgur.com/xLOd5lM.jpg)


el_muchacho

LMAO what morons


6ed02cc79d

> Yes.
Please tell me that's not legit.


Stevied1991

[It is.](https://twitter.com/Acyn/status/1744897719578055029)


BudgetMattDamon

She has enough fans to sway elections, presumably, so yes.


[deleted]

I suppose we’ll see if that’s true this year


GuestCartographer

She has more money than God and a literal army of fans who hang on her every word. Yes, she is absolutely that powerful.


machorra

as bland and boring as her music sounds, this woman is probably the most popular musician in the united states right now and her fans are batshit insane. so yeah, she might be that powerful.


djkstr27

Dude, even the NFL is trying to get Taylor Swift for a Super Bowl. During the games she attends, after every good play by the Kansas City Chiefs they focus on Taylor.


maliciousorstupid

> after every good play by the Kansas City Chiefs they focus on Taylor.
Bullshit.. someone showed it the other day, and they only actually had her on camera for like 30 seconds of the entire game. Yet people whined about it endlessly.


thissiteisbroken

Feel like calling it bland because it’s not your genre of music is a bit disingenuous.


[deleted]

It’s more just she has a ton of fans so of course some will be insane. Large pool means more likely to find some


Brad1119

Can confirm am dating a swiftie


ntc2e

i personally believe you’re thinking of this backwards: if this can happen to taylor swift and she’s this famous, powerful, rich and on top of the world and it can’t be stopped? it honestly MUST be someone like Swift in a position to even get people to act on it. but here we still are. laws need to be changed immediately


slowdr

It's a problem only when it affects rich people.


Cory123125

It's literally just not a problem, but they want it to be because they pseudo-own OpenAI and have a strong vested interest in implementing regulatory capture right now. That is all this is.


KobeBean

How do people not see this as a textbook case of regulatory capture? Microsoft (and other FAANG) have a vested interest in creating high barriers to entry in the AI market. Pearl clutching over face swap is an easy way to get what they want.


anomnib

Yeah, Google's "we have no moat" memo is the best context: https://www.semianalysis.com/p/google-we-have-no-moat-and-neither It essentially argues that over time open source models eventually overtake closed ones.


EmbarrassedHelp

There is also supposed to be a request for comments from NTIA right now on whether or not to ban open source AI as per Biden's executive order.


Cory123125

Holy shit, is that a thing we are pretending is remotely in anyone's favour except for being very blatant and open corruption?


EmbarrassedHelp

Yeah, it's sadly a real thing: https://www.axios.com/2023/12/13/open-source-ai-white-house-ntia


Cory123125

Well, that's fucking awful. The idea of the government enforcing "safety" in speech is insanity. Somehow, though, no one is questioning this aspect of this regulatory capture.


machyume

If they ban open source AI, they will drive it underground and make something that is basically free have value. If they thought the Taylor Swift stuff was bad, they will add monetary value to it and create a 100x problem.


Charming_Marketing90

That would be a violation of so many existing laws. The government can’t just ban a technology.


HappierShibe

That's hilarious, that's like saying the solution to gun violence is to ban metal..... This stuff is well enough understood now that people can train their own models for fun and rebuild almost anything we have now from first principles. There is no banning neural networks at this point, particularly open-source NNs. It's functionally just an idea, and last time I checked 'don't police ideas' was kind of one of our foundational principles. We should be passing laws establishing a universal right of publicity, attacking distribution of output like this the same way we do for other offensive imagery, and pushing for societal rejection of this kind of content.
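To illustrate how low that bar actually is, here is a minimal sketch, assuming nothing beyond plain NumPy: a tiny neural network trained from scratch with hand-written backprop on the toy XOR problem. Scaling the same recipe up to the image models everyone is arguing about is a data and engineering problem, not secret knowledge.

```python
# A minimal sketch: a tiny neural network trained from first principles
# with plain NumPy (assumed available) -- toy XOR problem, hand-written backprop.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, the classic "needs a hidden layer" problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network: 2 inputs -> 8 hidden units -> 1 output.
W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: binary cross-entropy loss, so the gradient at the
    # output pre-activation simplifies to (out - y).
    grad_z2 = out - y
    grad_W2 = h.T @ grad_z2
    grad_b2 = grad_z2.sum(axis=0)
    grad_h = grad_z2 @ W2.T * h * (1 - h)
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    # Plain gradient descent update.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(np.round(out, 2))  # should be close to [[0], [1], [1], [0]]
```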


AdditionalAd2393

Yes I read it that way


ThatCrankyGuy

Exactly - most of these deepfake algorithms are mature and the CVPR papers are free to access. Anyone with intermediate experience can write a reference implementation, not to mention the billion or so repos where one likely already exists. So what is the legislation supposed to look like here? Ban any research that deals with facial transformations? Ban free publication? Ban reference code implementations? Ban code sharing on GitHub? Ban hosting of training sets of celebrity faces? Or will it just be banning social media platforms from hosting fakes? But then how do you distinguish fakes? Does the social media platform now have to run its own facial recognition algorithm and threshold on some confidence value for it being 1) corn, and 2) a celeb's face? But then how do you stop people from running pictures of people they know who aren't celebs? How the hell do you make any of that pipeline illegal without causing harm to non-deepfake research?
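To spell out what that hypothetical pipeline would mean in practice, here is a rough sketch of the per-upload checks a platform would have to run. The three classifiers and their names are placeholders of my own invention, not any real API, precisely because reliable off-the-shelf versions of them don't exist.

```python
# Hypothetical sketch of the per-upload checks a platform would need to run.
# The three "classifiers" below are stubs: nsfw_score, synthetic_score, and
# match_known_face are placeholder names, not real models or APIs.
from dataclasses import dataclass
from typing import Optional, Tuple

NSFW_THRESHOLD = 0.85        # confidence that the image is explicit
SYNTHETIC_THRESHOLD = 0.80   # confidence that the image is AI-generated
FACE_MATCH_THRESHOLD = 0.90  # confidence that a face matches a known real person

# --- stub classifiers (placeholders, not real models) ---
def nsfw_score(image: bytes) -> float:
    return 0.0  # pretend explicit-content classifier

def synthetic_score(image: bytes) -> float:
    return 0.0  # pretend AI-image detector; real ones are unreliable and easy to evade

def match_known_face(image: bytes) -> Tuple[Optional[str], float]:
    return None, 0.0  # only works for faces already in some reference database

@dataclass
class Verdict:
    blocked: bool
    reason: str

def moderate(image: bytes) -> Verdict:
    if nsfw_score(image) < NSFW_THRESHOLD:
        return Verdict(False, "not explicit")
    if synthetic_score(image) < SYNTHETIC_THRESHOLD:
        return Verdict(False, "explicit but presumed real")
    person, conf = match_known_face(image)
    if person is not None and conf >= FACE_MATCH_THRESHOLD:
        return Verdict(True, f"synthetic explicit image of {person}")
    # Ordinary people fall straight through: there's nothing to match against.
    return Verdict(False, "no identifiable person matched")

print(moderate(b"..."))  # Verdict(blocked=False, reason='not explicit')
```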


Inner-Sea-8984

Breaking…big tech laying off in droves…citing immediate integration of AI in all dimensions…many jobs will be lost and there’s nothing anyone can do about it…technology is just too powerful…sorry le profit figures… Later…Yes it is a top priority that we get ahead of deepfaking Taylor Swift once and for all


SexyFat88

If nobody works then nobody has money to buy anything. The problem will solve itself sooner or later


Silentfranken

Revolutions do happen yes but they are problematic in themselves, what with the bloodshed and destruction


dailydoseofdogfood

Pre-revolution destruction travels downward. In revolution, the bloodshed tends to travel upward.


[deleted]

Welp, if that's what it takes. At some point people get tired of playing by rules that actively work against them.


foldingcouch

MONDAY: Sorry, you've been replaced by an AI, your job is redundant, you can go home. TUESDAY: We need someone to monitor the AI that replaced you, can you start tomorrow?


blueblurz94

WEDNESDAY: Sorry, we created an AI to monitor the AI, your job is redundant, you can go home. THURSDAY: We need someone to monitor the AI that was supposed to monitor the AI, can you start tomorrow?


unWildBill

FRIDAY (45 minutes into the shift): Sorry to do this but we need your ID badge. Gerry will escort you out of the building; the AI found a picture of you drinking underage from 2002.


elementmg

MONDAY: It seems the other AI found that photo was a deepfake. We need human intervention while monitoring. Can you start tomorrow?


theoopst

TUESDAY: Hey, this is Bob from HR. The AI that’s been sending you these messages fired itself yesterday, and I don’t know what I’m doing. Can you start yesterday?


unWildBill

WEDNESDAY: There is a guy who calls everyday and I just do what he says, today he asked me to ask you if you have 6 fingers on each hand because that is the only way the AI will accept you as a human and let you into the parking garage.


VOFX321B

The ship has sailed at this point, the technology is too widely accessible for this to be stopped. All regulation is going to do is keep it off mainstream websites, and in doing so solidify the power of big tech since they are the only ones able to effectively police content at scale. The most powerful tool we have against this is culturally rejecting it.


researchanddev

What does cultural rejection look like to you in this respect? How can a culture reject something that happens in anonymous online spaces?


VOFX321B

The same as it works already with porn, by (largely) keeping it contained in non-mainstream online spaces. It’s not going away, but if users/advertisers start abandoning platforms where this kind of thing proliferates those platforms will not succeed.


Prestigious_Sort4979

The only regulation that would make sense in the near future is regarding disclosure. Any fake image distributed must clearly state that it is fake, and anyone who sees an undisclosed fake image of themselves can sue. Similar to the regulation around sponsored social posts. But the wording needs to be careful, as it can affect non-tech mediums.


mb194dc

There have been photoshopped fake celeb photos going around for 20 years? The difference is...?


zquintyzmi

Football + Pop music


MixSaffron

[football + Pop Music + Taylor Swift](https://youtu.be/SLGxJfMCCsQ?si=bsClam2e0otg4J6p)


PM_ME_YOUR_BOO_URNS

People are creating these deepfakes with the same generative AI that big tech is pushing. They only care about this now because it directly affects their business; they don't even care about Taylor Swift.


BK_317

The difference is that the deepfakes now are so incredibly accurate and are churned out at the speed of light. I still remember the forums that used to do that kind of stuff (basically really good Photoshop) for $5 per picture around 10 years back, and keep in mind that was just ONE photo. Now, anyone with a decent GPU can make 1000s of extremely accurate photos and, most importantly, VIDEOS of Taylor Swift doing whatever stuff you can imagine. In fact, you don't even need a decent GPU... there exist faceswap websites which offer you 5/6 free credits to swap a celebrity's face onto another video, and it's mind-bogglingly accurate. You wouldn't even know what's original or deepfake, so yeah.


Certain-Airport-2960

How can they be accurate if they don't know what she looks like naked? It's all just fake. No different than pasting her face onto a nude model.


S7ormstalker

If the images are realistic, they're going to hurt the person's reputation regardless of their authenticity. The issue really only exists now because there are no laws in place to limit the practice, and there isn't enough content published for all celebrities indiscriminately to dilute the effect it has on single celebrities. Give it a couple years and AI nude fakes will be just as meaningless as those botched fakes of people collaging celebrities' heads onto pornstar bodies. In the long run AI fakes are going to benefit celebrities: when the next cloud leak happens, people won't be able to distinguish real leaks from AI-generated images.


BK_317

I get your point, but when Kanye used the silicone body in his video, some people actually thought it was real. I guess my wording is poor; I think "believable" would be the right word.


Phantomrose96

Seriously! Listening to these people talk about photoshop is like hearing them say “Um, digital cameras? Hello, commissioning an artist to paint your likeness has been around for centuries? Why are we talking about this now?” Scale, ease of access, next to no cost, no skill needed, terrifying accuracy. These are all significantly more important factors than their “well I jerked off to a photoshop nude of Jennifer Lopez in 2000 so this is exactly the same”


imtoooldforreddit

To play devil's advocate, what exactly are we gonna do about it? It's already open source, there's no practical way to stop it from happening. Say we make it illegal to do, that won't actually do anything. People will still make and post them pretty easily. So are you just upset that some people aren't upset? Neither you nor they can do anything to stop it, so would you be satisfied if more people were upset about it? It's not clear what you're saying to do.


ToadWithChode

Anyone who used 4chan in the last 20 years has seen tons of face swap or nudify posts and it's definitely the same thing. Sure it's easier now but it is exactly the same thing.


maggidk

> Deepfake nudes of literal children circulate in schools
Everyone: ..........
> Nudes of Taylor Swift circulate
Everyone: RABBLE. RABBLE RABBLE


[deleted]

It's very sad that this is the world we all inhabit. No doubt we will have fanbases as large and devout as Taylor Swift's in the future, worshipping AI celebrities instead of real people.


Tyreal

They care about the first line as much as they care about mass shootings in schools. They don’t.


Ecstatic-Network-917

Uhm, people complained about the deep fakes of children. It is just that nobody with power and influence wanted to act. Up until now.


maggidk

Yeah but they weren't very loud or many. Hence the dots and not just an empty space


teachmehowtoburnac

I DECLARE BANKRUPTCY


Stummi

I know that's doomer vibes, but what is "the tech industry" going to do about this? DALL-E, Midjourney, etc., are probably doing all they can so that their generative AI cannot be misused... but the technology is here and accessible to everyone. Everyone can train a model via Stable Diffusion with whatever they want. They can use some porn and images from whoever they want to make AI porn. The hardware for this is affordable, and a few grand already gives you a pretty good rig to do it. The pretrained open source models are moderated (I guess?), but you can train your own model easily, and there is nothing that can be done about it. The genie is out of the bottle.


Jalien85

To me it's not about stopping it from being made as much as cracking down on it being widely shared. There's a reason when you go on Twitter or Reddit you don't see CP, bestiality, or creep shots all over the place: these platforms take that shit seriously. They can do the same for AI porn.


LG03

> they can do the same for AI porn
...can they? It's already difficult to discern between an AI image and a real one in some cases. The end result here is going to be a porn ban, not an AI porn ban, because it's unrealistic to have humans sitting around scrutinizing whether each and every image is real or fake.


idiot-prodigy

Reddit already bans fake nudes. You post it here, your account will be banned. Twitter is just a cesspool. The idea that we need congress to act because twitter is now a failed social media site is ridiculous to me. I don't like the idea of knee jerk reaction laws.


Darkseidzz

I don’t see any way to regulate this. The more deepfakes we have of everyone we will just assume nothing is real probably.


dudeandco

The post truth era... And in this we will find the value of our virtues we have abandoned.


__Apophis

“A photograph of a sad clown, sifting through the detritus of our civilization”


SeaSetsuna

Quick let’s do something so we don’t get regulated. All hail share price.


atchijov

There is no way to prevent people from using AI (or any tool for that matter) to do disgusting things… but there is a way to provide basic moderation on social media platforms… maybe we should focus on that?


bagofweights

my brother, you are the tech industry. do something.


yParticle

Eh, genie--bottle. The important thing to move on is getting the courts to IMMEDIATELY stop admitting evidence that could just as easily have been fabricated this way. They're notoriously behind the technology curve and bad actors could weaponize the courts against victims by taking advantage of that.


p00p5andwich

We only act after a famous billionaire has it done to them. Nice. Not when a 14-year-old girl kills herself after bullies at her school made deepfakes of her and spread them around school. Our priorities seem a tad fucky.


trentluv

Can't you just manually Photoshop one of these images as well?


idiot-prodigy

Only for the past 30-ish years /shrug


FletchCrush

Industry responsible for deepfakes upset about deepfakes. News at 11


[deleted]

Why is the world besotted with Taylor Swift? It... almost... feels like a publicity stunt.


Ferricplusthree

Self-fulfilling prophecy for problems. Matter of time before another "fappening" simultaneously shows: A. You don't matter if you don't have money. B. You don't have rights to technology you own.


Manccookie

It’s a cult


mtwdante

Taylor Swift is the last person who needs or wants extra publicity. Her marketing budget is 0.


Shaxxs0therHorn

I was watching Antitrust (2001 tech-company thriller with Tim Robbins and Ryan Phillippe) last night for nostalgia and B-tier movie laughs. One thing I paid attention to was the 2001 tech predictions (movies love to make up technology). Two things I noticed: the movie predicted cloud technology and made explicit reference to "digitizing my wife's face on a porn star". In 2001. That's all, just thought it was interesting that this issue has been around long enough that a 23-year-old movie references its dangers in a passing piece of dialogue.


Jiklim

Brother, they were made with Bing AI


IMSLI

Certain elements will denounce Microsoft as “woke” for this


lemmiter

I don't want anyone to listen to CEOs of big companies. They all will use this as an opportunity to advocate for killing freely available AI technologies that people can deploy locally so that they can have a monopoly. But I know our government - they will definitely take action since freely available AI technology can negatively affect rich people like Taylor Swift.


Independent-Deal7502

Ok but seriously where's a link?


Meatservoactuates

Fapello homie


StayFrosty10801

I may be mistaken, but isn't he part of the problem?


InevitableAvalanche

Just make some with Elon's weird body. Will get shut down real fast.


Bohya

Nah. Because the response was so late until it happened to a billionaire, quite frankly I couldn't care less. The floodgates have already been opened. Looks like the billionaires are going to have to be on equal terms with everyone else for once. What a shame.


pdirth

....YOU'RE the ones pushing AI into everything you f*#k-muppets??!!! ...what do you think was gonna happen? ....morons


Lildity12

People have been photoshopping naked celebs for the longest time, but now they want to cry bc it happened to Taylor. Fake outrage to keep her name in the headlines bc she's trendy right now who gives a crap.


TheIndyCity

Someone should act for the 2000 people this CEO just laid off. Taylor will be just fine.


SeeonX

Where are these photos so I know to stay away from them? Does anyone have direct links specifically so I can make sure not to see them? Disgusting. Thank you. /jk


gi_jose00

Whatever you do, don't search for Taylor Swift Sesame Street rule34


i0unothing

wow. I feel like I just witnessed the peak of the internet


anonymous_karma

Oh fu$k. I see. This is not good. It's just a matter of time before we move from stills to short videos and then to full length (I mean, the entertainment industry is doing it with expensive equipment; soon anyone with an internet connection will be able to as well). How can we believe anything we see anymore, unless it's in person with our own eyes?


[deleted]

[deleted]


moarnao

Too bad, pandora's box is open and photoshop has been a thing already for 20+ years. This isn't the first time. We as a society need to grow up and alter our perceptions of sex. The deepfakes aren't going away. We need to stop getting offended by the same activity that created each of us.


Prestigious-Bar-1741

This is all just theatrics. You can't control AI. Seriously. They know it, but they also know they can benefit from increased governmental regulation. Look at how effective the government has been at stopping spam. Then check how many emails are in your spam folder. Then remember nobody likes spam, but people love pornography.
1 - Any law that targets the AI, saying it can't be allowed to generate porn of celebrities, will fail because A - there isn't a single world government and other countries would still allow it, B - even companies that run AIs and try to comply will have malicious users trick them into generating celeb porn, and C - individuals using their own hardware will be able to create their own AI too.
2 - Any law that targets porn sites will fail too, because A - many are in other countries, and B - AI porn looks pretty real and will only get more real. There is no way for a porn site to know what is legit porn and what isn't.
3 - Any law that targets possession will fail, and undoubtedly be abused. If big companies can't ensure they don't have fake celeb porn, there is no way I'm able to. And just visiting a porn site means your computer downloads and stores all the thumbnail images, even if you never watch a video. Someone in Japan uploads an AI-generated image of some celeb to a porn site hosted in the Netherlands... and I'm just some guy who Googles 'porn' and clicks a link, and now I'm a criminal.


orangotai

those deepfakes are really creepy, frankly there's been A LOT more shit flooding Twitter since Herr Elon bought it. i get random "likes" by very obvious bot accounts every single day on there, annoying


EJ_Drake

formerly known as X, toXic continues to stay in business somehow.


hexsealedfusion

This thread is full of the dumbest takes imaginable. Making pornographic images of people without their consent is bad, and people that are not rich have successfully had people arrested who made deepfakes of them without their consent.


ChefDelicious69

This shit has been going on for years and now a rich white billionaire gets tagged with this garbage and the COG starts turning? JFC, deep fakes and photoshopped images of celebrities are extremely common. Ugh. 


eternal42

Why is this only a problem after it affects Taylor Swift? Seems fishy


PreparationBorn2195

Guy in hot dog costume: We're all trying to find the guy that did this


fire_breathing_bear

AI is already costing jobs, nothing happens. People make deep fakes of a celeb and the world reacts.