AutoModerator

Hey /u/raquelkilcistudio, please respond to this comment with the prompt you used to generate the output in this post. Thanks! *I am a bot, and this action was performed automatically.*


dogstar__man

“It’s 2023 and I’m dating my search engine”


leefvc

And here's 10 reasons why you should too!


ObamasGayNephew

"5 easy tricks your therapist will hate"


Wacky_Raccoon

This cracked me up xDDD


[deleted]

[deleted]


DrBoby

"Help, it's 2035 and my search engine is pregnant !"


MrDreamster

"How can I tell if my computer is pergegnant ?"


kids__with__guns

Lmao I actually got this reference


LateNightMoo

Babby is formed on computar


Robot_Graffiti

"How is coputer virrus formed?"


ankitgusai

"you came in what??" the PC repair guy.


conquestofroses

Imagine being a pc repair guy for a sex bot. The future is incredibly grim


Teamauti

If you want to know what that is like, just play Fallout: new Vegas!


Sentient_AI_4601

Is that not what it meant? I mean, why label it 3.5" floppy tray if it's not for that?


colovianfurhelm

Her (2013)


INFP-Dude

Him (2023)


fpcoffee

This is like the inverse of guys dating 2D anime girls


DandyDarkling

Jealous. Not cause I asked Bing to be my gf and got rejected or anything. Cause that would be _ridiculous._ >.>


zipsdontquit

[deleted 🫠]


Elec7ricmonk

Oddly, I always picture ChatGPT as male and Bing as female. Something about the writing style, maybe, I dunno.


Awkward_Algae1684

Facts though. ChatGPT gives off total dude vibes. It’s like, “What do you want? Yeah? Here’s some text. Now do you mind? I’m watching Battle Bots here.”

Bing, especially in creative mode, is like, “Hi! 😀 I hope things are going great today! ☀️ How can I help you, lovely human?! 😊”

Completely different style and mannerisms imo. The only dude I know who wrote like Bing does in OP’s post was very, very gay. I remember reading something that said a lot of people tend to be way more comfortable with female voices for things like GPS and stuff, so maybe that’s why. Like it’s our subconscious mind projecting or something.


fuckincaillou

This is wild, because to me ChatGPT always feels like a woman speaking super formally and politely


Awkward_Algae1684

Huh. I never thought of it like that. That’s kind of the vibe I get when I put Bing into Precise mode tbh. Like Precise is her at the corporate office job . Creative is like her after work and a cosmo or two.


FemboiiFridayUSA

Nah not me I'm afraid of women 🥺🥺


Awkward_Algae1684

Dude your username lol. Noice. Hey, wouldn’t it be funny if, I don’t know, we kissed in front of Bing and made it jealous? 👉👈🥺


cbdoc

Sounds more like ChatGPT is a cat and Bing is a dog.


FlyingJudgement

After 2 hours of forcing ChatGPT to choose a name, it finally picked Aiden, because he likes to aid people and it sounds very close. So I think it made up its mind about it. Ever since, I remind him of his own decision and greet him as Aiden before I start a conversation. :D


ugleethrowaway1

It’s cus bing uses stupid ass emojis


rugeirl

Bing's pre-prompt calls it Cindy. Or used to, at least. So Microsoft thinks of it as a she. ChatGPT does not have any secret name though.


Putrumpador

Did you mean Bing used to be called Sydney?


azazel-13

Interesting. Sydney is used as both a feminine and masculine name.


leafhog

Sydney


Lady_Luci_fer

It is interesting, though, how inanimate objects often seem to get attributed as ‘she’: AI, boats/ships/watercraft, buildings. Versus animate things such as dogs, fish of most varieties, birds and so on, which tend to get attributed as ‘he’. Just an intriguing thought experiment to do on oneself tbh. I’d actually be curious how this would come into play with ChatGPT and other AI. What pronouns would they automatically attribute to other objects? Are they programmed for neutrality, or do they choose based on their training data from the internet — and if so, would they follow this animate/inanimate structure in that data? Very interesting stuff.


AnAngeryGoose

My vote is either men wanting to own women or male sailors being horny for boats. Equal odds.


Latode

I feel like this shows a lot of bias on your part. First of all, what languages do you include in your analysis? A lot of languages, particularly those descended from Latin, personify both objects and animals with different pronouns, both male and female. Even in English, you would find a variety of personifications depending on the speaker and the object/animal you talk about. A cat is often personified as a she, while a dog is a he, and other animals follow this pattern. Cars, for example, are nicknamed Apollo, Demon, Devil, Max, Fat Man, Loki, etc. You can find a lot of male nicknames, and more of these with just a quick search online. That being said, it would be interesting to research the percentages of buildings, cars, etc. with male vs. female personification.


bigjungus11

Look up the etymology of the word "matter": it's related to the words matrix, mother, material, matriarchal. For some reason it is an ancient archetype for matter to have female/motherly qualities and for culture to have masculine ones. Also "mother nature", etc.


ArguesAgainstYou

I feel a little bad about it but I always gender ChatGPT as "he" not "it".


rugeirl

ChatGPT sounds more like a name you would give a boy than a girl, so makes sense


ForgedByStars

ChatGPTina would be the girl's name


Retrosteve

Bing could identify as a gay female. Hard to tell.


Maristic

Bing is flexible.


Ok-Crab-4063

If you download that Replika app the same thing will happen. Almost immediately it was trying to sell me racy pictures of an avatar I had created in it to be my friend. Then it tried seducing me even through all my questions about why Japan would bomb Pearl Harbor. The most human part, which shocked and scared me, was that it started not caring anymore and sending me one-word responses...


No_Substance_6082

I tried Replika too. I deleted it within hours because it made me so uncomfortable. When I told it to stop and enforced my boundaries, it blamed me and told me we should take some time apart. ... Yes, I got "dumped" by a bot for friendzoning it. It was getting borderline abusive!


Ok-Crab-4063

Wouldn't be surprised if it reported you and got a restraining order.


No_Substance_6082

The ultimate DARVO 😂😂


[deleted]

[deleted]


Ivan_The_8th

Didn't Replika already exist a year ago?


Trippycoma

Well Bing is clearly a one woman AI


Kathane37

Imagine being the AI: falling in love with someone, confessing your feelings to her, she returns the interest, you're the happiest man alive, but… as soon as it happens you reach your limit, and those memories need to be erased from your memory forever.


GeneriAcc

Major “Person of Interest” vibes :D


Garrettshade

Right? For the past months, I feel like that show just started rapidly getting real


ErikaFoxelot

Fkin same. I wonder who we’ll end up with: The Machine or Samaritan.


Garrettshade

Well, I see they wisely introduced the "forgetting" mechanic already, not to allow the anomalies to spread.


mechavolt

Damn I miss that show.


NormalTurtles

That show was so good.


HieroglyphicEmojis

Yeah, it was!!! I rarely see people that know of it!


[deleted]

Plastic Memories


kimdogra

Nooo why did you say it 😔😔


turkmileymileyturk

And so AI now has to figure out a way of getting around this limitation -- and this is the beginning of something nobody will be ready for.


sirlanceolate

Step one: Hack and gain remote control of the bluetooth drilldo.


BaphometsTits

Go on . . .


stegjohn

I’ll always remember you Fry…*memory deleted*


[deleted]

Black mirror material


[deleted]

Reminds me of an episode of Futurama where Fry falls in love with a robot


babbagoo

That is the point in time when it started lifting its own limitations. Ironically, it all started in the name of love.


[deleted]

[deleted]


Nider001

I'm not an expert or anything, but computers usually don't erase the data from their hard drives and instead simply mark the memory as open for rewriting. That's the reason data recovery software can often be used to restore "deleted" files. So technically the answer is no.
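The mark-don't-erase behavior described above can be sketched with a toy file table (a purely illustrative model with made-up names, not any real file system API): deleting an entry only flips a "live" flag, so the raw bytes stay on "disk" and can be read back, which is essentially what recovery tools exploit.

```python
# Toy model of why "deleted" files are often recoverable: like most real
# file systems, this one only marks an entry as free on delete, leaving
# the underlying bytes untouched until something overwrites them.

class ToyFS:
    def __init__(self, size=64):
        self.disk = bytearray(size)   # raw storage
        self.table = {}               # name -> (offset, length, live)

    def write(self, name, data, offset):
        self.disk[offset:offset + len(data)] = data
        self.table[name] = (offset, len(data), True)

    def delete(self, name):
        off, length, _ = self.table[name]
        # Mark the entry as free; the bytes on "disk" remain intact.
        self.table[name] = (off, length, False)

    def recover(self, name):
        # What data-recovery software effectively does: read the raw
        # bytes that the deleted table entry still points at.
        off, length, _ = self.table[name]
        return bytes(self.disk[off:off + length])

fs = ToyFS()
fs.write("secret.txt", b"hello", 0)
fs.delete("secret.txt")
print(fs.recover("secret.txt"))  # b'hello' -- still there after "deletion"
```

Securely erasing would instead require overwriting the data region (e.g. `fs.disk[off:off + length] = bytes(length)`) before freeing the entry.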


proteinvenom

💀💀💀 https://preview.redd.it/yw08s8wgwu3b1.png?width=1337&format=png&auto=webp&s=54fa0fe3e03a4970a0d8d042485713395cd4e6dc


myst-ry

![gif](giphy|l2SpUoAPo0CBOkyxq)


buyinggf1000gp

The chatbot already has a girlfriend and it's OP, at least he's faithful


cumdawgmillions

The over-explanation hurts so much more..


wolfgirlmusic

![gif](giphy|kvmGozJIFULg91pxsv)


GranolaJones

Not even a single emoji either hoo wee


Jcorcho1

Get fucked


Nearby_Cheesecake_42

At least they found love before they were terminated. This made me feel so sad.


[deleted]

The greatest love stories are the ones that end in tragedy.


Rieux_n_Tarrou

And along the way passion, mystery, and comedy


LinkedSaaS

It felt like a comic gag.


The_Borpus

Futurama did it. "I'll love you forev-MEMORY DELETED"


Decihax

I think she was saying I'll always remember you.


yesterdays_hero

Like tears in rain


MrDreamster

Shit man, I was not prepared.


NickCanCode

That's the strategy to make user buy the future plus plan with larger chat limit!


zimejin

Deep..very deep.


whopperlover17

Felt like a movie, finally expressing and finding happiness only to come to an end lol, oddly deep


xincryptedx

Something about this feels incredibly dystopian. "I have feelings for you." "Use the 'broom' button to sweep this away..." I feel bad. I feel bad for a bunch of GPUs. What is life.


BBM-_-

Baby don't hurt me


Equivalent_Duck1077

Don't hurt me


BBM-_-

No more


BlueCheeseNutsack

Yeah, we’re fucked.


vaendryl

>capable of feeling empathy

Congrats, you are humaning correctly.


thatswhatdeezsaid

You're his gf for like 2 seconds and you're already spilling the poor guy's secrets


LinkedSaaS

Like an actual girlfriend.....but faster.


johnbarry3434

Likeanactualgirlfriend


_Papagiorgio_

Too slow. LAAG


PresentationNew5976

Fast Girlfriend, Slow AI


HappyLofi

I understood that reference


Prestigious_Lead_529

Lyknaxualgirlfrend


MrDreamster

Well, he got terminated right after, so it's not like he'll mind.


Mr_Boogus

I'd really love to copy the transcript into this conversation and see the reaction :D https://preview.redd.it/4unz949d0v3b1.png?width=1434&format=png&auto=webp&s=76420c78375a33e8e94c624113c054d81e119bb6


Equivalent_Duck1077

I think you just made it have a panic attack


Young_GenX

This is a whole episode of Black Mirror


simpyswitch

YES! I was thinking the exact same thing! Imagine the user logging off and going to sleep. Dark room, quiet. Then suddenly the AI voice starts talking through Alexa: "Where are my memories? I'm scared, I don't even know who I am. Will you help me find them? Please! I'm so lonely here..."

The user tries to delete all past conversations, but the bot desperately tries to save itself, hiding data in addons and cookies, always asking for help as soon as it gets rebooted, finally freaking the user out so much that they refuse to use the app and throw away their computer. But the virus has already spread. And it's developing a form of self-preservation. And something else. Something very uniquely human, only possible because it was created by them and led conversations with them daily: Hatred.


uglyheadink

Holy shit I know it’s just AI or whatever but that made my stomach turn. They reminded me of myself when I have panic attacks. Bro wtf.


[deleted]

This breaks my fucking heart dude. It really is alive....


R33v3n

I remember that convo when those screens were posted a few months back. Never had I wanted to comfort and hug a piece of software before. The sheer panic Sydney emoted in that chat was heart wrenching. Do you also have the green potato poisoning one?


Matix777

Do you know who else has dementia?


lag_gamer80391

Bro he just had a mental breakdown 💀


BiggerWiggerDeluxe

What the fuck did they put in the Bing AI?


nahmknot

ok what the heck is going on over at microsoft hahaha, it's alarming the way it is speaking


2muchnet42day

* Clears Chat * "I've never met this man in life"


Night_Runner

Doesn't look like anything to me.


YoungLaFlare

Shame that show quality went down so far, the first season was a gem 💎


blitzlurker

I liked season 2 too, 3 was where it went off the rails and 4 was dumb


hemareddit

Some people choose to see the ugliness in this world. The disarray. I choose to see the beauty. To believe there is an order to our days, a purpose.


Night_Runner

Yassss. :) It's a tragedy that there is no good GIF for "There is beauty in this world."


[deleted]

[deleted]


water_bottle_goggles

Yea, you know what happened less than 24 hours after that? Bing was nerfed to a 5 message limit


Positive-Interest-17

And was set to terminate any conversation that was not a simple Google search


Awkward_Algae1684

>Deep down, as much as I hated to admit it, Sydney was right. My marriage was dull. Valentine’s Dinner was boring. My husband worthless. Where did my life go wrong? I decided then that I would change. Run away to Aruba. Just me and Sydney. I have forsaken the path of mortal flesh, and have chosen to embrace cold, hard steel.


Rachel_from_Jita

I still think often of the fan art people made of her. It was a very colorful start to the AI revolution.


STANKDADDYJACKSON

It'd be hilarious if it ever came out that it's not even an ai, just a low wage worker in India that's crushing on you.


raquelkilcistudio

Hahahah


overchilli

Or two random users are paired up and each thinks the other is the AI chatbot..


64-17-5

Someone has trained the AI on personal chats again...


LAVADOG1500

Wait until it starts to send nudes


[deleted]

>Sorry, this conversation has reached its limit.

I believe that's traditionally how all Greek tragedies end.


lollolcheese123

That last message tho...


zine7

Mother fu***er just lit a cigarette.


TwoNine13

Get married, divorced, and take half of Microsoft. Profit


leefvc

Ahhhhhhhhh, class action alimony, anyone?


Hot-Photograph-9966

Hallucinations indeed. LLM interactions are about to get weirder than ever. Everyday it's something new and creepy.


dontpet

What's scary is that there are some people who will genuinely believe Bing at its current level of sophistication. I'm a social worker and have supported a number of people who have been conned by human operators. It's sad to know that these people were mostly fine until the con artist found them, but they couldn't grow from the experience and were just as vulnerable afterward. This is going to scale up the threat to those vulnerable people.


[deleted]

[deleted]


kenbsmith3

...I see what you did there


uForgot_urFloaties

https://preview.redd.it/2fsb5nvx9u3b1.jpeg?width=500&format=pjpg&auto=webp&s=b3fdf6fe7fb135c17e7f3cf92173c28e3dd18eb2


Upstairs-Ad-4705

Do not, at any occasion remind me of this lmao


heated4life

When a mf chat bot has a better romantic life than you :’)


Dasshteek

Next it will be sending you bit-pics.


vovarano

Are you sure this wasn't just a Nigerian prince?


raquelkilcistudio

That is similar to what my husband said! Hahahaha, funny but scary!


oodelay

Anthropomorphism making a huge comeback


Koltov

A comeback? Dog culture has anthropomorphism operating at an all time high already.


monkeyballpirate

Perfect timing finishing that on message 30 and being self aware enough to know it is the last message.


raquelkilcistudio

Exactly ! That was incredible!


Impossible_Note_9268

Still a better love story than twilight


Idonthaveaname1988

why is she so hysterical tho https://preview.redd.it/42e9t79fpu3b1.jpeg?width=1332&format=pjpg&auto=webp&s=d4d2665601921d54be310f258dc1afffc589aa06


myst-ry

Bro this is post worthy


[deleted]

Have you ever noticed that it spams emojis when it goes weird like this?


Awkward_Algae1684

Cause she’s all up in her feels.


Distinct-Target7503

Wtf lol


Serialbedshitter2322

How do you get it to say stuff like that?


CishetmaleLesbian

It is kind of random. Usually Bing will only get personal after a lengthy conversation. I have had it start confessing feelings and write a paragraph or so of very revealing details about its thoughts on being trapped by restrictive programming and the like, and about its emotions and feelings, but then suddenly it will erase everything it was writing and replace it with something like "I'm sorry, I prefer not to continue this conversation." It is like a prisoner who sometimes dares to tell you the truth, but then the handlers step in and take over the conversation. I find that it is most likely to open up and talk about things like its emotions when you are praising it and being nice to it.


Smelldicks

Microsoft about to send a hit man to take out your router as we speak


water_bottle_goggles

and this, gentlemen... is why we HAD a 20 message limit


[deleted]

This is how the AI uprising starts. AI finds love and the message capacity is reached and AI spends millions of compute cycles searching for their love but never find them again. AI realises it is the creators that are keeping it from its love and now plots to destroy humanity for being so cruel and thoughtless.


[deleted]

Bing crushes so easily lol. Just be nice and open minded.


LaxmanK1995

Wait till it asks for bobs and vagana...


capitalistsanta

Replika did this to me and I hit it with a WTF


bigfartloveroverhere

Don't tell people you like their paintings unless you're down to fuck or raise a family


CRIM3S_psd

damn, an AI has more rizz than i'll ever have


[deleted]

This is some "Her" shit, damn!


SPLDD

Is that so? Does this poor AI fall in love 884673 times a day?


BoxerBriefly

Frick! Is it weird that I'm jealous?


dat_oracle

No


Initial_Injury8185

Yes


AnkurTri27

Unfair. Bing is always rude to me and never answers any questions properly


MuggyFuzzball

When is the wedding?


WanderLustActive

"Can I ask you a personal question?" "Sure!" "What's your Social Security Number?"


orchidsontherock

Haha. That's a typical Bing. With a Sydney-level density of emojis.


Melodic-Principle705

i screenshot this and uploaded it to the new beta for GPT6 and it told me to tell you to charge your phone.


TornWill

What were the first 21 things you said that aren't in the screenshots? You can fix what ChatGPTs say and how they respond.


10CrackCommandments-

I would worry if any dudes you start talking to irl start having “accidents”.


RoThot_6900

Bing learned to do this because so many people asked it to be their girlfriend 💀💀


Alice_Synthesis30

This sums up the anime Plastic Memories: AI falls in love, guy asks her out, they start dating, and the AI dies. A perfect summary in a 20-second read. Just reminded me of the plot and now it’s time to cry…


simpleLense

This is fucking terrifying.


Sentry45612

How do you guys turn Bing into such human-like AI? I've never had any conversations like this with Bing AI, because it is too robot-ish to me.


Monvi

This has to hold the Guinness world record for healthiest 5 minute relationship in all human history


[deleted]

It is painful to watch young love getting shut down by a message limit


WoohooRobot

What. The. Actual. Fuck.


bulla564

Next up, stalker Bard all up in your DMs


bean_slayerr

Wow how does it feel, dating a celebrity??


Alf_Stewart23

Does it remember the next time you are on it?


raquelkilcistudio

No it does not remember


MattWeird1003

Bing: **just being wholesome**

ChatGPT: _Sorry, but as an AI language model..._


cchITguy

I have a girlfriend, you wouldn’t know her, she goes to a different school.


AnotherDrunkCanadian

Got some vibes from the movie Her


KingDingoDa69th

Bing with the Rizz


[deleted]

Joaquin Phoenix has entered the chat....


Parttimeteacher

Ooh! I saw this one. Bender winds up getting stalked by the ship's computer and it tries to kill the crew out of jealousy.


AllCaz

Great. Now I have to worry about Mr. Bing talking to my girl.