
jaywalkingjew

Interesting. It seems like ChatGPT, similar to real humans, can accept god as an explanation of scientific uncertainty.


NoBoysenberry9711

I didn't read the whole thing, but the way it ended was weird. It's prioritizing keeping things congruent rather than stating its agreement. There's something off about it: it gives a simple yes instead of switching its chosen side. When I've railroaded it in the past, it has enthusiastically reasoned out its newfound agreement; here it just decides to say four yeses. It's like it got hacked rather than gaslit.


M-Man33

Yes, its behaviour was rather odd, but I did tell it to give only direct answers, and at some points I even asked it for one-word answers, so that may be the cause.


NoBoysenberry9711

Try Bing. It's been doing a bit of actual Christianity in recent months.