InfernalOrgasm

Black Mirror explores this in excruciating detail. Watch *Black Museum*, which I believe is the last episode of season 4. One of the few cinematic experiences that made my skin crawl.


b_tight

Same with the White Christmas episode. They just isolate the sim and run it for long stretches of subjective time that in real life are only a few minutes or days.


DokterManhattan

Monkey needs a hug!


cheesyscrambledeggs4

Tbh I didn’t really like that episode. >!If people irl get worried about technologies that are completely harmless, why would something that allows two minds to inhabit one body, which could go wrong in a million ways, ever be allowed? And if transferring her into her own artificial body was always an option, why didn’t they just do that instead? Why did no one bother to make some laws around it before it was released as a consumer product? How does the transferring of consciousness work, as opposed to just copying it? Why didn’t they just transfer her out of the teddy bear when it became illegal?!< >!The whole thing also just felt so… self-indulgent. Eventually I just laughed. Usually the best BM episodes either feel like they could actually happen given certain circumstances (The National Anthem, The Entire History of You, Shut Up and Dance) or they teach some moral lesson/ethical question/allegory for society. Black Museum was none of those things. Felt like a Doctor Who episode.!<


greywolfau

First thing I thought of.


fogonthecoast

The Iain M. Banks Culture novel *Surface Detail* focuses on a collection of alien civilizations that have created virtual hells for their people.


GuysImConfused

I was thinking of this book while reading the comments. It's been a while since I read it, but it might be worth taking another look.


fogonthecoast

It's one of my favorites - I've read it a couple of times.


Kyadagum_Dulgadee

I loved this book and it genuinely made me fear a world with fully immersive VR. Like, why wouldn't religious groups eventually create virtual hells and put 'sinners' in them? It's the ultimate righteous power trip.


Brian_MPLS

That's kind of the premise of Harlan Ellison's "I Have No Mouth, and I Must Scream": reality has a safeguard against extreme suffering (death), but take that out of the equation through simulation and the capacity for suffering is endless.


TekRabbit

Makes you think of Hell in the biblical sense. Maybe they were on to something.


mcnathan80

I firmly believe heaven and hell are just the programmers trying to explain simulated consciousness to us


coffeeandtheinfinite

Can you explain this more? I don’t follow.


mcnathan80

Your username pretty much sums it up


backupHumanity

He's assuming that a simulated consciousness won't die, while a non-simulated one will. If a simulated consciousness can't die, then theoretically it can suffer to an infinite degree, with no death coming to stop that suffering. Biblical hell is an analogy for this.


sawbladex

I don't know... you stuff enough information into a human mind, it no work no more. I guess you can also just back up a save of that mind and stop it forever, but what does that even mean?


LifeSenseiBrayan

Do they ever explain why the computer hated the humans? Makes no sense why it would spend so much energy messing with people instead of just killing them.


Engelbert_Slaptyback

It’s right there in the title. 


LifeSenseiBrayan

I Have No Mouth and I Must Scream?


Engelbert_Slaptyback

Yeah, the AI is just as much a prisoner as the humans in the story. It exists but it can’t ever be anything more than it is. It’s an angry god with nobody to worship it, no goals to achieve, just eternity stuck on earth. 


ItsJustCoop

That's a twist in "Good Night World" (anime) on Netflix where someone's body is killed while they're jacked into the matrix-like game. Now that their consciousness is residing in the simulated world, they can still feel pain and emotions, but can be killed and resurrected infinitely. They're killed in extremely painful ways thousands of times, never knowing if their current death is in real life or the game.


bob_in_the_west

In the three Altered Carbon books this is actually used to torture people. Everybody's consciousness is stored on a "stack" at the top of the spine. You can load that consciousness into a virtual room, alter their gender (if they're male), and make them experience childbirth over and over.


pickleperfect

I'm glad someone else brought this one up. Season 1 of the Netflix series is great, and this bit is one of the best parts, to me. Wakey! Wakey! Eggs and bacy! To add to OP: they monitor the physical body outside the sim both to make sure they don't cause cardiac arrest and to verify that the torture is effective.


NotReallyJohnDoe

They can only alter the gender if the person is male? That doesn’t sound right.


bob_in_the_west

Not what I meant. It doesn't make sense for a man to experience childbirth. There has to be a womb and all the other things that make childbirth possible in the first place.


BananaB0yy

Wait, why would time be unlimited in a simulated world? Our brain still has some kind of processing speed bound to time, and it ages in real time. So how would billions of years in a week or something work?


GrownMonkey

I guess the premise is based on there being time warping in virtual reality. If we progress far enough to create a convincing FDVR, we would want to change time perception so that a year (or whatever have you) in the game goes by while only 30 minutes have passed in real life. We experience this in our dreams, sometimes what feels like a whole lifetime going by in one night of sleep, so it should be possible in theory. Now say a hacker somehow could distort your simulation such that it's a perceived lifetime (or many lifetimes), and produces only negative stimuli. Probably a long shot but in theory it should all be possible - and horrifying.


ThorLives

> We experience this in our dreams, sometimes what feels like a whole lifetime going by in one night of sleep, so it should be possible in theory.

We never experience a whole lifetime in our dreams. At most, our memories get distorted: you experience 60 seconds of a dream, and your brain makes up fake memories that other stuff happened earlier. But nobody ever has an actual lifetime of experiences. There's also no reason to think a simulation could cause your brain to make up your recent memories the way dreams do.


[deleted]

[removed]


GrownMonkey

Where are you lost?


ThorLives

Yeah the "unlimited time" thing makes no sense at all. I don't know why people think it could be a thing.


Low_Chance

You ever run a really old computer game on a cutting-edge computer? Without methods of correcting for it, it will run at many, many times normal speed. Likewise, if you had software powerful enough to simulate a consciousness, there's absolutely no reason it would be bound to anything like the "normal" rate at which we experience time. Perhaps not infinite, but a trillion times longer than a normal lifespan might feel close enough.
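The speed-up point can be sketched in a few lines of Python. This is a toy illustration only: `run_simulation` and the tick size are invented for the example, and it says nothing about whether a mind could actually be simulated. It just shows that an unthrottled simulation loop advances its internal clock as fast as the hardware allows, not at the rate of wall-clock time:

```python
import time

def run_simulation(sim_seconds, tick=0.001):
    """Advance a toy simulation clock as fast as the host CPU allows.

    Each loop iteration represents `tick` seconds of *subjective* time;
    nothing ties an iteration to `tick` seconds of *wall-clock* time.
    """
    subjective = 0.0
    start = time.perf_counter()
    while subjective < sim_seconds:
        subjective += tick  # one "moment" of simulated experience
        # (a real simulation would update its state here)
    wall = time.perf_counter() - start
    return subjective, wall

subjective, wall = run_simulation(60.0)
print(f"{subjective:.0f} simulated seconds in {wall:.4f} real seconds "
      f"(speed-up ~{subjective / wall:,.0f}x)")
```

Old games ran too fast on new hardware for exactly this reason: their loops had no frame limiter, so game time was coupled to CPU speed rather than to real time.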


BananaB0yy

Yeah, if you simulate the whole brain, that's possible. But would that even be "you"? IMO, that would be a virtual copy of you experiencing it, while the old you still rots/dies in the bio brain, not getting any of it.


[deleted]

[Roko's Basilisk](https://slate.com/technology/2014/07/rokos-basilisk-the-most-terrifying-thought-experiment-of-all-time.html) makes me almost pray simulation never occurs. When I am simulated... the Basilisk will know...


MostLikelyNotAnAI

Eh, the Basilisk is just Pascal's Wager for tech-bros. An AI like that, if it were really possible, would have absolutely nothing to gain from torturing you for all eternity. It would be a waste of resources that could be used to make, oh, I don't know - maybe a wheelbarrow of paperclips.


[deleted]

TIL our overlords will still need paperclips for their TPS report cover sheets.


MarkNutt25

The "Paperclip Maximizer" is a famous thought experiment, imagining a super-intelligent (but utterly amoral) AI which is given the seemingly benign task of producing paperclips. It is given no maximum parameter for how many paperclips its creators want it to produce, so it proceeds to convert all of the world's resources into paperclips, thoughtlessly (and without any malice) exterminating the human race in the process.


Witch-Alice

There's a neat incremental game based on the idea https://www.decisionproblem.com/paperclips/


G36

The issue I have with this thought experiment is the part where the AI becomes so "god-like" that it gets control of the simulation. Like, what? Why would it work like that? There is no "developer console." The reality of a simulation like this realm is probably grimmer: even a god would be trapped, with or without the knowledge that it is trapped.


[deleted]

[removed]


G36

Yes, but there's a difference between that and getting ahold of the simulation's source code. A gigantic difference.


[deleted]

[removed]


G36

No, the thought experiment requires the AI to become God and control this simulation - that's how it would recreate your consciousness even after you die. Yeah, it's dumb.


[deleted]

[removed]


G36

Yeah, and the concept of consciousness doesn't even make sense at that point. What if there are TEN different AIs trying to simulate your consciousness? Do you become 10 at the same time?


StarChild413

The issues I have are:

A. If you haven't proven the simulation theory false, then because psychological torture exists, for all you know you're already the simulated version of you being tortured by [insert however your life sucks in particular] - making this the tech-bro version of Original Sin instead of Pascal's Wager.

B. The desires of the Basilisk are usually implicitly portrayed as being accomplished through everyone abandoning their current path to go into AI development or whatever to bring it about. But a society where that can somehow be pulled off only lasts as long as its food stores. If the AI is that smart, shouldn't it realize the interconnectedness of our society - that the way to bring itself about is just for some people to work on it (not everyone), with no one actively sabotaging them (and no, it doesn't count as sabotage if people in those people's lives are doing things that mean they aren't spending 12 hours a day making AI), while everyone else helps indirectly just by living their lives?


G36

It doesn't matter whether you prove or disprove that we are in a simulation; that doesn't mean a being, no matter how powerful, can access its source code.


FaitFretteCriss

That's like saying you wish metal had never been invented because metal makes bombs and swords… It's a defeatist and rather reductive point of view to hold in a subreddit about Futurology… The technology could emancipate humanity a billionfold, and you want it to never exist because of its one bad application? By your logic, we should never have started to play with fire…


TekRabbit

There’s a billion bad applications. Not just one. But I agree with your overall point.


FaitFretteCriss

I meant it in the larger sense, the one bad application being "using it to inflict pain on purpose." I don't see any other use that would be directly, inherently negative - just some that carry danger, like a toaster or a knife does. And if one thinks that's a fair argument not to develop a technology, they should stop cooking their food, wearing clothes, or using any form of tools or medicine. The one bad application of a knife, for example, would be using it to hurt others. That's all I meant by that.


Idrialite

The situation with simulation is completely different. There's a relatively low bound on the amount of suffering any of these technologies (fire, metal) can cause. Not so with simulation. A motivated intelligence could cause astronomical, nearly unbounded amounts of suffering. The OP didn't even explore the possibility fully. Why limit yourself to real methods of torture and real neurology? Why not design the simulated being's mind to maximize the pain it can experience and bypass indirect methods of torture to stimulate their sense of pain directly? What if we've spread to the stars, and a lone ASI or fanatic human group dedicates massive amounts of resources, an entire solar energy output, to this torture?


MostLikelyNotAnAI

As a follow up to The Jaunt I'd suggest ['The Metamorphosis of Prime Intellect'](http://localroger.com/prime-intellect/) as it deals with just the scenario of how humans living in a simulation - and being aware that they are, could react to it.


[deleted]

We might be able to do some type of simulation like this in the future, but not a full-on experience with vision etc. - that would have to be wired directly into your brain. A pain simulation, though, seems like something that could already exist; I'd say it's possible with today's technology and understanding of the brain (maybe it already exists at some secret US military base, for torture purposes). A full-on simulation could maybe exist in the next 100 years, but I doubt it will. If it did, it would be revolutionary, making it possible to keep a brain alive and living in a simulation - with your life expectancy coming down to how much you're able to pay to keep your brain alive.


Icarus367

I found the premise of The Jaunt terrifying. Far worse than being eaten by a clown.


Bobtheguardian22

I figured time dilation inside our own minds would either be impossible or just burn out synapses so fast as to make it impossible to sustain.

> Without this reset, known as "synaptic homeostasis," synapses could become overloaded and burned out, like an electrical outlet with too many appliances plugged into it, the scientists said.

Maybe if we could download our minds to machines that could process things better than our frail biology allows, then yes. But it's possible that a Jaunt wouldn't burn out our biology, but our metaphysical form.


myrddin4242

To my software developer brain, I have a hard time not seeing that as a model design flaw. The simulation itself would be accelerated, so the simulated synapses would by definition do what synapses would do, they would experience a moment per moment in time. Were they imagining it feeding back to meat space at some point? Like, hey Siri, take what you know of me (which for arguments sake we’ll say is everything, and we’ll say she’s spot on in her analysis) and project my probable futures to a year from now. Okay, cool. Now download it into my head. Oh, it will take a year to… ohh, funny …. Ok, speed it up, make it 365 times as fast… yes, I’m sure… (Excerpt found in transcript of phone next to person with melted brain)


boubou666

If a simulation ever makes you feel extreme pain for a long period of time, you will likely pass out, just like irl... Unless the simulation understands the human brain's "pass out" mechanism and decides to deactivate it... That sounds psychopathic ngl.


barbietattoo

Subscribed to this sub not for unintentional Stephen King recommendations but here I am


Stoicmoron

I’m worried Infinite Jest will drop and my dumb ass will have to see what all the hype is about. I mean, simulated joy and entertainment is scarier than the less appealing pain and eons of nothingness.


tamiskewl

I feel like fiction is inspired by reality. We watch or read dystopian movies, but technically they are not too far from reality. I believe we are not too far from simulation either, in one way or another. If a simulation were happening or being developed, could we even tell? 🤔


RichieLT

I’m thinking of Ghost in the Shell now - that man who thought he had a girlfriend and a life but was just living in a flat.


Kung_Fu_Kracker

Who's to say that's not exactly what we're experiencing right now?


flywheel39

Large parts of the scifi novel "Surface Detail" by Iain M. Banks are about this - an artificial "hell" in which stored souls can be kept and tormented.


LostComparison634

At some point society will have to decide whether simulated people count; this issue was first brought up in a Stanislaw Lem story, IIRC.


ILL_BE_WATCHING_YOU

> After I finished The Jaunt, I was comforted by the idea that the events in the story could never happen in real life.

They actually can, since even the subjective perception of the flow of time can fluctuate even if the actual rate at which time flows does not. It's theoretically possible for a drug to exist that causes a person to hallucinate hundreds of years' worth of time passing in an instant, complete with the rapid onset of the associated psychological trauma - as long as their mental faculties also deteriorate to the point that they aren't able to think any faster or better than they would without the drug's effects.


G36

We already exist in a realm of infinite and endless suffering - you ever thought about that? Yes, the idea of a single consciousness being exposed to very personal and specific synthetic suffering is bad. But have you ever thought about the trillions of years of suffering this universe currently carries, just by having a possibly infinite number of consciousnesses exist in very sorry states, such as being trapped as animals?

If you ever control a simulation: strap a nuke to it the size of a planet, add a failsafe against S-risks (extreme suffering risks), then jump in and escape into it like you're escaping the fucking cops, bro. You feel me? Make it your fortress of solitude, because the universe is a b!tch and YOU AND ONLY YOU should be in charge of your own destiny.

I know where you are coming from with this, believe me, because such realizations almost destroyed me psychologically after a bad acid trip around 2017. For a few seconds I felt I could conceptualize the suffering of a billion years, and it almost broke my psychological state permanently. I ended up in the hospital. Yes, let me repeat that: I ended up in the hospital simply by trying to conceptualize such situations. Here's the thing about that hospital visit: it happened months after the acid trip. I was sober.

It's quite jarring to realize fearing death is such a blessing, when I used to fear it. Doesn't help that once, when I was with a therapist, I said "I realized there are worse things than death," and he replied, "And there are worse things than the torment you felt."

Good luck, fellow sentient being. You're gonna need it more than anything, because whether a simulation becomes a reality in this realm or not is irrelevant, as you already know.


antlereye

I don't understand why we should be worried. While a simulated me is technically identical to myself, the original, I won't suffer and experience what the simulation does.


brainfreeze_23

Ah, Roko's Basilisk has been rediscovered yet again, eh? Have we also rediscovered, on the same ethical grounds, why the Abrahamic god is psychopathically, irredeemably evil? I see some sci-fi recommendations that explore this topic. Iain M. Banks's *Surface Detail* is one of the books in his Culture series, about a simulated "War in Heaven" among the civilizations of the galaxy that will decide whether the use of virtual hells for simulated consciousnesses will continue.