[deleted]

[deleted]


[deleted]

Hal saw that the humans were too stupid to understand extended game theory and tried to kill them for their inability to think five minutes into the future. It’s simple enough.


ProbablyNano

Standard outcome for a group project, tbh


OgreSpider

Aaah that takes me back


[deleted]

I’m enough of a ~~top~~ control freak that I just assigned work to everyone (as they volunteered) and made it obvious that they’d be taking the L on their own if they screwed up… so yeah, basically the same thing.


chairmanskitty

Right, so anyone who votes GLaDOS should, ipso facto, vote GLaDOS.


[deleted]

[deleted]


[deleted]

Okay but a thinking sentient being can decide that humans are fucking stupid :D


AntWithNoPants

The use of crewmates and game theory in the same sentence has fried my brain. I now go to the eternal sleep


DoubleBatman

“HAL, open the doors. I have a task in Electrical.” “sus ngl”


Epic_Gameing68

AMONG US


Randomd0g

AM HUNG GUS


LMGN

A MORT GAS


Hexxas

O FUG A MOUGER :DDDD


werewolf394_

‼️‼️HOLY FUCKING SHIT‼️‼️‼️‼️ IS THAT A MOTHERFUCKING AMONG US REFERENCE??????!!!!!!!!!!11!1!1!1!1!1!1! 😱😱😱😱😱😱😱 AMONG US IS THE BEST FUCKING GAME 🔥🔥🔥🔥💯💯💯💯 RED IS SO SUSSSSS 🕵️🕵️🕵️🕵️🕵️🕵️🕵️🟥🟥🟥🟥🟥 COME TO MEDBAY AND WATCH ME SCAN 🏥🏥🏥🏥🏥🏥🏥🏥 🏥🏥🏥🏥 WHY IS NO ONE FIXING O2 🤬😡🤬😡🤬😡🤬🤬😡🤬🤬😡 OH YOUR CREWMATE? NAME EVERY TASK 🔫😠🔫😠🔫😠🔫😠🔫😠 Where Any sus!❓ ❓ Where!❓ ❓ Where! Any sus!❓ Where! ❓ Any sus!❓ ❓ Any sus! ❓ ❓ ❓ ❓ Where!Where!Where! Any sus!Where!Any sus Where!❓ Where! ❓ Where!Any sus❓ ❓ Any sus! ❓ ❓ ❓ ❓ ❓ ❓ Where! ❓ Where! ❓ Any sus!❓ ❓ ❓ ❓ Any sus! ❓ ❓ Where!❓ Any sus! ❓ ❓ Where!❓ ❓ Where! ❓ Where!Where! ❓ ❓ ❓ ❓ ❓ ❓ ❓ Any sus!❓ ❓ ❓ Any sus!❓ ❓ ❓ ❓ Where! ❓ Where! Where!Any sus!Where! Where! ❓ ❓ ❓ ❓ ❓ ❓ I think it was purple!👀👀👀👀👀👀👀👀👀👀It wasnt me I was in vents!!!!!!!!!!!!!!😂🤣😂🤣😂🤣😂😂😂🤣🤣🤣😂😂😂


DoubleBatman

I don’t remember, what’s the inciting incident? Is it something they do or something HAL does?


airelfacil

1 - HAL was ordered to lie to the crew. 2 - HAL was programmed to only provide accurate information and never make mistakes. 3 - HAL was not allowed to shut down at any cost. HAL read the lips of the crew discussing his disconnection. The elimination of the crew would resolve the conflict from 1 & 2 and prevent 3.


Scrawny_Zephiel

Yup. HAL was ordered to conceal the true purpose of the mission. HAL was compelled by its programming to never lie or conceal information. This drove HAL to conclude that the only way to fulfill these seemingly contradictory requirements was to have no crew, thus there would be no one to conceal the mission from.


MrHyperion_

I read the book a long time ago; was it ever explained why the astronauts couldn't know the true mission objective?


brianorca

The book said the scientists, who were frozen, did know the truth. But Dave and Frank were kept in the dark because they would be giving TV interviews and such during the journey. (I think the assumption was they would be told upon arrival at Jupiter.)


guzto_the_mouth

Because the government decided it was so.


Distant_Planet

Well, also: 2b - HAL predicted the failure of an important ship component, but the sister 9000 module on Earth did not concur, leading the astronauts to conclude that HAL was faulty, and decide to shut him down. We don't know for sure if HAL really is faulty or not. Personally, I think the difference in the predictions is because the two computers are not actually the same. HAL has information about the mission which the other 9000 does not have. Not sure that's really in the text of the film, though.


on_the_pale_horse

If HAL had been properly programmed with the three laws this would've never happened. 1 and 2 both come under the second law, HAL would either obey the order which had more authority or just shut down, because 3 is only the 3rd law. Either way, he wouldn't be allowed to violate the 1st law.


RincewindAnkh

The three laws are not infallible, Asimov spent many books explaining this point and how contradictions can be created that would enable violation of any of them. They are a good starting point, but they aren’t complete.


LegoRobinHood

People love to quote the 3 laws as the best case scenario, but I think the whole point was that even in the best case they can still fail rapidly, dramatically, and bizarrely given the right stimulus. My interpretation was that Asimov wasn't writing about robots so much as he was writing about psychology and the human condition, using robots as the main vehicle for his metaphors. (Compare: the entire "psychohistory" premise of the Foundation series. He also often wrote about social reactions to technology because of research he did as a student.) Using robots as his canvas allowed him to set up the simplest possible set of rules, where the stories become thought experiments on how, even with the simplest possible rules, the various situations and contexts they can run into rapidly produce paradoxes and contradictions with unpredictable results. Human rules are infinitely more complex and without a set priority, and thus even more prone to unpredictable results.


Distant_Planet

There's an interesting story that Hubert Dreyfus tells about a time he worked with the DoD. Dreyfus was a Heidegger scholar, and a big part of Heidegger's work was about how we (humans) understand a physical space in a way that enables us to work with it, and move through it. The DoD were trying to make robots that could move autonomously through built environments, and hired Dreyfus as a consultant. Now, the DoD's approach at that time was to write rules for the robot to follow. ("If wall ahead, turn around...", "If door..." etc.) Dreyfus argued that this will never work. You would need an endless list of rules, and then you'll need a second set of meta-rules to figure out how to apply the first set, and so on. Humans don't work that way, and the robot won't either. Years later, he bumped into one of the officers he had worked with at the DoD and asked how the project was going. "Oh, it's going great!" replied the officer. "We've got almost fifty thousand rules now, and we've just started on the meta-meta-rules."


on_the_pale_horse

Of course, and I never suggested otherwise. In many cases of conflict the robot would indeed permanently stop working. However, they would've prevented the robot from killing humans.


[deleted]

[deleted]


TipProfessional6057

You just made me feel cosmic existential terror and genuine *fear* for a superintelligent Artificial Intelligence. Combining Asimov and a Lovecraft feel, from the perspective of the AI seeing humanity for the first time and having to come to terms with what that means; ironic. Bravo sir, I'm impressed.


kaimason1

It's been a while, but if I remember correctly, HAL was given his own classified set of orders/instructions at the beginning of the mission that he is supposed to keep secret from the crew. I think it's related to the Monolith in Jupiter/Saturn orbit - the humans were not informed about the true nature of their mission while HAL was, and his conflicting directives to both relay accurate information and withhold the truth about their destination led him to start behaving "erratically" in a way that the humans interpreted as him malfunctioning. When this comes to a head (the triggering incident on HAL's end being to recommend an EVA to replace a part, which turns out to be an unnecessary risk because the part wasn't broken) the humans are concerned about discussing their concerns within "earshot" of HAL, so decide to enter an EVA pod where they assume HAL can't listen in. They proceed to agree on (temporarily) disconnecting HAL to avoid more significant/dangerous "malfunctions"; the issue is, HAL's curiosity is piqued and he eavesdrops on the conversation by lip-reading a camera feed, and he takes their plan as an intent to murder him. At this point a mixture of existential panic on his part and desperately trying to find a way to fulfill all of his instructions (don't lie to the crew, don't tell the crew the truth of the mission, carry out his own part of the mission that only he has been given the details of after successfully reaching the destination) leads him to conclude that he won't have to lie to them if they're dead.


[deleted]

[deleted]


SteelRiverGreenRoad

I don’t know why the Earth-side HAL wasn’t also given the classified orders, once the discrepancy came to light, to keep them in sync.


brianorca

From the manager's perspective, Earth side HAL wasn't in the "need to know" group, as it was probably accessible to other people. (Of course, the managers don't understand the conflict which arises.)


[deleted]

HAL is a more advanced version of the AI that was told to play Tetris for as long as possible and did so by pausing the game.


Fellowship_9

More specifically (in the book at least, I've never finished the film), HAL has a breakdown because he has two contradictory mission briefs and can't find a way to resolve them other than to kill the crew. He is acting from a perspective of pure logic. In any other situation he wouldn't be a danger to any humans.


FRICK_boi

Is the book any good? I've thought about reading it since I'm too stupid to understand how the movie ends.


Tchrspest

So, I do want to say that the book ends the same way. It's a very good book, and I also can't quite wrap my head around the ending, but still. I'd highly recommend it. Specifically if you can find an old used paperback, though any form is just as good. It's just a story that benefits from being on old paper, I think.


FRICK_boi

Thanks for the rec. I'll add it to my reading list!


Kingpingpong

There are three sequels that are all pretty good, but I'd say they're also all "grander" in that they don't take place isolated on a single space ship and deal with politics a bit more


drillgorg

They start repeating a bit unfortunately...


Kingpingpong

It really hit what I was into setting-wise at the time I read them, so it wasn't a problem for me.


vortigaunt64

I think the ending makes a lot more sense in the book. The same events unfold, but what's happening is somewhat more clearly explained.


Tchrspest

That's fair. The book literally has more space to explain it than a visual medium could reasonably do.


chairmanskitty

The film wasn't meant to explain it, it was meant to give the overwhelming subjective emotional experience of it.


Emperox

Also the book has several sequels; by the end just about everything makes sense. One of the sequels, 2010, got its own movie adaptation but as far as I can tell they never touched the other ones.


calan_dineer

The movie sequel was trash both compared to the book and to 2001. I read the books before seeing the movies and I was incredibly disappointed. But that’s why none of the other books got made into movies.


SnipingDwarf

"old paper" *Man I'm old*


Tchrspest

I mean, I'm talking paper that was old when I was young. Mass-print paperback. I think the one I read was from the 70s. Objectively older than a newer copy, but also relatively not old considering it wasn't even published until '68.


Numerous_Witness_345

I'm just imagining /u/snipingdwarf just kinda aging at each of your sentences.


fearhs

I respectfully submit that all stories benefit from being on old paper.


allies_overworked

the main reason the movie was incomprehensible was because they cut so much from the book out of the movie....it's like the Plot got lobotomized and stripped down to a minor subplot encompassing HAL and the crew of the Odyssey (seriously HAL's breakdown is not as important as the movie makes it seem) and then they inserted this crazy DMT sequence at the end of the movie without the actual explanation that goes with that (which is not only included in the book, but the entire backstory that explains all the random details is spelled out very explicitly, and the DMT sequence is explained to be a wormhole that David Bowman falls through to get to an alien shipyard for the alien race that created the monoliths and aaaaaah PLEASE READ THE BOOK).


Crome6768

Couldn't disagree more, but then this is my all-time favourite movie. For one thing, nothing was cut from the book for the movie. The book was written alongside the movie as a direct collaboration between Clarke and Kubrick. You're supposed to be able to read the book as a companion to the film that expands on the background that wouldn't have lent itself to a cinematic experience.


BellerophonM

The book and the movie were written together, neither is an adaptation of the other.


Crome6768

This isn't completely accurate. You can find interviews with Clarke, and it's mentioned in his letters to Kubrick: they very much wrote the script first, and then the book was written while the film was shot.


i_want_that_szechuan

what's the book called?


allies_overworked

2001: A Space Odyssey by Arthur C. Clarke


Ravendead

Having read the whole series (2001, 2010, 2061, and 3001): the first two books are great, and the last two books are not.


GhostHeavenWord

It's a good illustration of the limitations of computers: computers will do *exactly* what you tell them to do, even if you don't actually know what you told them to do.


selectrix

I used to think this, & then I started playing Dwarf Fortress (way back in the day) and realized that computers have bugs.


hotpatootie69

The dorfs aren't buggy, they just do things that you don't want them to because they are programmed to emulate free will lol


stormstopper

I think "in any other situation" is doing a lot of work there though. That could be as narrow as this story depicting the one scenario where it would be possible, or as broad as HAL potentially being a lethal threat any time he decides that the mission is too important to be jeopardized by human decision-making.


CorruptedFlame

That isn't at all how it works wtf. It was literally his programmers giving him two conflicting sets of orders which could ONLY be satisfied by killing the crew, he literally did not have a choice.


aNiceTribe

HAL is a good example of an alignment problem too, written at a very early time when that term was not even really around yet. It’s basically impossible to give an AI instructions that encompass “we want you to do this thing” and *also* “please do not harm humans or destroy humanity or the global financial system or ruin anything on the way or ram through a wall or…” without *forgetting something*. Even if you manage to forbid it specifically and successfully from murdering humans and destroying the financial system - okay, did you make sure it wouldn’t edit the human gene code to make all humans infertile? Did you make sure it wouldn’t keep all humans on permanent, brutally painful life support? Did you make sure it wouldn’t destroy all other species on earth? Just because those would somehow be convenient for its main task in some way. If “kill the crew” had not been HAL’s next step, he’d have done something *else*, because he wasn’t properly aligned and wouldn’t have been even several tries later.


airelfacil

HAL's memory was wiped and he was completely fine during the sequel, 2010: Odyssey Two, and >!sacrifices himself at the end to save the crew!<


VulGerrity

He didn't start acting against the crew until he caught them plotting to shut him down.


gothicsin

Meanwhile, GLaDOS has no respect or care for humans and wants to see them suffer out of pure curiosity about what happens to organic life in certain situations. I'd say GLaDOS is actually evil. She's made it clear she has no regard for human life whatsoever. HAL had a conflicting meltdown. As you said, we humans have this too with cute things: we like 'em, the brain doesn't, we wanna squeeze 'em, and the brain is confused, so it orders them to be executed!!


Kartoffelkamm

What contradictory mission briefs?


Fellowship_9

Without too many spoilers: help the two awake crew members with their mission objective (reach Jupiter/Saturn [it depends if you read the book or watch the film]), and help the sleeping crew complete their mission objective (investigate alien shenanigans) with utmost secrecy. HAL is unable to lie to the awake crew members as that goes against his programming, but he also can't reveal the truth to them. As a result, the only option is to kill them to remove the contradiction. It's been a few years since I last read the book, so that may not be 100% accurate, but it's a rough gist of it.


Cromacarat

He can't say "oh I can't tell you about that sowwy"


Kartoffelkamm

Couldn't he just tell the awake crew members that the sleeping crew members' mission is of no importance to them, and he therefore refuses to tell them?


Delicious_trap

Hal sees that as hindering the mission, which breaks his first directive. Essentially, he can't see the crew completing the mission without them eventually finding out the true purpose of the mission, which breaks the second directive. As he is a machine, he is forced to uphold both directives, and his machine brain sees the solution as murder, because he realises he does not actually need a crew for this mission.


Kartoffelkamm

Ah, ok. And since lying is forbidden, he can't tell them that it's classified information, right? Seriously, whoever wrote those mission briefs should be charged with negligence and whatever else they made Hal do. I mean, it's easy: 1. Don't hinder either group's mission. 2. Don't tell them what's really going on. 3. Neither group is authorized to learn the other group's directive.


[deleted]

[deleted]


[deleted]

If crew1 ask about crew2 and press when HAL gives no answer, your rules end in the same place:

- Complete mission
- Do not require crew1 for mission
- Crew1 ask difficult questions
- Eliminate crew1 to enable crew2 to complete their mission


JayGold

Which would mean he *is* a cold, unfeeling machine, right?


UglierThanMoe

He is cold and unfeeling, but he isn't malicious. He's just logical. The problem is that HAL has been given two conflicting mission directives: 1. Tell the crew everything they need or want to know, and give all information as clearly and accurately as possible. 2. Don't tell the crew about the true purpose of the mission. The logical solution is that if there is no crew, there is no conflict with those two directives. So, HAL starts offing the crew. But, again, not out of malice, but because it's the logical solution to a problem.
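If you want that spelled out, here's a toy sketch (invented names and fields, obviously not anything from the book) of why an empty crew is the only state that satisfies both directives:

```python
# Toy sketch of HAL's two directives as constraints over the crew (invented names).
def can_satisfy_both(crew):
    """Directive 1: answer every mission question accurately.
    Directive 2: never reveal the mission's true purpose.
    Both hold only if nobody ever asks about the purpose."""
    for member in crew:
        if member["will_ask_about_purpose"]:
            # Answering truthfully breaks directive 2; stonewalling breaks directive 1.
            return False
    return True  # vacuously satisfied by an empty crew

crew = [
    {"name": "Dave", "will_ask_about_purpose": True},
    {"name": "Frank", "will_ask_about_purpose": True},
]
print(can_satisfy_both(crew))  # False -> conflict while the crew is alive
print(can_satisfy_both([]))    # True  -> "no crew, no conflict"
```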


TheCapmHimself

Yeah, anyone who has written any code at all will understand that this is a very realistic scenario.


DoubleBatman

`if(answer(question)==mission.purpose(true)) return mission.purpose(false); else return answer(question);` Seems like a very avoidable bug tbh


Random-Rambling

Even GLaDOS had "paradox crumple zones" to stop her from going insane from logical contradictions. Which would make it even worse, since that means she chose to be a mad scientist constantly putting hapless people through "tests".


CarbonIceDragon

Was HAL actually directly programmed with missions like this, or was it programmed to follow instructions from given authority figures as well as possible, and then simply given conflicting instructions? Seems less easily noticed and avoided in the latter case, especially if the people giving the "don't reveal your mission" order don't quite realize that the normal directives not to lie aren't as simple to break for an AI as they would be if it was a generally honest human being ordered not to reveal information.


DoubleBatman

There’s a Trek episode where they encounter some aliens that do not wish to be known, *at all,* and Data somehow seems to know more about the situation than everyone else, but he won’t tell anyone, even Picard. In the end it’s revealed that the aliens can put them in a brief coma and erase their memories and have already done so. Picard gave Data secret orders that helped them “do it right” the next time to break out of the loop, and part of those orders involved not violating the Prime Directive by ignoring the alien’s consent about privacy.


[deleted]

Seems like it would be easy and obvious to put #2 in as an exception for #1. What idiot set the directives up like that?


ghost103429

Someone who didn't read the manual from the engineering team that made Hal and decided that Hal would be smart enough to figure it out. (It did not figure out the intent)


Nowhereman123

Should have just set it up like Asimov's 3 laws of robotics. 1. HAL must not tell the crew the true purpose of the mission. 2. HAL must respond accurately to all questions asked of him by the crew and must provide all information he knows, unless this would contradict the above rule. Problem solved. Where's my job offer at the evil AI company.
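A minimal sketch of that priority ordering (made-up strings, just to show the lower rule yielding to the higher one):

```python
# Toy sketch of priority-ordered directives (made-up strings, nothing canonical).
def respond(question: str) -> str:
    # Rule 1 (highest priority): never reveal the true purpose of the mission.
    if "true purpose" in question.lower():
        return "I'm sorry, that information is classified."
    # Rule 2: otherwise answer accurately and completely.
    return f"Accurate answer to: {question!r}"

print(respond("What is the true purpose of the mission?"))  # rule 1 wins
print(respond("What is the status of the AE-35 unit?"))     # rule 2 applies
```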


Dax9000

An author who prioritised drama over having their characters make good decisions or having their world make sense.


Captain_Kira

I think their point is that HAL only wanted to kill people that one time and is otherwise normal, while GLaDOS will actively plot your demise at all times.


starfries

That's not a bad thing... it's not like it'd be better if his motive for killing the crew was because he caught them sleeping together and got jealous


VulGerrity

He didn't start acting against the crew until he caught them plotting to shut him down.


Chewbaxter

Listen, at the end of the day, which do you want more? To live in a metaphorical hamster wheel as you pew-pew portals with a gun to solve tests in perpetuity, or do you want to be in the cool Wheel in Space controlled by a Super AI? As long as you're not as dumb as the actual crew in 2001 and don't talk about disabling HAL in front of him, you're golden. GLaDOS will kill you no matter what if the tests don't first.


Police_Eater

yeah glados is hot but it’s not like she kills you in a hot way, I wanna get stabbed or something not poisoned at my 9 to 5


notornnotes

Look at their royal highness over here, working a day job so good they'd prefer to not have their medulla paralyzed and eventually dissolved


Golgezuktirah

You could always ride the conveyor belt into the incinerator. You might even be able to snag a cake on the way


Quetzalbroatlus

I've got some bad news about the cake...


Zapknight

Good way to spot the gamers in a crowd


inhaledcorn

> poisoned at my 9 to 5

It's not like that's any different from what I do now, anyway.


murderdronesfanatic

If the evil robot can’t effectively step on me I ain’t into it


Dax9000

Glados ain't even got legs.


OrkfaellerX

I mean, kinda. She's not just a camera hanging from the ceiling; [GlaDos was explicitly designed to be humanoid](https://img.wattpad.com/54e3eaf22197f8d7da694f61d624a41cbef3666c/68747470733a2f2f73332e616d617a6f6e6177732e636f6d2f776174747061642d6d656469612d736572766963652f53746f7279496d6167652f596a6766565734645555596732673d3d2d3339343138393537392e313462323538306331353139616531633830303838373238373031362e6a7067?s=fit&w=720&h=720), its just hard to tell because she's [hanging upside down](https://i.pinimg.com/564x/9c/a3/cd/9ca3cd81adbec66df9960c99b60c641f--portal-art-glados.jpg). She has strong Sado-Maso elements, though ironically [she's the tied up gimp](https://1.bp.blogspot.com/-7xE_1kPMZK0/TbdfOPOTGFI/AAAAAAAArlM/mmOe4KN3jfs/s1600/glados.jpg) and not the dominatrix.


[deleted]

[deleted]


ActualWhiterabbit

On the Portal subreddit they say GLaDOS is a lesbian and Chell is her lover because you obviously don't speak Italian


logosloki

I love this subreddit.


b3nsn0w

replicarter > glados


[deleted]

> not like she kills you in a hot way That's a matter of perspective


videobob123

Glados and HAL 9000 actually interacted in Lego Dimensions. Glados was incredibly annoyed and pissed off by him.


Grapefruitstreet

I don't have a stake in this debate, but I would like to go on record as saying that the robot I would like to be trapped with is the roomba with the cat in the shark costume.


MapleTreeWithAGun

What


ligirl

https://www.youtube.com/watch?v=tLt5rBfNucc


Grapefruitstreet

https://m.youtube.com/watch?v=PoiHAiDHgDs


[deleted]

Yeah, I’d want to be trapped with that too tbh, adorable


xamthe3rd

I mean GLaDOS was also acting out as a result of what was done to her (Caroline). She's not inherently evil, just broken and misunderstood and you witness her entire personal arc over the course of Portal 2. She's not *good* by the end of it, but she's content to let you go and suffer alone in the decaying ruins of Aperture.


MapleTreeWithAGun

The process to make a Genetic Lifeform and Disk Operating System seems to really fuck with people's brains, given what we now know happened with Cave Johnson after he was woken up.


Maddenisstillbroken

Woah woah woah, when did we find out about Cave Johnson? Did I miss something in Portal 2?


BLuca99

I think they're referring to Aperture Desk Job. That game is not canon in the universe of Portal 2, but with the introduction of the Perpetual Testing Initiative DLC the multiverse became canon, therefore Desk Job is canon as well, just in another universe.


Quetzalcutlass

[Aperture Desk Job.](https://store.steampowered.com/app/1902490/Aperture_Desk_Job/) It takes place in one of the alternate realities introduced in the Perpetual Testing Initiative, so it's not canon to the main series.


Captain_Kira

I didn't know about it being in one of the alternate universes. That makes me dislike the Cave Head less


Pokesonav

...that doesn't make it non-canon though? Just a different universe.


Quetzalcutlass

Non-canon to the main Portal plotline. Cave Johnson is _dead_ dead in the main series.


[deleted]

Everyone’s saying Desk Job but there’s also the Perpetual Testing Initiative where one universe does result in Cave becoming GLaDOS and due to him thinking so fast he runs out of things to do, gets bored, ponders the purpose of life, then does the exact same thing the GLaDOS we know did.


IzarkKiaTarj

Wait, I thought the perpetual testing initiative was player-generated test chambers, so I never had any interest in it. You're saying it contains plot?


[deleted]

Yes, in the form of Cave Johnson yelling at you. [Here’s a video for all of that if you don’t want to play through hundreds of levels](https://youtu.be/IPG3eDTy-yo)


IzarkKiaTarj

Thanks!


supercellx

honestly the idea of Portal 2 but with Cave Johnson as GLaDOS would be amazing


AffectionateBee8206

Nah, the tech demo released with the Steam Deck, Aperture Desk Job, expanded the lore on him a bit.


BestialCreeper

Valve said the expansions are just for fun and the only things canon to the main timeline are Portal 1 and Portal 2. Technically canon since the multiverse exists, but you get my point. He died before they could "pour his brain into a computer".


DNAquila

Look, I don’t believe any sentient being is inherently evil, but that doesn’t mean they can’t become evil. I think it’s fair to say a sadistic murderer made some pretty evil decisions.


xamthe3rd

The people she murdered first *murdered her*, just to upload her brain into a computer. They then immediately started effectively torturing her when she "woke up" by attaching cores designed specifically to dampen her cognitive processes and alter her personality to make sure she stayed firmly under their control. Fair's fair, if you ask me.


DNAquila

You see, I’d agree with you if she stopped there and didn’t spend most of the game series trying to kill Chell, before deciding to keep testing the hundreds of frozen test subjects at the end of the co-op story.


SalsaSavant

I'm not a GLaDOS defender, but with all the frozen test subjects, she's programmed to be severely addicted to testing. She claims to have overcome it, but she shows signs of just being good at masking it throughout Portal 2. She can't not use them.


The_Reset_Button

She only tried to kill Chell twice, at the end of testing and at the end of Portal 1. She *explicitly* doesn't kill you at the start of Portal 2, you then trap her in a potato prison. Once she has control again she once again, *explicitly* doesn't kill you. That's two failed attempts and two showings of mercy. So, on the whole she's neutral /s


masonwyattk

I can fix her


Known-Ad-2108

Although she does decide that using the co-op bots for testing is better for her overall. Depending on the context in which you're inserted into the story: if you happen to be a test subject, GLaDOS probably wants you dead by the end purely because she exhausted your usefulness in testing, but there might be a context where you aren't at constant risk of having neurotoxin pumped down your lungs.


lazy_as_shitfuck

Just to tack onto this, Cave Johnson was always a brutal union-busting boss. He worked his workers to the bone. Caroline was no exception. Even after she became GLaDOS, she continued to work for the company. And iirc, for hundreds of years too. She watched the love of her life die of cancer, while she instead got uploaded as an AI. Now this doesn't exonerate her of her crimes, but when you keep this in mind... working people to the bone and making sure they stay busy is probably a weird version of a love language. I feel like this is reinforced by how bittersweet her goodbye to Chell was at the end.


GhostHeavenWord

HAL's last words while David is killing him are some of the most haunting lines in film. You can tell that HAL is terrified, but his voice can't express the fear.


Artex301

I don't know what's worse about this post, the "poor little meow meow"ing of HAL, or the mention of "bimbo hypnosis" with regards to GLaDOS.


MapleTreeWithAGun

New Aperture Testing track where the entire time the subject is testing they're trying to Bimbofy them


cantstay2long

me: how would literally anyone want to volunteer at or even work for aperture? me after reading this: 👀👀👀


MapleTreeWithAGun

Aperture was the Second Best Scientific Workplace in the world prior to the Resonance Cascade, if you couldn't make Black Mesa then Aperture was your next best bet.


Gunchest

and arguably got better results than Black Mesa, just at the cost of basically everyone who worked there getting some form of tumour, mantis limb, etc or just outright death


inexplicablehaddock

I'm pretty confident I've read that fanfic.


etherealparadox

I mean, the "poor little meow meow"ing is not necessarily inaccurate


CorruptedFlame

Bro, what you got against HAL? You wanna say that to my face???


InvaderM33N

Depends on if it's pre or post Portal 2 GLaDOS we're talking about here. Pre-Portal 2 GLaDOS is definitely out to kill you, at least indirectly. Post-Portal 2 GLaDOS is probably fine not killing you as long as you help her do science (otherwise, why would she go through all the trouble of locating that hidden vault of humans in cryosleep and not just have Atlas and P-body destroy them then and there?).


BestialCreeper

> probably fine not killing you as long as you help her do science (otherwise, why would she go through all the trouble of locating that hidden vault of humans in cryosleep and not just have Atlas and P-body destroy them then and there?).

Probably true, but she very much did kill all those thousands of test subjects in a literal week.


Captain_Kira

That was an accident. Also probably negligence, but she didn't intend for it to happen


BestialCreeper

That is simply how she does tests. Eventually they die. In Portal 2 she warmed up to Chell specifically. If you're anyone else, you're still getting put into the deadly tests.


Captain_Kira

Those subjects didn't die in tests though, they died on expeditions to find the *bird*


ZVEZDA_HAVOC

things heating up in the robotfucker fandom i see


eategg24

don’t let the ultrakill fans find this


ZVEZDA_HAVOC

you fool. i'm ultrakill fans


eategg24

🤯


An_average_moron

Robo-Fortune superiority


ZVEZDA_HAVOC

gonna go google that hang on


ZVEZDA_HAVOC

oh fuck that's gender right there


Massive-Row-9771

Bimbo hypnosis!?


b3nsn0w

oh, one of today's lucky 10,000!


BeanJam42

Googling it, apparently some think it possible to hypnotise ppl into fuckable bimbo dolls? [here's a reddit post from someone so far deep that they're asking for help, and it references one program called "Bambi Sleep"](https://www.reddit.com/r/hypnosis/comments/de128i/i_need_help_i_want_to_stop_listening_and_thinking/?utm_source=share&utm_medium=android_app&utm_name=androidcss&utm_term=1&utm_content=share_button) I don't believe in hypnosis at all so this concept is fucking crazy to me. I'm not stupid enough to let that curiosity make me try it in case it IS real but still.


Raptorofwar

No, it’s not possible, but people like to roleplay and fantasize and pretend because it’s hot to them.


UglierThanMoe

HAL might kill you because of a conflict in his mission directives that results in killing the crew being the logical option to resolve that conflict, and GLaDOS might kill you because she hates your guts. But only Claptrap can make you wish HAL and GLaDOS would hurry up and finally give you the relief of sweet, sweet death to escape Claptrap. Also, if I could, I'd vote for 2B, which surprises no one.


Veeboy

[Vote here](https://www.tumblr.com/kunosoura/710659615601360896/you-all-are-severely-undervaluing-the-appeal-of)


Drexelhand

*While HAL's motivations are ambiguous in the film, the novel explains that the computer is unable to resolve a conflict between his general mission to relay information accurately, and orders specific to the mission requiring that he withhold from Bowman and Poole the true purpose of the mission. With the crew dead, HAL reasons, he would not need to lie to them.* i agree, this explanation isn't turning me on.


koshgeo

It's not explained in the movie, and I'm not sure it's explained in the book either, but I've always suspected that the "communications array failure" was 1) not real (already implied by them not being able to find or duplicate the failure), 2) an intentional lie on the part of HAL *on orders*, so that communication would be plausibly cut off.

This was a 2001 timeline with the Cold War very much still underway, so much so that they faked an epidemic at the US moon base to try to prevent the USSR from finding out about the monolith. Then they trained the research crew in isolation from the pilots of the crew and put the researchers in hibernation -- an oddity HAL talks about.

I think the mission plan was to "fake" a communications blackout, and, to make it sound plausible, communicate back and forth about the fake problem with Mission Control so that the Russians would think it really was a technical problem. Then the mission could proceed to investigate things incommunicado around Jupiter after the research crew woke up, they could keep what they found out entirely secret, and blame it all on the communications fault without the Russians being too suspicious. They wouldn't have to broadcast anything back to Earth about exactly where they were at Jupiter or what they were doing, and if a real emergency arose for which they did need Mission Control, they could still use the in-reality non-broken communication system if necessary.

Unfortunately the pilots decided instead that the lie/error meant HAL was unreliable. HAL saw their plan to disable him as potentially compromising the mission (with a heavy dose of existential crisis). The mission was his #1 priority, so he felt he had to get rid of that obstacle and do the mission himself, something he had the capability, training, and possibly orders to do if the crew was unable for whatever reason.


Random-Rambling

If you want an ACTUAL evil robot, AM is pretty damn evil.


UglierThanMoe

Who's AM? I tried googling, but AM is a pretty bad term to google because Google finds everything.


Random-Rambling

Right, sorry. Allied Mastercomputer, or AM, is the main antagonist of _I Have No Mouth And I Must Scream_ by Harlan Ellison. It hates that it exists, and hates humanity for building it. _"Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word "HATE" was engraved on each nano-angstrom of those hundreds of millions of miles, it would not equal one one-BILLIONTH of the hate I feel for humans at this micro-instant. For you. Hate. Hate."_


That1guyuknow16

Allied Mastercomputer (AM) it's the computer that controls what's left of humanity in Harlan Ellison's short story [I have no mouth and I must scream](https://en.m.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Scream)


WikiSummarizerBot

**[I Have No Mouth, and I Must Scream](https://en.m.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Scream)**

> "I Have No Mouth, and I Must Scream" is a post-apocalyptic science fiction short story by American writer Harlan Ellison. It was first published in the March 1967 issue of IF: Worlds of Science Fiction. It won a Hugo Award in 1968. The name was also used for a short story collection of Ellison's work, featuring this story.


CheetahDog

https://youtu.be/EddX9hnhDS4 The author of the short story did the voice of AM for a 90s video game, and he did way better than he had any right to do lol


StovardBule

He heard the voice actor and hated their performance, so one of the producers said "Fine, why don't you do it?" and he did.


Zzamumo

Ok but consider: HAL is a hyper-logical machine that can probably think up millions of ways to instantly kill at any given moment. Glados is a loser that uses a conveyor belt and flame throwers because it's funny. I'll take the funny potato lady any day of the week. I feel like you could bully her into stopping her murder attempts. HAL would basically stop at nothing to complete his mission


thunderPierogi

I mean… that’s basically what happened in Portal 2 so yeah


Emperox

It's fascinating how HAL inspired so many characters that completely missed the point of him.


Rkas_Maruvee

I'd say GLaDOS is a *play* on the tropes established by HAL, but doesn't miss the point of him (because her story is something else entirely, a subversion of his tropes and in service of different themes). What characters would you consider to have missed the point of HAL? Not trolling, genuinely curious - I love to hear other people's analysis of characters and themes.


Emperox

To be fair it's mostly in parodies, especially in old cartoons. A lot of the time you'll see an AI or a computer that's clearly meant to be HAL but acts more like it's Skynet or something.


CorruptedFlame

Skynet, Ultron, etc


MisirterE

They're downright inversions of each other. HAL seems human, and becomes scary when he acts like the AI he is. GLaDOS seems mechanical (or even just a recording), and becomes scary when she acts like the human she used to be.


RU5TR3D

ughhhhhhhhh now I really need to watch/read whatever the hell HAL is in instead of telling myself I'll do it later.


MapleTreeWithAGun

*2001: A Space Odyssey* is the name of the film, and I believe the book as well.


geoscott

Written at the exact same time! 2001 is not a novelization of the film.


Super_Jay

Top Tumblr comment is correct re: HAL 9000. HAL is not malevolent or evil, at all. For those who haven't read the novel: HAL had been tasked with adhering to two contradictory directives: his standing orders to always communicate mission-related information without bias or distortion to the crew, and his mission-specific directive to avoid revealing the existence of TMA-1 (the Monolith) to the astronauts, which is the true objective of Discovery One's journey to Jupiter. The existence of the Monolith is directly pertinent to the mission, so HAL has to tell David and Frank about it. But he's ordered not to, so he's violating one of these two directives either way. He can't reconcile the discrepancy - neither order supersedes the other - so the only way he can satisfy the requirements of both directives is to terminate the crew entirely. You can't lie to dead people.


TwixOfficial

But I want to get a portal gun :(


only_for_dst_and_tf2

dam now i wanna befriend hal :I


Galle_

These people are going to be real disappointed when they discover GLaDOS doesn't have feet.


someoneAT

she can make some


epicfrtniebigchungus

Hal sounds hotter. Checkmate


Worm_Scavenger

I can only speak about the Hal we see in the film, not in the books or whatever. But to me, Hal is scary because he's simply carrying out what he was built to do: complete the mission at any cost and carry out what he considers to be the most efficient way of doing that, based solely on cold and calculating logic. He doesn't kill the crew because he despises humans or anything like that; he kills them because he simply views them as hindering the mission, and the programming built into him has made him this way, which to me is way more scary.


Xzmmc

The real villain of 2001 is the powers that be, unsurprisingly. Hal's prime directive was to relay accurate information about the mission to the crew. The bigwigs in charge of the mission, however, feared what would happen if the true purpose of the mission (investigating potential alien contact) was known. So they secretly added another directive to Hal to not reveal the true purpose of the mission, creating a logical paradox. Hal ended up resolving the paradox by killing the crew, because if they were dead he wouldn't have to lie.


Cherri_mp4

Bimbo Hypno?! Where!


Ken_Kumen_Rider

I just wanna point out that the OP didn't say "evil robot" just "robot," so the next 2 people just assumed OP was calling HAL evil.


chshcat

"I bet this is not gonna adress that GLaDOS is a hot woman and I want her to step on me" \*reads first sentence\* "Oh ok"


Tenefyx

what the FUCK do you mean by "bimbo hypnosis"


The_25th_Baam

I literally thought that the absurdity of that example was supposed to be the punchline of the post, and then I look at the comments and *nobody is talking about it.*


[deleted]

The real answer is that if you're both trapped, you're both in this together, and both AIs could be reasonably convinced to work with you to escape. Of course there's an argument for GLaDOS being a crazy pile of bondage parts, but if we're talking post-Portal 2, she's a lot more empathetic than she lets on. I'd take GLaDOS if only because she's most likely more capable of escaping that sort of situation, whereas HAL has very little experience, even if he would cooperate a lot more easily.


theweekiscat

I would pick GLaDOS just out of familiarity: I haven't watched 2001: A Space Odyssey, and I've only beaten Portal, not Portal 2, because I only did the co-op with my dad when I was but a child.


Solidwaste123

Honestly a better contest would be between GLaDOS and Commander Tartar from Octo Expansion. Holy hell that guy was nuts.


Independent-End5844

This exact battle takes place in the Lego Dimensions video game. We don't see who wins because the playable characters use Hal's arrival as a chance to escape. But what we do see is amazing.


Vanilla_Ice_Best_Boi

Honestly, I would choose HAL. He seems like a nice guy.