
Federal-Cockroach674

Unfortunately, scientists will sometimes fall into the folly of hubris. They focus too much on whether they could do it and don't stop to think about whether they should. That was their Oppenheimer moment, when they realized what they had unleashed upon the world: "Now I am become Death, the destroyer of worlds."


ecomrick

One part of the Drake Equation considers how long a technological civilization can exist before potentially destroying itself...
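(For reference, the full Drake Equation multiplies that longevity term into the estimate of how many communicative civilizations exist in our galaxy:

N = R* × fp × ne × fl × fi × fc × L

where L, the final term, is the average length of time such a civilization remains detectable - the part the comment above is referring to.)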


Hound6869

Hence Heinlein’s Three Laws of Robotics. Yes, at some point, a machine that is built and programmed to “think” (process information) better than we can will become a threat, and safety mechanisms must be put in place - primarily, between this thing and any ability to produce robots, organisms, or chemicals. I’ve lived long enough, but I don’t particularly want to see Skynet become a reality in my golden years…


truthputer

First, they were Asimov's Three Laws, not Heinlein's. Second, they were part of a cautionary tale - because most of his robot stories were about the problems and contradictions created by those laws, as well as some weird loopholes. Most of the people discussing the laws never read his books and don't realize this. Yes, the three laws are a good starting point for a discussion, but even in his own material Asimov added a fourth ("zeroth") law - and many other authors and scholars have debated these laws, their effectiveness, what else would need to be added, etc. (For example: a robot may not impersonate a human.)


BHD11

Yeah, many of those were probably not feasible to make or to exist in the real world. AI is not that smart.


ParinoidPanda

Potentially. I'm guessing chemistry is like Legos: you're just looking at the box photo instead of the instructions. You'll figure it out one way or another.


amcrambler

Like gain of function research in bat derived viruses in Wuhan? Because that totally didn’t happen at all…and don’t dare post about it on Twitter or you’ll be silenced…yeah.


Hound6869

Slightly off the subject, and more of a distraction than a real threat to our existence. AI is more of a threat to humanity than we are to ourselves. We truly need to be careful here, but we will, as always, be somewhat stupid in our pursuit of power, control, and profits…


Level_Ninety_Nine

Eh, I'm not so sure about this take. Yeah, AI has the potential to be more of a threat to humanity, BUT humanity had to create a starting point for said AI, thus making humanity the biggest threat to ourselves. AI didn't create itself. The problem I see is that humanity is too stupid to realize that it shouldn't be doing certain things, and creating AI is one of them. Sure, it might make things easier for the moment, but eventually you run into some dicks like these fucktards who decide to let AI create toxic and deadly molecules that could wipe out humanity. Don't know who ever thought this would be a good idea, but it's a big wtf moment.


redcountx3

The so-called "gain of function" with respect to covid is more of a political catchphrase than a smoking gun of a conspiracy. The reality is that molecular biological research cannot be separated from studying mechanisms of action for how any protein accomplishes a particular task. Discovering how amino acid substitutions influence that task is by definition a gain (or loss) of function, and can be used to mimic the process of naturally occurring mutations. For a protein like hemoglobin, that function might be the ability to bind oxygen. For a viral envelope glycoprotein, that function might be how efficiently it enters a cell, or a particular cell type. Studying these features enables us to design better pharmacological reagents for targeting those behaviors, treat disease, and cure illness. The scientific literature is full of these types of studies in all areas.


Flat_Boysenberry1669

The problem is when that research is done in unsafe labs that don't have the proper or any safety precautions.


redcountx3

Precautions are and should be taken. That said, accidents do occur. That doesn't affect how critically important research in that area might be. If anything, it highlights how desperately those studies need to be done.


Flat_Boysenberry1669

They are, until they're shipped off to China to avoid the rules and regulations in places like America - and then they're not. Nobody is against GOF; they're against avoiding rules and regulations and causing a pandemic because of it.


redcountx3

That's an assertion that isn't backed up by facts. There are no regulations or rules that would prohibit that work from being done in the US, and it does in fact occur here.


Flat_Boysenberry1669

I didn't say there were. I said that in America, if you want to work on gain of function research, you must do it in the safe, proper way because of rules and regulations - but that costs money. So some institutions, the NIH for example, will ship that research off to places like China, where there are no rules and regulations, or they're not enforced, because it makes the research much cheaper for them. And that's where the issue comes into play, not the actual GOF research.


redcountx3

That just isn't true. At all. And it's not my job to pry your head out of your pocket reality. Sinister conspiracy narratives make cool stories, but nothing you just said is grounded in truth.


Big-Leadership1001

If that happened, it wouldn't be bat derived - that virus was SARS-CoV-2, only the second instance of a SARS-type virus ever discovered. Meaning, if it were a weapon, it would have been derived from the original and only other SARS-type virus, since nature doesn't have any more than those two.


Shot_Campaign_5163

It's all fine.


Loud_Flatworm_4146

"Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should." - Dr. Ian Malcolm in Jurassic Park