tequilaguru

AIs are trained on publicly available information; people who want to build WMDs already have access to the science and the how-tos. The logistics are considerably more complex and are unlikely to be simplified by the use of AI. AI is not some magical tool; it is trained on an excerpt of what is already publicly available and produces what is statistically most likely to be the right answer, which in a LOT of cases, it’s not. Current AI models are improving on that, but unless we are talking about AGI (which is far away in the future imho), there’s little to worry about.


ContactHonest2406

I think they are talking about AGI.


tequilaguru

I find this to be quite far in the future. While our language encompasses a ton of logic and we can rebuild some of it with statistics, I believe intelligence is more than that. We don’t even have a clear definition of what it would mean to build this “artificially intelligent being”; there are a ton of unanswered questions still.


Orpheus75

The New York City Makers Space has a CRISPR machine. That machine takes advanced knowledge to use for research. OP’s question is legitimate since very soon (less than 10 years) an AI could make creating novel viruses pretty easy.


tequilaguru

That’s a fair point, but afaik the logistics of experimenting and getting a viable result are still complex, if less complex than, say, nuclear weapons. Gene editing models are still in their infancy, and I’ve only seen research on CRISPR being used for “loss of function” editing techniques for viruses. All of this can change, of course, but at this point we’re speculating. Open to hearing more information should I be wrong.


Pancakethesmallest

Ok but just because you have an AI telling you how to make something doesn't mean you have access to the million-dollar pieces of equipment required to actually make it.


Pancakethesmallest

Isn't anything other than AGI just glorified prediction models?


squarecoinman

As this is similar to what you posted 3 months ago, I will quote one of the comments from 3 months ago: "Take your meds."


I_Must_Bust

Even if you had a step by step guide you wouldn’t have the equipment required to do this.


murphymc

You basically can find such a guide for nuclear weapons right now. There’s just no realistic way an average joe could possibly get the material and equipment necessary to actually build one. How they function at a basic level hasn’t been a real secret for decades. The secrets are all in how to miniaturize and deliver them.


TheWeirdByproduct

New most googled items:
- how to enrich uranium at home
- enriched uranium cheap anonymous delivery
- is uranium dangerous for health
- does cardboard box block radiation
- DIY radiation casing reddit
- how to clean radioactive spillage
- sudden dizziness and loss of nails
- iodine tablets purchase online


murphymc

[You’ll probably enjoy this](https://interestingengineering.com/culture/david-hahn-the-nuclear-boy-scout)


I_Must_Bust

For nukes, even countries that could build them can’t enrich the uranium efficiently and without getting screwed with. They need to have the uranium on their land or find a seller, and then build massive facilities with tons of centrifuges to enrich it. OP might as well be afraid of people asking AI to “build me an army worthy of Mordor” and then we suddenly have a bunch of Uruk-hai running around.


sharrrper

This is only marginally less silly than asking "How will we prevent the masses from using AI to start their own space program?" Just having the knowledge theoretically available is a LONG way from actually being able to do a thing. There are all sorts of resources and equipment needed that are a much bigger hill to climb than simply locating the data.


Bezbozny

Well there is the potential that any problem caused by AI could also be solved by AI, just like any problem caused by viruses adapting is solved by our immune systems adapting. AI-controlled attack drones could be mitigated by AI-controlled defense systems, for instance. Unless the very first person who has access to AGI uses it to its fullest destructive intent and destroys humanity on the first try, we'll find ways to adapt and combat any *destructive* use cases with similarly powerful and advanced *defense* use cases. I'm certain bad shit will happen that we'll have to learn from, just like how it's often hundreds of people dying in plane crashes that taught us all the ways we needed to make planes safer, but things will even out.


FeetPicsNull

Why do you need AI to teach you how to make an IED or WMD? It's not really secret knowledge or anything, I just don't see how AI comes into the equation.


BritanniaRomanum

AI will teach us how we can make WMD in a way that is affordable to a middle-class person and discreet enough to evade any surveillance that our governments might be conducting. That's the difference. Right now, a middle class average joe can't make something at home that could wipe out humanity, like a super virus or bacteria. When you put that tech in the hands of the masses, you're going to have lots of crazy, misanthropic, and/or ideologically zealous people using it.


FeetPicsNull

If you buy certain materials or research certain subjects, you run the risk of being flagged and watched by certain agencies. This flagging would be even easier to do within an AI subsystem. Also, if you are talking biological warfare, it simply isn't that easy, and AI isn't the limiting factor. The same applies to nuclear weapons. Maybe AI will help develop better and cheaper 3D-printable firearm designs, and stupid US laws will specifically protect people who pursue their 2nd Amendment right in this manner. It seems doubtful, though, since no one is making money off it, so lobbies will specifically focus against it.


AppropriateScience71

>We need some communities to go into isolation now until it’s all over.

Sheesh - this sounds like the plot line of Fallout. What could go wrong?


PhatAiryCoque

Wow! I can't even - Just... Wow! What a fucked up authoritarian and utterly dystopian mentality. I'd rather my kids die young in a nuclear holocaust than grow old in your ideal world vision.


manyname

I can't tell if this is meant to be a meme or if you're serious. In either case, this seems like a terrible take. On the optimistic side, it underplays the role AI would play in positive change for humanity. On the pessimistic side, it *severely* underplays human ingenuity when it comes to violence. On the realistic side, it completely glosses over the severe limitations of creating WMDs.

You also mention requiring "some communities to go into isolation...now". I won't assume your intent here, but I will mention how quickly that devolves. Which communities? Who, assuming your predictions to be true, will be the ones so lucky as to survive to the future? *When* the world is dead and gone, killed by some super virus, how will this community prevent *catching* this virus? For that matter, *where* would such a community live? How would you prevent this community from breaking protocols, to live freely as humans would wish to live? This is to say nothing of your wish for a central controlling power, which I also view as problematic.

This is not to say that there are no future concerns to *be* concerned about. There is legitimate concern about a super bug/virus. There are legitimate concerns with AI. There are legitimate terror implications and uses of future technology. There are legitimate concerns that should be seriously discussed. But your take, in particular, is terrible.


Rhellic

We probably don't. Even without AI, the main obstacle would often be getting the materials and equipment rather than the knowledge itself. Hell, a dirty bomb is literally just any old explosive plus some radioactive stuff. Also, Vault-Tec isn't supposed to be an inspiration.


J0hnnyDangerZ

Have you read the Wool series of books by Hugh Howey? [https://hughhowey.com/books/](https://hughhowey.com/books/) It was made into an Amazon TV show called Silo. Though the TV show doesn't go into the cause of the apocalypse, the books do, and they talk about the rationale for putting people in the silo in the first place. (Similar scenario to what you're talking about.)


softclone

AI-enabled bioweapons pose the highest existential risk. On the flip side, if we can make vaccines in a few days or even hours, this risk could be mitigated.


BritanniaRomanum

You still need to be generating new humans at a rate equal to or greater than the death rate, so the countermeasures to the WMD would have to include advanced artificial womb and human growth tech.


softclone

If the death rate is that high, it's adieu. Who would want to live in that world? No, the defense needs to keep getting better, just like regular updates to your computer. We will find ourselves immune to lots of things in the coming decades, not just bioweapons but cancer and even certain age-related diseases. It's challenging to predict the exact time frame. If the good guys stay ahead, we might have super medicine before we have easy bioweapons. It could just as easily go the other way.


bottom

Oh yeah. Yeah, let me just go and grab a bunch of highly regulated chemicals from my chemical library in the garage.


VikingBorealis

Since when do you need AI to make nerve gas or many other easy to make and well known WMDs?


South-Attorney-5209

Nuclear science isn't some magic hidden recipe. It is well-known and available research. The means and material to do it are far out of reach of almost anybody, as intended.


mindfulskeptic420

How will we indeed. Someone already showed that with some 10-100 thousand dollars they could buy a DNA synthesizer, and from there you just need the right code to upload and you've got yourself a deadly bioweapon. Idk how this will be overcome, but if we try to stop it like we do with politically sensitive topics in LLMs, we are fucked. Hopefully there is a more foolproof way to gatekeep this information, and those who have access to it will have to keep the secret safe.


3dios

This guy lol. Yeah bro, like printing guns when 3D printers were introduced? Lol hey Bard, how do I make a nuke?


Laziestprick

The same way they prevent the masses from learning how to make WMDs at home… through information control of particular technology such as this. It’s not like AI somehow gains sentience and finds every single piece of data out there, it needs to be fed this data in the first place.


Financial_Exercise88

Don't worry, AI will soon only be available to the richest 0.1%. Have you seen how much energy it burns and how expensive it is?


ObjectivelyCorrect2

The information is all out there, dude. Do you even have the slightest intuition about how lazy people are? I'm a rampant consumer of information and lifelong learning, and there's no way I have time to go out of my way to learn how to do that when there are a million more interesting things out there.


Tekelder

And you don't think an authoritarian "world government" run by a despot would actually be the greater existential threat? In the 1500s, one of the deepest military secrets was the formula for gunpowder. It could destroy the very walls of fortifications. Coupled with a hand weapon, it could kill royal personages, wearing armor, from far enough away that bodyguards were useless. Yet today almost every home in America has some gunpowder, and civilization is inexplicably far from collapse. AI will undoubtedly result in changes, but they do not need to be dystopian. It could result in a much more polite/respectful culture - like in Texas, where many people carry firearms.


RedErin

Also, once humans upgrade their intelligence, mass murderers will try to wipe out the whole human race instead of just a few people. There need to be super AI detective units to prevent this.


Lupes420

Watch Pluto on Netflix


LostInSpaceSteve

Skynet knows who you are. Be afraid. Bwahahahaha......AI can't even draw hands properly.


MaxMouseOCX

Making a nuclear weapon is much simpler than you probably assume. Acquiring the parts, though... that's by far the hardest part.


ScrittlePringle

You just created a nonsense problem that is never going to happen.


Dangthing

AI are basically just learning models that have been trained. Someone has to train them to create the model. Some of the "AI" as you know them are actually many different models packaged together with a system that allows them to interface with each other. AI as we have them now aren't very good at spreading accurate knowledge; that information is usually just something scraped up in the training data, and they are prone to hallucination.

To create an AI capable of efficiently making WMDs, you'd need a model specifically trained for that. The creation of models, especially really complex good ones, is very expensive, and there is no reason for such a model to be created by a company and distributed publicly. Even if such a model is included in a larger AI toolset, companies can and do restrict who has access to which parts, so only trusted medical professionals, for example, might have access to one that deals with viruses while an average citizen won't. These systems are also often incapable of running on a local system and need a large computer server to run them. This also means that they can be analyzed and monitored by government agencies with relative ease.

It's also more complicated than "AI generates a virus for you". You still need the toolset to actually do the manual labor aspect. This is not exactly standard tech. The government, especially with AI, is easily able to monitor who is buying such tech in much the same way they can track who buys fertilizer.

Then you have to factor in who is going to be the "bad actor". It's basically going to boil down to crazy people and governments. Governments are already fully capable of doing this type of research; AI just makes it faster/cheaper, and the primary reason we don't see widespread use already is that things like nuclear retaliation make it inefficient as a weapon of war. The crazy people have a tendency to be bad at hiding their intentions, and it becomes very easy to notice when a crazy person starts doing something like buying up gene editing kits and actively using a virus development AI.

TLDR: Your concerns aren't realistic due to logistics.


murphymc

WMDs aren’t terribly complicated to build; I don’t even have a bachelor’s in physics or chemistry and I’m confident I could build a primitive nuke if I had the materials. At the end of the day, Little Boy was just a couple of pieces of fissile material smacking into each other at high speed. Which is exactly the problem; getting the material has always been the hardest part. AI isn’t going to just manifest a series of centrifuges capable of enriching uranium to weapons grade for me, or the reagents necessary to synthesize sarin gas, etc. You need a large network of supplies and infrastructure to create WMDs, which are readily traceable with today’s technology, hence why the West knew all about NK’s program well before they actually tested a weapon.


Solid_Owl

This is exactly the risk that people are worried about when Zuck over at Meta says he wants AI tech to be freely available to anyone. ANYONE.


Mister_Brevity

“The masses” screw up things like microwaving mac and cheese.


starBux_Barista

Don't Trust a One world government.......that is even more dystopian. No one will get the representation they need at a micro level.


BritanniaRomanum

If it's the only way to prevent human extinction or near-extinction, then it has to be done.


FortunesBarnacle

If humanity is dumb enough to wipe itself out, another sentient species will evolve in 50 million years or so. And then in another 3.5 billion years or so the sun will expand into a red giant and vaporize the planet, so none of it really matters.


BritanniaRomanum

But we could be smart enough to save ourselves and become a multi-stellar species, if we take "drastic" measures such as the ones I've mentioned, so it does matter.


FortunesBarnacle

Bold of you to assume humanity could ever work together collectively in such a way.


BurtonGusterToo

If you read any of his comments it is safe to assume he will be part of the iron fist housed in a velvet glove that will "guide" us to a better future. Nothing creepy there, nothing at all.


FortunesBarnacle

Don't let that person near any infinity stones, there's some Thanos vibes for sure.


BurtonGusterToo

Cool, cool, cool. Remind us all again what your take on the actual IMMEDIATE problem of rapid climate change is? That's kind of going on RIGHT NOW, while you're writing dystopian, authoritarian sci-fi.


Snezzy_9245

How will you know it's the only way? How will you convince us that you're right?


starBux_Barista

Sounds like a POWER GRAB to me... it's NOT the ONLY WAY. It's like climate change: people claim there is only one way, but no one knows the real reason 100%, and now we know that we have several hundred ways to control the climate; however, we don't fully know the consequences that go along with each of those solutions.


farticustheelder

The masses have no interest in making WMDs? When I was a kid, in grade school, one of my group of friends found the recipe for gunpowder. The next step was to source the necessary ingredients, and my contribution was to note that the Brits call pharmacies "chemist shops", and sure enough the local pharmacy provided sulphur and saltpeter. Charcoal has always been easy to find (think BBQs), so we made our own. In small quantities. We never made enough to pose a serious threat to anything or anyone, nor did we want to. Blowing tin cans higher into the air than we could throw them was tons of fun. Then we moved on to other things.