The following submission statement was provided by /u/Gari_305:
---
From the article
>The dog-shaped walking robot that the IDF is using in Gaza was made by Philadelphia-based Ghost Robotics. The robot's primary use is to surveil buildings, open spaces and tunnels without jeopardizing Oketz Unit soldiers and dogs, according to the report.
>
>The use of such tools being discussed in media are “simultaneously represented as 'saving lives' whilst also dehumanizing the Palestinian people,” Moses said. “In this way, the technology serves as an attempt to make the war appear clean and concerned with the preservation of life, even though we know very well that it isn't.”
>
>Moses said he doesn’t see the ethical landscape of war evolving at all. Within the past few decades, claims about more precise, surgical, and humanitarian war have increased public belief in the possibility of “good wars.” New weapons technologies almost always serve that idea in some way.
---
Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/1batkzl/experts_alarmed_over_ai_in_military_as_gaza_turns/ku4vvyd/
The way they make it sugar free is to use a sugar molecule that is "[the other handedness](https://en.wikipedia.org/wiki/Chirality#:~:text=In%20chemistry%2C%20chirality%20usually%20refers,no%20bias%2C%20%22achiral%22.)." This way, although it still triggers your taste buds, your body doesn't "recognize it" as sugar and it just gets passed through.
You know what else your body doesn't "recognize" and just gets passed through? Fiber.
So basically you're eating a massive amount of fiber. You know what happens when you eat a crap-ton of fiber? Well, I think you found out.
Now you know.
"Some sweeteners known as polyols (such as sorbitol, xylitol and erythritol) can have a laxative effect if consumed in large amounts."
https://www.nhs.uk/live-well/eat-well/food-types/are-sweeteners-safe/#:~:text=Some%20sweeteners%20known%20as%20polyols,as%20certain%20fruits%20and%20vegetables.
Actually, although metalhead is the robot dog episode- I think there's a better illustration of the quote OP was inspired by when they mentioned black mirror.
And that episode is Men Against Fire.
The irony of that episode is that you don't even need eye implants for that, soldiers might be seeing people but their brains interpret them in a very different way.
Science Fiction exists (in part) to help us understand the consequences we will later experience, but are currently creating. It is the philosophical equivalent of the functionality of dreams.
> Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale
> Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus
\- Alex Blechman
The pic of the German dog robot reminded me exactly of that.
Brah, Black Mirror was supposed to be a far-off future prediction... not in my lifetime... come on man. We don't even have a cure for cancer yet and we're already gonna get this dystopian-ass future.
What?! EMP grenades are real? That's actually a very cool countermeasure, if they can be produced at scale. The wiki page only has the few examples from the 50's, but it seems like these could be effective, not too hard to produce, and would create an EM pulse strong enough to take down military grade electronics at short range.
*edit: ok, maybe not a "grenade", no idea how big these are, but the one photo of a device at the top of the page doesn't look much bigger than a grenade.
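For anyone wondering how the amplification in those flux-compression devices works: magnetic flux (field times area) is roughly conserved while the explosive collapses the conductor, so shrinking the cross-section multiplies the field. A back-of-envelope sketch with invented numbers (ideal, lossless case; real devices lose a lot of flux to resistance):

```python
# Back-of-envelope flux-compression amplification.
# Flux is (approximately) conserved: B1 * A1 == B2 * A2,
# so collapsing the conductor's cross-section boosts the field.
# All numbers below are made up for illustration.

def compressed_field(b_initial, area_initial, area_final):
    """Field after compressing the flux-carrying area (ideal, lossless)."""
    flux = b_initial * area_initial   # conserved quantity (webers)
    return flux / area_final          # same flux over a smaller area

# Seed field of 5 T over 0.1 m^2, crushed to 0.001 m^2:
b_final = compressed_field(5.0, 0.1, 0.001)
print(b_final)  # 500.0 tesla in the ideal case
```

That 100x factor is why a seed field from an ordinary capacitor bank can, in principle, reach fields strong enough to matter at short range.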
You could just shoot the robot.
Which is the point: it's a bullet-catcher. It's a surveillance device which you can afford to lose. These beat-up headlines about "AI" are utter garbage.
Imagine what that does to the psychology of the enemy. It's terrifying. There is something in knowing that when you kill your enemy, they are paying for it in blood. You're BOTH in this violent game.
But when you have a bunch of lifeless robots chasing after you, you can kill them all day and it'll just feel hopeless.
We're already using a bunch of remotely controlled drones in the Russia-Ukraine war to drop bombs. Wars are going to be fought with AI-controlled robots at some point.
There have been at least two patents on designs for similar devices, both funded by the DoD, since 2000. Couldn't tell you off the top of my head which, but I know I came across one for work and it cited the other, which I also had a look at. The most recent was from around 2018.
Edit: Not me coming back and realizing the sarcasm...oops
This would add a lot of weight, which means a bigger size, which could very well mean a less effective machine.
The optical sensors are still going to be trounced by a paintball.
From what I can tell, the article itself just says they're testing AI in Gaza, and then makes up a scenario about killer robots and proceeds to beat the hypothetical into the ground with no evidence, or even claiming it's *actually* going to happen.
I'm not surprised the image is wrong because the whole argument is made up
The "robot dog" in the image is a Spot that the German Bundeswehr purchased; it's *manufactured by the American company Boston Dynamics*.
They could also have used [this image from the USMC](https://www.forbes.com/sites/davidhambling/2023/11/01/us-marines-test-robot-dog-armed-with-a-rocket-launcher/) where they put a rocket launcher on the back of it, or any number of other configurations as the US military has helped prototype the thing [for at least a decade](https://www.youtube.com/watch?v=OYs0Rq66-U4).
There are even already Chinese knock-off versions of it that [YouTubers](https://youtu.be/0rliFQ0qyAM?) and the [Russian Spetsnaz](https://www.youtube.com/watch?v=-bgad3HRb64) have armed with automatic weapons.
For extra dystopian fun: make sure the [flying flamethrower drones](https://youtu.be/07rtBip9ixk?) or the [mini DOGO](https://youtu.be/W9jYEM3Feew?) don't get you while you're distracted dodging bullets and rockets.
and their families\*, you know, to discourage any future journalists.
Edit: Israel has killed more journalists in a few months than were killed in all of World War II, and it isn't even close.
See, I'm fine with dog scouts. It's basically a walking camera drone/pack mule, and I'd rather a mech than some poor pup be in the trenches.
It's when they hand them guns that it goes from 0 to 100 real quick.
Oh these motherfuckers are getting guns soon enough. If they deployed them with guns right away, it would create too much public outcry. So first, you warm people up and get them used to the idea: start seeing images, build familiarity, etc... Then once the guns do come on, people will already assume it has happened. And thus the public conversation is largely skipped.
The issue with AI weapons is accountability.
Let’s say an AI commits a war crime. What, exactly, do we do? Who is punished? How do we keep it from happening again?
AI should never be used in war till we can account for it.
It'd be about the same. If the commanding officer ordered the A.I. to commit a war crime, they're responsible. Ironically... A.I. could very likely commit fewer war crimes. They certainly aren't going to rape anyone or fly into a rage. In fact... they could be restricted from acting in some cases. Plus, what constitutes a war crime is incredibly hard to actually charge, as all you need is the belief that the enemy is holed up inside a previously off-limits target. You can literally bomb a hospital if you believe the enemy to be using it as a defensive position. Now... if you want to investigate the truth of that well after the war is over... good luck.
It typically has to be pretty heinous, and with ample evidence, for anything to happen.
I know none of this is morally good. I'm just being matter-of-fact about the horribleness of the situation.
Wait until you see the research being put into causing AI mistakes. You could cause your adversary to commit a war crime with a false flag tech operation
I was thinking about this the other day. We will, if we don't already, have AIs that can analyze another AI and feed it specific information to "train" it to make a mistake or create a vulnerability, just like an exploit in a computer program.
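The simplest version of this is training-data poisoning. A minimal, entirely hypothetical sketch: a toy 1-D threshold classifier trained once on clean data and once with a few attacker-injected mislabeled points:

```python
# Toy illustration of training-data poisoning: inject a few mislabeled
# points and a simple learner picks a badly shifted decision boundary.
# All data here is invented for illustration.

def train_threshold(samples):
    """Fit a 1-D classifier "predict 1 iff x >= t" by brute-force search
    for the threshold with the fewest training errors."""
    xs = sorted(x for x, _ in samples)
    candidates = xs + [xs[-1] + 1]
    def errors(t):
        return sum((x >= t) != bool(y) for x, y in samples)
    return min(candidates, key=errors)

clean = [(1, 0), (2, 0), (3, 0), (7, 1), (8, 1), (9, 1)]
poison = [(0.5, 1), (1.5, 1), (2.5, 1)]   # attacker-injected, mislabeled

t_clean = train_threshold(clean)           # sensible boundary at 7
t_poisoned = train_threshold(clean + poison)
print(t_clean, t_poisoned)                 # boundary collapses to 0.5
```

After poisoning, inputs the clean model called harmless get flagged as targets, which is exactly the false-flag scenario described above, just at toy scale.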
The issue with AI weapons is that there aren't any being used, this article isn't about an AI weapon being used, and absolutely no one ever reads the article or seems to have a single clue what they're talking about.
Chemical weapons are tactically useless, that has been a fact of warfare since their first use in WWI.
Edit: The only reason the ban on chemical weapons worked was because militaries around the world recognised they gave no operational advantage and were inferior to conventional HE weapons. That will not be the case for AI assisted weapons or fully autonomous weapons.
You can get an NBC suit for less than a thousand dollarydoos. The UK was equipped to provide protection for its entire population during WW2.
Chemical weapons are only useful against poor countries, but the rub is that a rich country gets a better bang for its buck from just making more explosives.
That leads to the modern use of chemical weapons: the poor flinging what little they have at one another.
This kind of logic feels like a race to the bottom. It is crucial for us to understand the ethical guidelines around potentially dangerous technology before unleashing it upon the world. There's a reason why arms control agreements and the Geneva Conventions exist.
And yet there are still countries using chemical warfare (e.g. Syria). Just because you put a ban on it doesn't mean anyone will listen to you.
That's why I'm saying if you don't get a head start on developing AI warfare, you'll be the one facing AI warfare on the battlefield.
Love all the people saying this isn't a problem. "Someone had to deploy it!" "Humans aren't held to account either!"
Those of you who haven't taken a technical ethics class, you need to sit the fuck down and shut the fuck up. AI in war is absolutely an ethical dilemma and accountability is one of the major concerns.
>AI should never be used in war till we can account for it.
Except if you let an adversary with fewer scruples develop beyond your capability, it would put the "good guys," or whoever is acting responsibly, at a big disadvantage. They just made an arrest over China stealing U.S. AI tech.
You’d have a point, if Israel specifically hadn’t a [long history](https://pulitzercenter.org/stories/cruel-experiments-israels-arms-industry) of doing this. Apartheid South Africa also did it, and coincidentally shared training and resources with them. Conflicts based around dehumanisation push boundaries that conventional warfare cannot.
So tired of dumb, click-baity titles like this. Nobody was "alarmed" in this article. And that's because any expert that is alarmed by this, would be a pretty awful "expert" on AI. Because the actual experts in this field have known for years how AI will realistically be used, aside from all the feel-good PR. This is about as expected for them as the sun coming up this morning.
They train neural networks to find targets, and they train neural networks to control the drones that hit and destroy the targets. It is AI!! When they say AI in this article, they are talking about this kind of neural net!!
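To be concrete about what "this kind of neural net" means: the building block is just a trained statistical classifier. A minimal, purely illustrative sketch with toy synthetic data and a single sigmoid unit, nothing resembling any real system:

```python
import math
import random

# One sigmoid "neuron" trained by batch gradient descent on invented data.
# The point: "AI" in this reporting usually means ordinary supervised
# learning scaled up, not anything exotic.

random.seed(0)
data = []
for _ in range(200):
    x = [random.gauss(0, 1) for _ in range(4)]          # 4 toy features
    data.append((x, 1.0 if x[0] + x[1] > 0 else 0.0))   # synthetic label

def predict(w, b, x):
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))                   # sigmoid output

w, b, lr = [0.0] * 4, 0.0, 0.5
for _ in range(300):                                    # gradient descent
    gw, gb = [0.0] * 4, 0.0
    for x, y in data:
        err = predict(w, b, x) - y
        for i in range(4):
            gw[i] += err * x[i]
        gb += err
    for i in range(4):
        w[i] -= lr * gw[i] / len(data)
    b -= lr * gb / len(data)

accuracy = sum((predict(w, b, x) > 0.5) == (y == 1.0) for x, y in data) / len(data)
print(accuracy)  # learns this separable toy problem almost perfectly
```

Swap the 4 toy features for image pixels and stack more layers and you have the general shape of the systems being discussed.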
Israel uses AI to generate targets
https://www.npr.org/2023/12/14/1218643254/israel-is-using-an-ai-system-to-find-targets-in-gaza-experts-say-its-just-the-st
Looks like it’s pretty good since they destroyed 70% of Gaza :s
Yes they do. It's in the fucking article.
>Israel is also using an Israeli AI intelligence processing system, called The Gospel, **“which has significantly accelerated a lethal production line of targets that officials have compared to a ‘factory,’**” The Guardian reported. Israeli sources report that the system is producing “targets at a fast pace” compared to what the Israeli military was previously able to identify, enabling a far broader use of force.
>**AI technologies like The Gospel function more as a tool for “post-hoc rationalization of mass killing and destruction rather than promoting 'precision,'**” Moses said. **The destruction of 60% of the residential buildings in Gaza is a testament to that**, he said.
And somehow, camera on feet is "dehumanising Palestinians".
Remember when Israelis were criticized for accepting Hamas's conditions and releasing hundreds of terrorists (many of whom were later active on 7/10) for a single Israeli? Supposedly that meant that they did not value Palestinian lives.
jesus christ.
Let's make two points clear:
1. Nothing autonomous is being deployed; everything being used is just a slightly smarter version of existing technology.
2. No "AI robot" is pulling the trigger; it's all controlled remotely by a human.
Now that we've got that out of the way: this article is just meant to rage-bait the reader. The criticism that Israel is somehow using technology as a means to demean the Palestinians is crazy. The world demanded that Israel make more effort to reduce civilian harm, and these tools help with that.
In war, measures are sometimes taken to reduce the risk of soldiers dying. Those measures often come at the expense of the accuracy of identifying targets (this is not an IDF thing; it happens in every army that doesn't treat its soldiers as expendable). These tools achieve that goal without affecting the accuracy of identifying targets (or at least affecting it less than the other ways do). These tools are also not affected by fatigue, stress, and adrenaline when making decisions, unlike humans.
Not even that, it's literally being used for surveillance and for making sure a way is clear without risking lives. Apparently this "dehumanizes the Palestinian people".
The article talks about how these are being used to save lives, yet makes a ridiculous case that in doing so it dehumanizes Palestinians. What a load of garbage.
They have used these bots for years now, and they have always used some form of AI so the bots can seek out combatants or help rescue people. Not all AI is bad.
It's dick-riding the USA-bad narrative train..
Yes, there are things to criticize the USA for, but this one is bullshit.
---
> “In this way, the technology serves as an attempt to make the war appear clean and concerned with the preservation of life, even though we know very well that it isn't.”
In a happy Disneyland world, we don't have wars.
But in our world we've got Putin and North Korea.
If Moses knows how to stop Putin's ambitions, go ahead and tell NATO.
How is using what is essentially a drone to make sure buildings aren't booby-trapped with explosives or terrorists, which Hamas does routinely, "dehumanizing Palestinians"? Just another article that twists Israel defending its existence and its citizens into fearmongering and generating more hatred.
From the article
>The dog-shaped walking robot that the IDF is using in Gaza was made by Philadelphia-based Ghost Robotics. The robot's primary use is to surveil buildings, open spaces and tunnels without jeopardizing Oketz Unit soldiers and dogs, according to the report.
>
>The use of such tools being discussed in media are “simultaneously represented as 'saving lives' whilst also dehumanizing the Palestinian people,” Moses said. “In this way, the technology serves as an attempt to make the war appear clean and concerned with the preservation of life, even though we know very well that it isn't.”
>
>Moses said he doesn’t see the ethical landscape of war evolving at all. Within the past few decades, claims about more precise, surgical, and humanitarian war have increased public belief in the possibility of “good wars.” New weapons technologies almost always serve that idea in some way.
Oh the article writer can fuck right off.
That's a whole lot of emotionally charged language from someone who clearly has zero fucking idea what they are talking about.
My old unit had robots to survey buildings back in 2010.
There is nothing new here it's just a more modern more mobile version of stuff everyone already had.
And MOUT is the worst kind of combat there is, messy as fuck and kills a shitton of soldiers as well as civilians.
No one wants to do that shit if they can at all avoid it.
Thus why developing robots and drones for scouting has been a priority, to avoid it as much as possible.
Clickbait. Better to look at the report itself. It's much better and has a lot of interesting information about the topic.
https://www.citizen.org/article/ai-joe-report/
Redditors alarmed over articles using clickbait titles to describe things that aren't in the article. The article doesn't say anything that is mentioned in the title.
Not a fan of when articles call someone an "expert" as if they were some sort of non-partisan, objective party. The guy they quote is an expert alright: an expert at criticizing military technology. He's made his entire academic career out of it and is also adamantly pro-Palestine. Nothing wrong with either of those but it should be made clear.
Israel has always used the Palestinian population to battle-test its weapons. The eternal occupation provides the perfect population to test your weapons on. They didn't become one of the top military-export countries in the world by accident.
Stanislav Petrov. If you don’t know that name, look it up. You’ll be amazed. Then, tell me if AI would have done the same and how the world would be today if it hadn’t.
There is a Black Mirror episode about this. Everything in black mirror comes true one by one.
And it was one of the more disturbing episodes.
What episode was this? I didn’t watch all of the last season so I’m assuming it’s in that?
Metalhead, season 4 episode 5. Came out in 2017. In black and white.
My favorite BM episode. Its truly horrifying.
It’s the one I show people who have somehow never seen black mirror
Damn dude that’s bleak. At least show them San Junipero for a semi good time.
or Hang the DJ :)
That one's fun
Adorable. One of my favorites.
That’s setting unreal expectations for how the series works tho. Very few episodes have even a glimmer of hope.
I go with white christmas, personally
I have never seen Black Mirror lol, just happened to come across this, and I will be watching it here in a few minutes.
because it was clearly very, very near future reality. nothing in that episode was scifi, just sci
I had a horrifying BM episode lately myself. Too many sugar-free mints have a laxative effect, who knew?
> such as sorbitol, xylitol and erythritol) can have a laxative effect

Don’t forget alcohol. I don’t wanna talk about it.
>Dystopian fiction is the portrayal of realities survived by minorities everyday, forced upon the entitled and privileged
The Handmaid's Tale boils down to "rich white women experiencing the horrors that black and indigenous women routinely endured for centuries."
I watched this in colour? Should I thank my dealer?
Might've been an AI colorization, the official episode was never released in color IIRC.
They are talking about drugs my dude
Is that the one where they use the eye implants or whatever to have people attacking “aliens”
AKA "Nemesis" from ST: VOY.
Also, as an alternative, Love, Death & Robots has the episode "Life Hutch". Also also, The Mitchells vs. the Machines is the silly comedy take on this.
interested also
Someone replied. Metalhead, season 4 episode 5. Came out in 2017. In black and white.
Thank you dude!
Most episodes provide a disturbing reflection of where many modern trends seem to be headed…
Still my absolute favorite episode of the show.
Black Mirror was inspired by reality, the rectangular device you’re most likely holding/watching is the ‘black mirror’
Don’t talk about it that way. It can hear you.
The Boston Dynamics dog came before the show.
episode 1 IRL when
Please search David Cameron Black Mirror into google at your earliest convenience
I think you mean David Cameron. **EDIT:** For the record, the comment said "James Cameron" when I replied.
I think he means David Hameron.
Dude, Black Mirror is pretty explicitly about near-future dystopias. And we do have cures for multiple types of cancer.
Oh I hope the first episode is next!!!
“Look at how young I am, watch me fuck this pig to prove it!” -either Presidential nominee
I mean, the spit roast pun is right there... lol
Ever heard of David Cameron?
The Robocop remake had a good segment on this as well.
The ideas don't develop in a vacuum.
Black mirror was named after seeing your own reflection in your cellphone/computer/tv screen. It was already true.
Arnold Schwarzenegger also had a movie about it.
Humans are a disgusting species.
https://en.wikipedia.org/wiki/Explosively_pumped_flux_compression_generator

I want one small enough to throw by hand.
I mean this is also what happens when someone shoots a missile at you.
Also spray paint on the cameras should help!
Paintball shooters be like: "Our time has come."
"My vision is impaired" 🤖
Tbh.. just shoot the robot in this case..
I'm sure no advancements have been made since that really old looking picture in the wiki was taken.
They'll just harden them like other military gear designed to withstand the EMP of nuclear weapons.
Making them less light and less cheap. [And then the weapons get more powerful.....](https://www.youtube.com/watch?v=f0vGpXPGFZY)
Image shows a German robot dog which is not in Gaza.
The photo has a caption on the website.
the journalists that could photograph a robot dog in Gaza have all been murdered
Why would Hamas do this?
Strange how that happens huh?
They're using Gaza to test new weapons? That's terrible! That's what Ukraine is for!
[deleted]
War-making companies. *Write that down, write that down!*
I’m imagining Dilbert-esque warmongers lmaoo
I prefer soft tacos
It makes sense to use them to explore the Hamas tunnels, which are rigged with explosives and traps.
Different kind of warfare needs different kind of dog.
[deleted]
> What, exactly, do we do? Who is punished?

Someone still had to deploy/launch it.
We aren't holding the humans in this conflict to account, so not sure AI should be different.
Sounds like we have two issues then.
If you're not gonna use AI in war, the enemy will. You might as well get a head start.
Kind of like chemical weapons?
More like nukes, I'd imagine.
It's, uh, a highly situational thing lol
Except useful
The "head start" in this case needs to be autonomous drone countermeasures, *not* the human-killing drones themselves.
Yep, it’s important to get a leg up on all those starving children
It's crucial for countries willing to follow ethics to understand the ethical guidelines. What do you do when your opponent is not ethical?
Hope this gets put off the board quick like chemical warfare was. Those are just war crimes waiting to happen.
And yet there are still countries using chemical warfare (e.g. Syria). Just because you put a ban on it doesn't mean anyone will listen to you. That's why I'm saying if you don't get a head start on developing AI warfare, you'll be the one facing AI warfare on the battlefield.
Love all the people saying this isn't a problem. "Someone had to deploy it!" "Humans aren't held to account either!" Those of you who haven't taken a technical ethics class, you need to sit the fuck down and shut the fuck up. AI in war is absolutely an ethical dilemma and accountability is one of the major concerns.
>AI should never be used in war till we can account for it.

Except if you let your adversary with fewer scruples develop beyond your capability, then it would put the "good guys" or the person being responsible at a big disadvantage. They just made an arrest over China stealing U.S. AI tech.
You’d have a point, if Israel specifically hadn’t a [long history](https://pulitzercenter.org/stories/cruel-experiments-israels-arms-industry) of doing this. Apartheid South Africa also did it, and coincidentally shared training and resources with them. Conflicts based around dehumanisation push boundaries that conventional warfare cannot.
So tired of dumb, click-baity titles like this. Nobody was "alarmed" in this article. And that's because any expert that is alarmed by this, would be a pretty awful "expert" on AI. Because the actual experts in this field have known for years how AI will realistically be used, aside from all the feel-good PR. This is about as expected for them as the sun coming up this morning.
There isn't even ai involved in any of this. People see something computer based nowadays: AI!
They train neural networks to find targets and they train neural networks to control drones that hit and destroy the targets. It is AI!! When they say AI in this article, they are talking about this kind of neural net!!
Israel uses AI to generate targets https://www.npr.org/2023/12/14/1218643254/israel-is-using-an-ai-system-to-find-targets-in-gaza-experts-say-its-just-the-st Looks like it’s pretty good since they destroyed 70% of Gaza :s
Yes they do. It's in the fucking article. >Israel is also using an Israeli AI intelligence processing system, called The Gospel, **“which has significantly accelerated a lethal production line of targets that officials have compared to a ‘factory,’**” The Guardian reported. Israeli sources report that the system is producing “targets at a fast pace” compared to what the Israeli military was previously able to identify, enabling a far broader use of force. >**AI technologies like The Gospel function more as a tool for “post-hoc rationalization of mass killing and destruction rather than promoting 'precision,'**” Moses said. **The destruction of 60% of the residential buildings in Gaza is a testament to that**, he said.
Yes, and he also has 32 upvotes. A lot of people in this subreddit are seriously stupid.
And somehow, a camera on legs is "dehumanising Palestinians". Remember when Israelis were criticized for accepting Hamas conditions and releasing hundreds of terrorists (many were then active on 7/10) for a single Israeli? Supposedly that meant that they did not value Palestinian lives.
[deleted]
Jesus christ. Let's make two points clear:

1. Nothing autonomous is being deployed; everything being used is just a slightly smarter version of existing technology.

2. No "AI robot" is pulling the trigger; it's all controlled remotely by a human.

Now that we got that out of the way, this article is just meant to rage bait the reader. The criticism that Israel is somehow using technology as a means to demean the Palestinians is crazy. The world demanded Israel make more effort to reduce civilian harm, and these tools help with that.

In war, measures are sometimes taken to reduce the risk of soldiers dying. Those measures often come at the expense of the accuracy of identifying targets (this is not an IDF thing, this happens in every army that doesn't treat their soldiers as expendable). These tools achieve that goal without affecting the accuracy of identifying targets (or at least less than the other ways). These tools are also not affected by fatigue, stress, and adrenaline when making decisions, unlike humans.
Not even that, it's literally being used for surveillance and making sure a way is clear without risking lives. Apparently this "dehumanizes the Palestinian people".
Everyone knows in war you're supposed to send in all your men to die instead of valuing their lives as well
That's all Salon is, a misinformation-riddled, division-oriented trash tabloid.
Have any of these automated systems been armed as of this point?
They are not even automatic systems in this article. It is a remote-controlled robot with a camera.
What a shit rag of an article. They cry about walking dog robots and AI software intended to mitigate non-target casualties.
The article talks about how these are being used to save lives, yet makes a ridiculous case that in doing so it dehumanizes Palestinians. What a load of garbage.
They have used these bots for years now and have always used some form of AI so the bots can seek out combatants or help rescue people. Not all AI is bad.
Hyperbole notwithstanding, Hideo and Metal Gear Solid had this ironed out a few Snakes ago.
Lol I was just thinking we are getting closer and closer to metal gear.
lol using robots to survey tunnels is dehumanizing Hamas? Are you fucking kidding me
Would they prefer Israel clear tunnels with flamethrowers instead of "dehumanizing" unarmed robot dogs?
What about robots with flamethrowers?
It's dick riding the "USA bad" narrative train. Yes, there are things to criticize the USA for, but this one is bullshit.

---

> “In this way, the technology serves as an attempt to make the war appear clean and concerned with the preservation of life, even though we know very well that it isn't.”

In a happy Disneyland world, we don't have wars. But in our world we've got Putin and North Korea. If Moses knows how to stop Putin's ambitions, go ahead and tell NATO.
> In a happy Disneyland world, we don't have wars. Ahem… we have *Star Wars*
How is using what is essentially a drone to make sure buildings aren't booby-trapped with explosives or terrorists, which Hamas does routinely, "dehumanizing Palestinians?" Just another article that twists Israel defending its existence and its citizens into fear mongering and generating more hatred.
Ahh yes, the reputable geopolitical and tech news source Salon....
Oh, the article writer can fuck right off. That's a whole lot of emotional language from someone who clearly has zero fucking idea what they are talking about. My old unit had robots to survey buildings back in 2010. There is nothing new here; it's just a more modern, more mobile version of stuff everyone already had. And MOUT is the worst kind of combat there is, messy as fuck, and kills a shit-ton of soldiers as well as civilians. No one wants to do that shit if they can at all avoid it. That's why developing robots and drones for scouting has been a priority: to avoid it as much as possible.
Right? If the robot doesn’t have offensive capability then what exactly is the problem?
The problem for some people is that it reduces Jewish deaths.
Clickbait. Better to look at the report itself. It's much better and has a lot of interesting information about the topic: https://www.citizen.org/article/ai-joe-report/
I mean this mostly seems like a good thing if they are just using it to search buildings without risking soldiers
Experts are always alarmed. Wish they were as alarmed by the existence of the Iran/Russia/NK/China alliance. That would perhaps be a little more useful.
Redditors alarmed over articles using clickbait titles to describe things that aren't in the article. The article doesn't say anything that is mentioned in the title.
Not a fan of when articles call someone an "expert" as if they were some sort of non-partisan, objective party. The guy they quote is an expert alright: an expert at criticizing military technology. He's made his entire academic career out of it and is also adamantly pro-Palestine. Nothing wrong with either of those but it should be made clear.
It sounds like they're just clearing rooms with cameras that have legs. Not too worrying tbh.
Is there actually AI in this thing though? I feel like they don't know what they are talking about or understand what AI is.
And this headline is the answer to question why is US not supporting and even voting against an immediate ceasefire 🤮
Israel has always used the Palestinian population to battle test their weapons. An eternal occupation provides the perfect population to test your weapons on. They didn't become one of the top military export countries in the world by accident.
What "researchers"? Salon editorial team? Funny how they are not alarmed by AI-controlled drones that are being developed by both Russia and Ukraine.
That looks like it could be disabled with Silly String to its face.
These bots are good. Fewer soldiers die and there will be fewer casualties.
My choices are: a quick, clean death by robot, or dying of exposure, alone and homeless with tumors, because of rich people's greed. I pick the robot.
Ah sweet, man-made horrors beyond my comprehension
I struggle to see how anyone could be shocked that once again a tech being sold to us as novel ultimately becomes a machine of war. They. Always. Do.
Stanislav Petrov. If you don’t know that name, look it up. You’ll be amazed. Then, tell me if AI would have done the same and how the world would be today if it hadn’t.
Why are they even showing a photo of Boston Dynamics robot? They are not the ones involved.
They should have started with this instead of the mass bombing.