BAE Systems Black Knight   (Source: Defense Update)
BAE Systems' Black Knight is a formidable weapon for the battlefield

In early October, DailyTech brought you the story of Foster-Miller's MAARS (Modular Advanced Armed Robotic System). The MAARS followed in the footsteps of previous battlefield robots like the REDOWL PackBot and the SUGV Early.

The 300-pound MAARS, on the other hand, brought serious firepower and technology to the table. The MAARS features an M240B Medium Machine Gun and uses GPS tracking to reduce the risk of friendly fire.

While the MAARS is an impressive piece of machinery, BAE Systems is taking battlefield robots to the next level with its Black Knight. The Black Knight is a semi-autonomous 9.5-ton tank based on the Bradley Fighting Vehicle.

The Black Knight can be controlled from the traditional commander's station or remotely via the Dismounted Control Device (DCD). Its onboard autonomy software also allows it to plan routes and avoid obstacles without operator intervention.
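To give a sense of what "plan routes and avoid obstacles" involves, here is a minimal, purely illustrative sketch of grid-based path planning using A* search. This is not BAE Systems' software; the occupancy grid, movement costs, and obstacle layout are all assumptions invented for the example.

# Illustrative only: a toy A* planner over a 2D occupancy grid.
# This is NOT the Black Knight's actual software; the grid and costs
# are invented to show the general idea behind autonomous route planning.
import heapq

def a_star(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None.
    grid: 2D list where 1 marks an obstacle and 0 is passable terrain."""
    rows, cols = len(grid), len(grid[0])

    def heuristic(cell):
        # Manhattan distance: admissible for 4-connected movement.
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(heuristic(start), 0, start)]
    came_from = {}
    best_cost = {start: 0}

    while open_heap:
        _, cost, current = heapq.heappop(open_heap)
        if current == goal:
            # Reconstruct the route by walking parents back to the start.
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            r, c = current[0] + dr, current[1] + dc
            if 0 <= r < rows and 0 <= c < cols and grid[r][c] == 0:
                new_cost = cost + 1
                if new_cost < best_cost.get((r, c), float("inf")):
                    best_cost[(r, c)] = new_cost
                    came_from[(r, c)] = current
                    heapq.heappush(open_heap,
                                   (new_cost + heuristic((r, c)), new_cost, (r, c)))
    return None  # no obstacle-free route exists

if __name__ == "__main__":
    terrain = [[0, 0, 0, 1],
               [1, 1, 0, 1],
               [0, 0, 0, 0]]
    print(a_star(terrain, (0, 0), (2, 3)))

A real vehicle would plan over continuous, sensor-derived terrain maps and replan as new obstacles appear, but the underlying search-for-a-least-cost-route idea is the same.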

When it comes to the Black Knight's armament, human intervention is required to fire rounds (thankfully). Considering that the Black Knight is armed with a 30mm gun and a coaxial machine gun, it's good to know that this tank won't be rolling around firing at anything that moves.

BAE Systems detailed how admirably the Black Knight performed during a demonstration in early 2007.

"While the Bradley Technology Demonstrator was engaging an enemy target from cover in a support by fire position, the Black Knight was able to autonomously move to a covered position and observe the target, using its sensor package to provide battle damage assessment data back to the Bradley," explained BAE Systems.

"If the enemy target needed to be re-engaged, the Black Knight could effectively neutralize the target, but the command to fire would always be made by a remote Soldier and only after the data necessary to make positive identification is received."

It may be years before such potent machinery is available for use on actual battlefields, however.


Comments



RE: Playing with fire
By masher2 (blog) on 11/8/2007 1:56:13 PM , Rating: 2
> " to turn that decision process over to a machine is unacceptable."

If you aim a rifle at someone and pull the trigger, have you "turned over" the decision of what to kill to a machine? If you press a bomb bay door button, knowing it means the death of anyone below your aircraft, have you "turned over" the decision to your bomber?

These autonomous machines are no different. The overall decisions are still being made by humans. They're just being implemented by circuits, not muscles.


RE: Playing with fire
By clovell on 11/8/2007 2:09:00 PM , Rating: 1
The machines are either autonomous or they're not; you can't really have it both ways.

I think what he's driving at is that there should always be someone directly in control of these things when they're being used to kill people.


RE: Playing with fire
By masher2 (blog) on 11/8/2007 2:14:30 PM , Rating: 3
What I'm saying is that there's absolutely zero difference between instructing a robot to kill all personnel inside a map grid square, and dropping a bomb on a building that you cannot see inside, or launching an artillery shell at a location miles away that you cannot directly see. In all cases, you may accidentally kill noncombatants in harm's way.

The ultimate decision is still a human one. In fact, in the case of the robot, which would presumably contain at least rudimentary discrimination ability, it would actually be the choice best able to carry out the "human" decision of killing only enemy combatants in a certain area.


RE: Playing with fire
By dustinfrazier on 11/8/2007 6:27:28 PM , Rating: 2
In those instances, the men pulling the trigger and dropping the bombs are still in harm's way. A free populace will usually try to bring an end to bloodshed when it's their relatives who might not come home. When there is (theoretically) no bloodshed, how ambivalent will the public become? I don't care about the autonomy argument; I care about the carefree argument. It is my opinion that world governments don't show enough restraint when it comes to sending real live troops into battle. How much less restraint will there be when they can suggest to their populace a zero body count for their side? How many more black ops would take place? Do you think Chechnya would still exist if Russia had the kind of automation we are discussing?

Someone mentioned earlier that you must also have boots on the ground. I believe that is absolutely true. Even assuming AI can be developed to the point that these machines will only harm combatants, what's next? The country's non-combatants, the majority of the population, would not act any differently than they do nowadays when another country invades. They would retaliate. This would turn the civilian population into combatants. So you can invade a country, kill the "enemy" and get out, but that doesn't get you much unless they were the first to attack you. You definitely can't hold or occupy anything unless you are okay with killing the civilians.

The next problem comes when the majority of the population carries a weapon and the combatants don't wear a uniform; they look just like everyone else. It is common for everyday Iraqis to carry an AK, especially nowadays. It is not uncommon to see men going about their normal business with an automatic weapon slung over their shoulder.

One last thing I would like to add is that no one wants to stop a war more than the real people who have to fight it. Knowing the real, human repercussions of past wars is what keeps people from wanting future wars. War should never be made easy.


"Nowadays, security guys break the Mac every single day. Every single day, they come out with a total exploit, your machine can be taken over totally. I dare anybody to do that once a month on the Windows machine." -- Bill Gates














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki