
Pentagon requests packs of human killer robots (2008)



revelarts
11-16-2010, 01:25 AM
For the good of mankind
and never to be used against anyone except ...
"uncooperative humans"

http://www.newscientist.com/blogs/shortsharpscience/2008/10/packs-of-robots-will-hunt-down.html


Packs of robots will hunt down uncooperative humans
18:00 22 October 2008
Technology
The latest request from the Pentagon jars the senses. At least, it did mine. They are looking for contractors to provide a "Multi-Robot Pursuit System" that will let packs of robots "search for and detect a non-cooperative human".

One thing that really bugs defence chiefs is having their troops diverted from other duties to control robots. So having a pack of them controlled by one person makes logistical sense. But I'm concerned about where this technology will end up.

Given that iRobot last year struck a deal with Taser International to mount stun weapons on its military robots, how long before we see packs of droids hunting down pesky demonstrators with paralysing weapons? Or could the packs even be lethally armed? I asked two experts on automated weapons what they thought - click the continue reading link to read what they said. Both were concerned that packs of robots would be entrusted with tasks - and weapons - they were not up to handling without making wrong decisions.

Steve Wright of Leeds Metropolitan University is an expert on police and military technologies, and last year correctly predicted this pack-hunting mode of operation would happen. "The giveaway here is the phrase 'a non-cooperative human subject'," he told me:

"What we have here are the beginnings of something designed to enable robots to hunt down humans like a pack of dogs. Once the software is perfected we can reasonably anticipate that they will become autonomous and become armed.

We can also expect such systems to be equipped with human detection and tracking devices including sensors which detect human breath and the radio waves associated with a human heart beat. These are technologies already developed."

Another commentator often in the news for his views on military robot autonomy is Noel Sharkey, an AI and robotics engineer at the University of Sheffield. He says he can understand why the military want such technology, but also worries it will be used irresponsibly.

"This is a clear step towards one of the main goals of the US Army's Future Combat Systems project, which aims to make a single soldier the nexus for a large scale robot attack. Independently, ground and aerial robots have been tested together and once the bits are joined, there will be a robot force under command of a single soldier with potentially dire consequences for innocents around the corner."

What do you make of this? Are we letting our militaries run technologically amok with our tax dollars? Or can robot soldiers be programmed to be even more ethical than human ones, as some researchers claim?

Paul Marks, technology correspondent

Pagan
11-16-2010, 02:00 AM
http://images.pictureshunt.com/pics/t/the_terminator_robot-11361.jpg

SassyLady
11-16-2010, 02:10 AM
For the good of mankind
and never to be used against anyone except ...
"uncooperative humans"

http://www.newscientist.com/blogs/shortsharpscience/2008/10/packs-of-robots-will-hunt-down.html

I think robots should never be programmed to control/hunt/kill humans. As long as humans are the ones that need to do the hunting/killing then the idea of human lives being lost will have a more daunting effect on waging war ... if robots are used I believe there will be more global conflicts.

Pagan
11-16-2010, 02:29 AM
I think robots should never be programmed to control/hunt/kill humans. As long as humans are the ones that need to do the hunting/killing then the idea of human lives being lost will have a more daunting effect on waging war ... if robots are used I believe there will be more global conflicts.

Nailed it on the head. Attempts to make warfare less personal will only perpetuate it, IMO.

revelarts
11-18-2010, 09:20 AM
Man, too bad we lost Dmp's post and mine here.

I'll try to repost mine.

Sertes
11-19-2010, 07:23 AM
See it in action:

http://www.youtube.com/watch?v=R1CitRWEZqs

revelarts
11-28-2010, 11:35 PM
http://graphics8.nytimes.com/images/2010/11/28/us/28robot-span/28robot-span-articleLarge.jpg

alpha model?
http://www.nytimes.com/2010/11/28/science/28robot.html?_r=2
War Machines: Recruiting Robots for Combat
By JOHN MARKOFF
Published: November 27, 2010

Articles in this series are examining the recent advances in artificial intelligence and robotics and their potential impact on society.

A New Generation of Robotic Weapons

FORT BENNING, Ga. — War would be a lot safer, the Army says, if only more of it were fought by robots.

REMOTELY CONTROLLED: Some armed robots are operated with video-game-style consoles, helping to keep humans away from danger.

And while smart machines are already very much a part of modern warfare, the Army and its contractors are eager to add more. New robots — none of them particularly human-looking — are being designed to handle a broader range of tasks, from picking off snipers to serving as indefatigable night sentries.

In a mock city here used by Army Rangers for urban combat training, a 15-inch robot with a video camera scuttles around a bomb factory on a spying mission. Overhead an almost silent drone aircraft with a four-foot wingspan transmits images of the buildings below. Onto the scene rolls a sinister-looking vehicle on tank treads, about the size of a riding lawn mower, equipped with a machine gun and a grenade launcher.

Three backpack-clad technicians, standing out of the line of fire, operate the three robots with wireless video-game-style controllers. One swivels the video camera on the armed robot until it spots a sniper on a rooftop. The machine gun pirouettes, points and fires in two rapid bursts. Had the bullets been real, the target would have been destroyed.

The machines, viewed at a “Robotics Rodeo” last month at the Army’s training school here, not only protect soldiers, but also are never distracted, using an unblinking digital eye, or “persistent stare,” that automatically detects even the smallest motion. Nor do they ever panic under fire.
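
(The article doesn't say how the "persistent stare" detection actually works. The simplest general idea is frame differencing: compare each video frame with the previous one and flag any change. Below is a minimal Python/OpenCV sketch of that idea only; the camera source, blur size, and thresholds are arbitrary illustrative choices, not anything from the Army system.)

import cv2

cap = cv2.VideoCapture(0)   # 0 = default camera; could be any video source
ok, prev = cap.read()
prev_gray = cv2.GaussianBlur(cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY), (21, 21), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    diff = cv2.absdiff(prev_gray, gray)              # pixel-wise change since the last frame
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > 500:                 # even a small patch of change trips it
        print("motion detected")
    prev_gray = gray

cap.release()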

“One of the great arguments for armed robots is they can fire second,” said Joseph W. Dyer, a former vice admiral and the chief operating officer of iRobot, which makes robots that clear explosives as well as the Roomba robot vacuum cleaner. When a robot looks around a battlefield, he said, the remote technician who is seeing through its eyes can take time to assess a scene without firing in haste at an innocent person.

Yet the idea that robots on wheels or legs, with sensors and guns, might someday replace or supplement human soldiers is still a source of extreme controversy. Because robots can stage attacks with little immediate risk to the people who operate them, opponents say that robot warriors lower the barriers to warfare, potentially making nations more trigger-happy and leading to a new technological arms race.

“Wars will be started very easily and with minimal costs” as automation increases, predicted Wendell Wallach, a scholar at the Yale Interdisciplinary Center for Bioethics and chairman of its technology and ethics study group.

Civilians will be at greater risk, people in Mr. Wallach’s camp argue, because of the challenges in distinguishing between fighters and innocent bystanders. That job is maddeningly difficult for human beings on the ground. It only becomes more difficult when a device is remotely operated.

This problem has already arisen with Predator aircraft, which find their targets with the aid of soldiers on the ground but are operated from the United States. Because civilians in Iraq and Afghanistan have died as a result of collateral damage or mistaken identities, Predators have generated international opposition and prompted accusations of war crimes.

But robot combatants are supported by a range of military strategists, officers and weapons designers — and even some human rights advocates.

“A lot of people fear artificial intelligence,” said John Arquilla, executive director of the Information Operations Center at the Naval Postgraduate School. “I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force.”

Dr. Arquilla argues that weapons systems controlled by software will not act out of anger and malice and, in certain cases, can already make better decisions on the battlefield than humans.

His faith in machines is already being tested.

“Some of us think that the right organizational structure for the future is one that skillfully blends humans and intelligent machines,” Dr. Arquilla said. “We think that that’s the key to the mastery of 21st-century military affairs.”

Automation has proved vital in the wars America is fighting. In the air in Iraq and Afghanistan, unmanned aircraft with names like Predator, Reaper, Raven and Global Hawk have kept countless soldiers from flying sorties. Moreover, the military now routinely uses more than 6,000 tele-operated robots to search vehicles at checkpoints as well as to disarm one of the enemies’ most effective weapons: the I.E.D., or improvised explosive device.

Yet the shift to automated warfare may offer only a fleeting strategic advantage to the United States. Fifty-six nations are now developing robotic weapons, said Ron Arkin, a Georgia Institute of Technology roboticist and a government-financed researcher who has argued that it is possible to design “ethical” robots that conform to the laws of war and the military rules of escalation.

But the ethical issues are far from simple. Last month in Germany, an international group including artificial intelligence researchers, arms control specialists, human rights advocates and government officials called for agreements to limit the development and use of tele-operated and autonomous weapons.
[Photos: David Walter Banks for The New York Times]

AN UNBLINKING SENTRY: The Maars robot during a recent demonstration at Fort Benning, Ga.

BEHIND THE MACHINE: Justin Jindra operated the Maars at the Robotics Rodeo last month.

The group, known as the International Committee for Robot Arms Control, said warfare was accelerated by automated systems, undermining the capacity of human beings to make responsible decisions. For example, a gun that was designed to function without humans could shoot an attacker more quickly and without a soldier’s consideration of subtle factors on the battlefield.

“The short-term benefits being derived from roboticizing aspects of warfare are likely to be far outweighed by the long-term consequences,” said Mr. Wallach, the Yale scholar, suggesting that wars would occur more readily and that a technological arms race would develop.

As the debate continues, so do the Army’s automation efforts. In 2001 Congress gave the Pentagon the goal of making one-third of the ground combat vehicles remotely operated by 2015. That seems unlikely, but there have been significant steps in that direction.

For example, a wagonlike Lockheed Martin device that can carry more than 1,000 pounds of gear and automatically follow a platoon at up to 17 miles per hour is scheduled to be tested in Afghanistan early next year.

For rougher terrain away from roads, engineers at Boston Dynamics are designing a walking robot to carry gear. Scheduled to be completed in 2012, it will carry 400 pounds as far as 20 miles, automatically following a soldier.

The four-legged modules have an extraordinary sense of balance, can climb steep grades and even move on icy surfaces. The robot’s “head” has an array of sensors that give it the odd appearance of a cross between a bug and a dog. Indeed, an earlier experimental version of the robot was known as Big Dog.

This month the Army and the Australian military held a contest for teams designing mobile micro-robots — some no larger than model cars — that, operating in swarms, can map a potentially hostile area, accurately detecting a variety of threats.

Separately, a computer scientist at the Naval Postgraduate School has proposed that the Defense Advanced Research Projects Agency finance a robotic submarine system that would intelligently control teams of dolphins to detect underwater mines and protect ships in harbors.

“If we run into a conflict with Iran, the likelihood of them trying to do something in the Strait of Hormuz is quite high,” said Raymond Buettner, deputy director of the Information Operations Center at the Naval Postgraduate School. “One land mine blowing up one ship and choking the world’s oil supply pays for the entire Navy marine mammal program and its robotics program for a long time.”

Such programs represent a resurgence in the development of autonomous systems in the wake of costly failures and the cancellation of the Army’s most ambitious such program in 2009. That program was once estimated to cost more than $300 billion and expected to provide the Army with an array of manned and unmanned vehicles linked by a futuristic information network.

Now, the shift toward developing smaller, lighter and less expensive systems is unmistakable. Supporters say it is a consequence of the effort to cause fewer civilian casualties. The Predator aircraft, for example, is being equipped with smaller, lighter weapons than the traditional 100-pound Hellfire missile, with a smaller killing radius.

At the same time, military technologists assert that tele-operated, semi-autonomous and autonomous robots are the best way to protect the lives of American troops.

Army Special Forces units have bought six lawn-mower-size robots — the type showcased in the Robotics Rodeo — for classified missions, and the National Guard has asked for dozens more to serve as sentries on bases in Iraq and Afghanistan. These units are known as the Modular Advanced Armed Robotic System, or Maars, and they are made by a company called QinetiQ North America.

The Maars robots first attracted the military’s interest as a defensive system during an Army Ranger exercise here in 2008. Used as a nighttime sentry against infiltrators equipped with thermal imaging vision systems, the battery-powered Maars unit remained invisible — it did not have the heat signature of a human being — and could “shoot” intruders with a laser tag gun without being detected itself, said Bob Quinn, a vice president at QinetiQ.

Maars is the descendant of an earlier experimental system built by QinetiQ. Three armed prototypes were sent to Iraq and created a brief controversy after they pointed a weapon inappropriately because of a software bug.

However, QinetiQ executives said the real shortcoming of the system was that it was rejected by Army legal officers because it did not follow military rules of engagement — for example, using voice warnings and then tear gas before firing guns. As a consequence, Maars has been equipped with a loudspeaker as well as a launcher so it can issue warnings and fire tear gas grenades before firing its machine gun.
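
(The article only says the rules of engagement require voice warnings and then tear gas before gunfire. Purely as an illustration of how software could enforce that ordering, here is a toy Python escalation gate; the class and method names and the one-second pause are invented for the example, not taken from the Maars system.)

import time

class EscalationController:
    STEPS = ["voice_warning", "tear_gas", "lethal"]   # required order before lethal force

    def __init__(self):
        self.level = 0                                # index of the next step not yet taken

    def request(self, step):
        """Permit an action only if every earlier step has already been taken."""
        if self.STEPS.index(step) > self.level:
            raise PermissionError(f"{step} refused: use {self.STEPS[self.level]} first")

    def record(self, step):
        self.level = max(self.level, self.STEPS.index(step) + 1)

ctrl = EscalationController()
ctrl.request("voice_warning"); ctrl.record("voice_warning")
time.sleep(1)                                         # give the target time to comply
ctrl.request("tear_gas"); ctrl.record("tear_gas")
# Only now would ctrl.request("lethal") be permitted; calling it earlier raises PermissionError.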

Remotely controlled systems like the Predator aircraft and Maars move a step closer to concerns about the automation of warfare. What happens, ask skeptics, when humans are taken out of decision making on firing weapons? Despite the insistence of military officers that a human’s finger will always remain on the trigger, the speed of combat is quickly becoming too fast for human decision makers.

“If the decisions are being made by a human being who has eyes on the target, whether he is sitting in a tank or miles away, the main safeguard is still there,” said Tom Malinowski, Washington director for Human Rights Watch, which tracks war crimes. “What happens when you automate the decision? Proponents are saying that their systems are win-win, but that doesn’t reassure me.”

SassyLady
11-29-2010, 01:55 AM
Think about these words when thinking about robots:

"software bug" or "virus"

Doesn't that scare you?

kowalskil
01-09-2011, 04:46 PM
Fighting "robots" is nothing new. Tanks and airplanes are robots. The same is true for satellites.

revelarts
01-16-2011, 11:01 AM
Tanks and airplanes are not robots.

"Robot Warrior Ethics: In this panel discussion,

Dr. P.W. Singer (author of Wired for War; Senior Fellow & Director, 21st Century Defense Initiative, Brookings Institute),
Colonel Stephen Irwin (Staff Judge Advocate, U.S. Joint Forces Command), and
Lieutenant Colonel Robert Bracknell (Deputy Judge Advocate, U.S. Joint Forces Command & Joint Center for Operational Analysis)

They discuss the ethical implications of the increasing sophistication and autonomy of military hardware.

The panel was held 24 March 2010 and was sponsored by ODU's Institute for Ethics and Public Affairs, which is housed in the Department of Philosophy and Religious Studies. Dr. David C. Earnest, from ODU's Department of Political Science and Geography, served as moderator."


http://www.youtube.com/watch?v=lWtOMT176as

revelarts
02-13-2011, 12:20 AM
Hold your breath to hide from surveillance robot

Read more: http://news.cnet.com/8301-17938_105-20030681-1.html


http://i.i.com.com/cnwk.1d/i/tim/2011/02/04/Cougar20-H-1.JPG


If you want to creep past this new security bot, you'd better be good at holding your breath.

TiaLinx's new Cougar20-H is a lightweight, remote-controlled surveillance robot that can detect human breathing and scan through concrete walls with its ultra-wideband radio frequency sensor array.

The Cougar20-H moves around on tracks and can roll up to a building, extend its arm, and start scanning through the wall with its RF array, developed with funding from the U.S. Army.

Operated from a laptop that can be more than 300 feet away, the robot can scan through reinforced concrete by detecting reflected radio waves. It can find people who are moving or even keeping still, so the operator can see them in real time.

The robot searches for the "biorhythmic patterns" of targets, according to the company. It hasn't divulged too many details about the machine.
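
(Since TiaLinx hasn't divulged details, the following shows only the generic signal-processing idea behind radar breathing detection: chest motion puts a slow periodic ripple on the reflected signal, and a Fourier transform picks out its frequency. This toy Python/NumPy sketch fakes the sensor data; the sample rate, noise level, and frequency band are assumptions, not anything from the Cougar20-H.)

import numpy as np

fs = 20.0                                        # assumed samples per second from the sensor
t = np.arange(0, 30, 1 / fs)                     # a 30-second observation window
chest = 0.002 * np.sin(2 * np.pi * 0.25 * t)     # ~15 breaths/min of simulated chest motion (metres)
signal = chest + 0.0005 * np.random.randn(t.size)    # plus measurement noise

spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
freqs = np.fft.rfftfreq(signal.size, 1 / fs)

band = (freqs > 0.1) & (freqs < 0.7)             # plausible human breathing band, 6-42 breaths/min
rate = freqs[band][np.argmax(spectrum[band])]
print(f"estimated breathing rate: {rate * 60:.1f} breaths per minute")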

But the device is related to the military's decade-long push to develop Sense-Through-the-Wall (STTW) radar technology so soldiers can reconnoiter a building without having to enter it. The aim is to keep soldiers out of harm's way by having them use lightweight scanning tools.

"Cougar20-H can also be remotely programmed at multiple way points to scan the desired premise in a multistory building and provide its layout," TiaLinx founder Fred Mohamadi said in a release. The robot also has multiple onboard cameras for day or night patrols.

The Cougar20-H ships next month and is expected to be used by the Department of Defense, Homeland Security and various law enforcement agencies, and even firefighters. It follows in the tracks of the remote-controlled Cougar10-H, which can detect underground unexploded ordnance and tunnels.

Some very Skynet-worthy skills. Things are not looking too good for the Resistance.
