
Elon Musk: Killer Robots Bigger Threat Than Nuclear North Korea


Tech leader Elon Musk is expanding his influence into just-war policy with a new letter to the United Nations calling for laws to ban the creation and use of killer robots in conflict.

Musk—one of the major forces behind the artificial intelligence push for self-driving cars—says AI poses a bigger threat than nuclear North Korea, and the development of autonomous weapons “threatens to become the third revolution in warfare”.

Musk isn’t the only tech leader raising the alarm. The letter was co-signed by 115 other experts who believe deadly autonomous weapons could unleash violence on a scale previously seen only with chemical or biological weapons.

“Lethal autonomous weapons threaten to become the third revolution in warfare,” the letter, released to the public on Monday, said. “Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

The experts hail from the world’s foremost technology companies and its robotics and artificial intelligence (AI) communities. Scientists from countries including China, Israel, Russia, and Britain addressed the letter to the United Nations Convention on Certain Conventional Weapons, which specializes in containing the spread of devices “considered to cause unnecessary or unjustifiable suffering to combatants or to affect civilians indiscriminately.”

Government entities are notorious for falling behind the tech world in creating protective policies.

Izumi Nakamitsu, the U.N. head for Disarmament Affairs, said the lag is particularly dire in the killer robot field.

“There are currently no multilateral standards or regulations covering military AI applications,” Nakamitsu wrote. “Without wanting to sound alarmist, there is a very real danger that without prompt action, technological innovation will outpace civilian oversight in this space.”

So far, laws to control robotic consciousness have most famously been devised in sci-fi writer Isaac Asimov’s short story “Runaround.” The three laws, ingrained in the fictional “Handbook of Robotics, 56th Edition, 2058 A.D.,” declare as follows:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

In the decades since, countless sci-fi writers have used the three laws in their own works to elegantly explain their robot characters’ code of ethics. By the time Asimov’s canon had robots in control of entire planets, a zeroth law had come into play as well: “A robot may not harm humanity, or, by inaction, allow humanity to come to harm.”

Although popular in the world of fiction, it would be foolish for the U.N. to adopt these rules to control AI. As Asimov’s own novel The Naked Sun, featuring detective Elijah Baley, points out, loopholes abound within this system, in which criminal masterminds or warlords could use a robot’s obedience as a handy tool to commit violence.

What happens when a robot is asked to add a substance to a drink and serve it to someone, when, unbeknownst to the robot, the special ingredient is poison? The device will continue its work as commanded because of its ignorance of the commanding human’s intent. Thinking bigger, a network of thousands of robots, each completing a separate task that partially, but not directly or completely, contributes to a devastating attack on a civilization, would not be thwarted by AI developed by the aforementioned laws. The robots simply do not have enough information to stop themselves.
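The poisoned-drink loophole can be made concrete with a toy sketch. This is purely illustrative (the class, the substance names, and the scenario are all hypothetical, not any real system): an agent that checks orders against the First Law can only refuse harm it knows about, so relabeling a harmful substance lets the order through.

```python
# Hypothetical sketch: a rule-based robot that applies the First Law
# ("may not injure a human") using only its own knowledge of the world.

class ThreeLawsRobot:
    def __init__(self, known_hazards):
        # The robot's entire knowledge of what harms humans.
        self.known_hazards = set(known_hazards)

    def first_law_permits(self, substance):
        # Refuse only if the robot *knows* the substance is harmful.
        # Ignorance of the commander's intent is the loophole.
        return substance not in self.known_hazards

robot = ThreeLawsRobot(known_hazards={"arsenic"})

# A direct order naming the poison is refused...
print(robot.first_law_permits("arsenic"))             # False: order refused

# ...but the same poison, relabeled by the human, passes the check.
print(robot.first_law_permits("special ingredient"))  # True: harm slips through
```

The same blind spot scales up: split one attack across thousands of robots, each holding too little information to recognize its own contribution as harmful, and no individual First Law check ever fires.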

In any case, the real world is almost always more complicated than can be depicted in a sci-fi novel. New laws governing autonomous cars and other intelligent technology pass regularly, but the policy legwork to prevent the weaponization of the cutting edge is still caught in the Middle Ages.

It’s time to follow these thought leaders and bring our AI laws into the future.

By Zainab Calcuttawala for Oilprice.com

Leave a comment
  • Wayne Rychner on August 24 2017 said:
    Robots are the danger according to Elon Musk, yet he wants to make autonomous electric vehicles. Imagine a semi-truck being hacked ....
  • MC on August 24 2017 said:
    Bigger threat than both of those is Elon's subsidy gravy train ending.
  • jack ma on August 24 2017 said:
    NK is no threat. The USA wants into SK to watch China and Russia because they have abandoned the dollar and that is war if it continues. NK has weapons so it does not end up like ME nations brutally assaulted and attacked by the neocon aggressive USA that used to defend nations but now murders them. NK was slaughtered in the Korean War for no reason and USA soldiers following orders burned them alive with gasoline, including the women and children. NK is no threat to anyone and they are the excuse also to get congress to give the USA military complex more money. The USA is the empire of lies. IMHO
