
New Safety Feature: A Smart Car Programmed To Let You Die?

The first auto safety device was probably the padded dashboard, unless you count such basics as roofs and windshields. Whatever the case, such features have proliferated to include seat belts, air bags, rear cameras and the like.

Now researchers at the University of Alabama at Birmingham (UAB) are studying what may be the ultimate, and most counter-intuitive, safety feature: a self-driving car that would allow its own occupants to die if its computer determines that they are outnumbered by the people whose lives are threatened in a looming accident. Related: Could $12 Trillion Trigger A Renewables Revolution?

“Ultimately this problem devolves into a choice between utilitarianism and deontology,” UAB alumnus Ameen Barghi, a bioethicist, tells the school's news department. Deontology is the ethical principle that “some values are simply categorically always true.”

Let's step back for a moment and look at a dilemma that highlights this ethical problem. Classically it's known as the Trolley Problem: an employee in charge of a switch on a trolley track knows a train is due to pass soon, but suddenly notices that a school bus full of children is stalled on that track. A look at the alternate route shows that the employee's young child has somehow crawled onto it. The choice is to save his own child or to save the many children on the bus. Which is right?

Now shift this dilemma to a highway of the not-too-distant future. It is crowded with cars, many of them self-driving vehicles. Google, which has already been experimenting with such autos, says its cars can ably handle the risks of the road, and boasts that any accidents involving its cars have been caused by human error, not programming glitches. Related: Why Buffett Bet A Billion On Solar

So here's another example of the dilemma involving not trolleys but cars: A tire suddenly blows out on a self-driving vehicle, and the auto's computer must now decide whether to allow the car to careen into oncoming traffic or deliberately steer the car into a retaining wall. Does it base its choice on the benefit of its occupants, or the benefit of others who may outnumber them?

Here's how Barghi breaks it down: “Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people,” he told the UAB news department. In this scenario, then, the car should be programmed to ram into the retaining wall, endangering its occupants but sparing others on the highway.

But then there's deontology, which we might call ethical absolutism. “For example, [deontology dictates that] murder is always wrong, and we should never do it,” Barghi says. In the Trolley Problem, deontology says that “even if shifting the trolley will save five lives, we shouldn’t do it because we would be actively killing one.” Related: U.S. Oil Glut An EIA Invention?

As a result, he said, a company that follows deontology shouldn't program self-driving cars to save others by sacrificing the lives of their occupants.
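For readers who think in code, here is a minimal, purely illustrative sketch of how the two rules Barghi describes could be expressed as decision functions. Nothing here comes from Google, UAB or any real vehicle software; the Maneuver fields and the casualty numbers are hypothetical assumptions chosen only to mirror the blown-tire example above.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Maneuver:
    name: str
    occupant_deaths: int    # expected deaths inside the car
    bystander_deaths: int   # expected deaths outside the car
    actively_sacrifices_occupants: bool  # does the car deliberately harm its own riders?

def choose_utilitarian(options: List[Maneuver]) -> Maneuver:
    """Utilitarian rule: pick the maneuver with the fewest total expected deaths."""
    return min(options, key=lambda m: m.occupant_deaths + m.bystander_deaths)

def choose_deontological(options: List[Maneuver]) -> Maneuver:
    """Deontological constraint: never deliberately sacrifice the car's own occupants;
    among the maneuvers that remain, still pick the least harmful one."""
    permitted = [m for m in options if not m.actively_sacrifices_occupants]
    return choose_utilitarian(permitted if permitted else options)

if __name__ == "__main__":
    # The blown-tire scenario from the article, with made-up casualty estimates.
    blowout = [
        Maneuver("steer into the retaining wall", 2, 0, True),
        Maneuver("careen into oncoming traffic", 1, 4, False),
    ]
    print(choose_utilitarian(blowout).name)    # steer into the retaining wall
    print(choose_deontological(blowout).name)  # careen into oncoming traffic
```

Under the utilitarian rule the car rams the wall, because two expected deaths are fewer than five; under the deontological constraint it refuses to deliberately sacrifice its occupants even at a greater total cost, which is the trade-off the UAB researchers are probing.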

There's no word on where Barghi stands on the dilemma of the self-driving car or the Trolley Problem. The UAB graduate, who will enroll at Britain's Oxford University this autumn as a Rhodes Scholar, seems more interested in studying and debating such predicaments than in solving them. He served as a senior leader on UAB's team at the Bioethics Bowl in April at Florida State University, where the team won this year's national championship.

But here's a hint: in last year's Bioethics Bowl, Barghi's team argued a related case, debating whether governments would be justified in banning human driving altogether if self-driving cars proved significantly safer than cars with human drivers. Barghi's team argued in favor of the self-driving cars.

By Andy Tully of Oilprice.com

  • aed939 on June 28 2015 said:
    In order to gain consumers' confidence in robots, self-driving cars must be completely loyal to their owners. They must make decisions that are consistent with the owners' own preferences, and that usually means valuing the lives of the owners' children over the children of strangers.
