
MIT Invents New Algorithm to Protect the Electric Grid from Blackouts

By Charles Kennedy | Wed, 03 July 2013 22:04

In warm regions, power grids are pushed to the limit during the summer months, as homes and offices increase their energy consumption by turning up the air conditioning. During these times of strained electricity supply, just one small failure in the system, such as a downed transmission line or a generator short circuit, can knock out the power to a neighbourhood or an entire town.

The complexity of the electric grid means that several small failures can set off a chain reaction, culminating in a large-scale blackout. Forbes cited the blackout that hit the US in 2003, when a few isolated failures occurring within a few hours across several states cascaded into a collapse of the entire grid, leaving more than 50 million people in the northeast without power. The estimated cost was $10 billion.

The best defence that grid operators have against such large power cuts is to ask customers to conserve energy where they can, to try to reduce the burden on the grid.

A team of MIT engineers, led by Professor Konstantin Turitsyn, has created an algorithm that, they claim, can analyse grids in real time and identify the potential failures that pose the biggest threat of a large-scale blackout.

A satellite image of the blackout that hit the northeast US in 2003. (Siemens)

An article on MITnews explained that the algorithm "identifies the most dangerous pairs of failures among the millions of possible failures in a power grid. The algorithm 'prunes' all the possible combinations down to the pairs most likely to cause widespread damage."


Algorithms already exist that perform a similar task, but they cannot guarantee that they will identify all of the dangerous failures, and they take a long time to analyse the grid. As Professor Turitsyn put it: "they don't provide guarantees that the ones you assume to be safe are really safe. If you want to have some guarantees that the system is safe, you want the system to rely on algorithms that have zero missing rates."

To test the algorithm, the team ran it on data from a mid-sized power grid of around 3,000 components. The algorithm processed the roughly 10 million potential failure pairings within 10 minutes, discarding the 99 percent of pairs that were deemed safe and identifying the one percent that would likely lead to larger blackouts if not addressed quickly.
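The article does not describe the team's actual pruning criterion, but the general idea of screening millions of failure pairs without simulating each one can be illustrated with a short sketch. The code below is purely hypothetical: it assigns each component a made-up "stress" score, flags a pair as dangerous when the combined stress of two simultaneous failures would exceed a capacity threshold, and sorts components by stress so that whole swathes of provably safe pairs can be skipped with a bound check rather than examined one by one. The component names, scores, and threshold rule are all illustrative assumptions, not MIT's method.

```python
def prune_failure_pairs(stress, capacity):
    """Screen all failure pairs, keeping only those whose combined
    stress could exceed the capacity threshold (hypothetical criterion).

    stress   -- dict mapping component name -> stress score
    capacity -- threshold above which a pair is flagged as dangerous
    """
    # Sort components by stress, highest first, so we can prune early.
    comps = sorted(stress, key=stress.get, reverse=True)
    dangerous = []
    for i, a in enumerate(comps):
        # Bound check: if even the highest-stress remaining partner
        # cannot push this pair over capacity, no later pair can either.
        if i + 1 < len(comps) and stress[a] + stress[comps[i + 1]] <= capacity:
            break
        for b in comps[i + 1:]:
            if stress[a] + stress[b] > capacity:
                dangerous.append((a, b))
            else:
                break  # partners are sorted descending; the rest are safe
    return dangerous

# Toy grid of four components with assumed stress scores:
pairs = prune_failure_pairs(
    {"line1": 0.7, "gen2": 0.5, "line3": 0.2, "xfmr4": 0.1},
    capacity=1.0,
)
print(pairs)  # only the (line1, gen2) pair exceeds the threshold
```

The sorted ordering is what makes this scale: most of the six possible pairs in the toy example are never inspected, which is the same kind of bound-based elimination that lets a real screening algorithm discard the overwhelming majority of the millions of pairs in a 3,000-component grid.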

By. Charles Kennedy of Oilprice.com

Leave a comment

  • malcolm on March 25 2014 said:
    wow! Finally SOMEBODY's actually concerned about (a CRITICAL piece of) our long-neglected infrastructure. But it's a LONG way from proposals to real action... We've been warned for many moons about how ridiculously VULNERABLE the grid really is. Sadly, we're still relying on an infrastructure that F.D.R. put in place. With 30-40% of the $$ wasted in IRAQ/AFGHAN, we could have upgraded/repaired & renewed our "grid"/roads/bridges etc. while creating many, many jobs... A DEPT of INTERIOR (yrs. ago) report found 70% of American bridges & dams in URGENT need of serious repair, not to mention the rusted-out natural gas lines - which are explosions waiting to happen. Our "leaders" have been asleep at the wheel for FAR too long! There are so many more examples, but this is too negative to continue. GOD BLESS the MIT crew for working on this problem, at least the grid part!
  • Ed on July 05 2013 said:
    Yeah, right. I feel much better now.
