In warm regions, power grids are pushed to the limit during the summer months as homes and offices increase their energy consumption by turning up the air conditioning. During these times of strained electricity supply, just one small failure in the system, such as a downed transmission line or a generator short circuit, can knock out the power to a neighbourhood or an entire town.
The complexity of the electric grid means that several small failures can set off a chain reaction culminating in a large-scale blackout. Forbes cited the blackout that hit the northeast US in 2003: a handful of isolated failures across several states, occurring within a few hours of one another, cascaded into a collapse of the regional grid and left more than 50 million people without power. The estimated cost was $10 billion.
The best defence grid operators have against such large power cuts is to ask customers to conserve energy where they can and to try to reduce the burden on the grid.
A team of MIT engineers, led by Professor Konstantin Turitsyn, has created an algorithm that, they say, can analyse grids in real time and identify the potential failures posing the biggest threat of a large-scale blackout.
A satellite image of the blackout that hit the northeast US in 2003. (Siemens)
An article on MITnews explained that the algorithm “identifies the most dangerous pairs of failures among the millions of possible failures in a power grid,” and that it “‘prunes’ all the possible combinations down to the pairs most likely to cause widespread damage.”
Algorithms already exist that perform a similar task, but they cannot guarantee that they will identify every potential failure, and they take a long time to analyse the grid. As Professor Turitsyn put it: “they don’t provide guarantees that the ones you assume to be safe are really safe. If you want to have some guarantees that the system is safe, you want the system to rely on algorithms that have zero missing rates.”
To test the algorithm, the team ran it on data from a mid-sized power grid of around 3,000 components. It processed the roughly 10 million potential failure pairings within 10 minutes, discarding the 99 percent of pairs deemed safe and flagging the one percent likely to lead to larger blackouts if not addressed quickly.
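To give a feel for what this kind of pair-pruning looks like, here is a toy sketch in Python. It is not the MIT team’s algorithm: the per-component “stress” scores, the additive severity bound, and the threshold are all invented for illustration. The idea it demonstrates is the same, though: use a cheap, conservative screen over all failure pairs so that only a small fraction survives for expensive detailed analysis.

```python
from itertools import combinations

def prune_failure_pairs(stress, threshold):
    """Return pairs of component indices whose combined stress bound
    exceeds `threshold`; every other pair is screened out as safe.

    `stress` is a hypothetical per-component severity score in [0, 1];
    the additive bound below is an assumption of this toy example,
    not the bound used by the MIT algorithm.
    """
    dangerous = []
    for i, j in combinations(range(len(stress)), 2):
        # Conservative screen: treat a double failure as no worse than
        # the sum of the two individual stresses (toy assumption).
        if stress[i] + stress[j] > threshold:
            dangerous.append((i, j))
    return dangerous

# Six components with made-up stress scores.
stress = [0.1, 0.9, 0.2, 0.8, 0.05, 0.3]
candidates = prune_failure_pairs(stress, threshold=1.0)
print(candidates)  # → [(1, 2), (1, 3), (1, 5), (3, 5)]
```

Here only 4 of the 15 possible pairs survive the screen, mirroring the roughly 99-percent reduction reported for the real grid test, though at a vastly smaller scale.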
By Charles Kennedy of Oilprice.com
Charles is a writer for Oilprice.com