A design fault in the $2.4bn (£1.2bn) US air traffic control system made by Lockheed Martin triggered a computer glitch that affected thousands of people on flights in the US last week.
The error blanked out flight traffic across a large part of the south-west United States. FAA spokeswoman Laura Brown said the computer had to examine a large number of air routes to "de-conflict the aircraft with lower altitude flights", and that this process consumed so much of the available memory that it interfered with other flight-processing functions.
The FAA has since set the system to require altitudes for every flight plan and added memory to the system, and Brown believes this should prevent similar problems from occurring in the future.
When the system went down, air traffic controllers working in the centre switched to a back-up system to view the planes.
The system failed because it limits how much data it will process for each aircraft, and a U-2 spy plane operating at high altitude with a complex flight plan pushed it past that limit.
One of the key issues was that the flight plan contained no altitude. Although a controller manually entered the U-2's usual altitude of 60,000 feet, the system still began evaluating every altitude from ground level to infinity, generating error messages and continuous restarts.
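As a rough illustration of the failure mode described above, a missing altitude can turn a narrow conflict check into a search over every flight level. This sketch is purely hypothetical: the function names, flight-level range and separation margin are assumptions for illustration, not details of the actual ERAM software.

```python
# Hypothetical sketch of the failure mode, not the actual ERAM code.
FLIGHT_LEVELS = range(0, 1000)  # FL000-FL990, roughly ground level to 99,000 ft

def levels_to_check(plan_altitude_fl):
    """Return the flight levels a conflict check must consider.

    If the flight plan carries an explicit altitude, only nearby levels
    matter; if the altitude is missing, every level must be examined.
    """
    if plan_altitude_fl is None:
        # No altitude filed: de-conflict against every possible level.
        return list(FLIGHT_LEVELS)
    # Altitude known: only check levels within a +/-20 FL margin (assumed).
    return [fl for fl in FLIGHT_LEVELS if abs(fl - plan_altitude_fl) <= 20]

# A flight filed at FL600 needs only a narrow band of checks; the same
# flight with no filed altitude forces a check at every flight level.
print(len(levels_to_check(600)))   # 41 levels
print(len(levels_to_check(None)))  # 1000 levels
```

Multiplied across every route the flight crosses, that kind of unbounded search is one plausible way a single flight plan could exhaust the memory set aside for conflict processing.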
Of greater concern to air traffic controllers around the world is that many systems may encounter similar issues as the use of automation increases.
Furthermore, security experts believe that the same vulnerability could have been used by an attacker in a deliberate attempt to shut down air traffic control.
However, such a failure would be very difficult to reproduce deliberately, because it involved a combination of factors: a complex flight plan, an altitude discrepancy and the data manually entered by the controller on top of the flight plan.