Philips was once the world champion in consumer electronics, until the company started to drift. The same ‘complacency effect’ caused Air France flight 447 to crash and earned bank ING a mega fine. ‘Unfortunately, managers all too often have the idea that everything will work out fine’, writes innovation expert Simone van Neerven.
On Sunday evening, May 31, 2009, at three minutes past seven, flight AF447 departs from Rio de Janeiro airport for Paris, but it will never arrive. The plane crashes into the Atlantic Ocean. What happened remains a mystery, as the black boxes cannot be found. Only after two years of searching are they finally located and salvaged.
The flight data shows that the airspeed indicators in the cockpit gave contradictory readings, causing the autopilot to disengage. Suddenly, the pilots had to fly the plane manually. The two co-pilots were confused, and the youngest and least experienced of the two decided to let the aeroplane climb without informing his colleague.
The plane stalled, but the two pilots struggled to interpret all the alarms and messages. The aircraft lost more and more speed and began to descend rapidly. The experienced captain, who was resting at the time, was alerted too late to turn the situation around. The aircraft crashed into the sea, killing all 228 people on board.
The pilots had relied too heavily on the autopilot and did not know how to act in such a crisis. When they had to take over the controls and fly the plane manually, they were unable to respond properly, which ultimately led to the crash.
This is a well-known phenomenon in aviation, called complacency. It occurs when tasks become routine, when people start to rely too heavily on systems, or when it is simply assumed that everything functions as expected. This mindset can lead to delayed reactions and errors in judgment, with dangerous or even fatal consequences.
Complacency extends beyond aviation. Even successful organisations cannot escape the danger of becoming complacent, for instance, because they rely on outdated strategies or do not adapt to new market developments or technological changes.
Famous and notorious examples are Kodak, Nokia and BlackBerry. But there are also examples closer to home, in the Netherlands. Philips, for example, was a world leader in consumer electronics and one of the most innovative brands in the field. However, the company relied too heavily on its strong brand reputation and existing products, and as a result reacted to new digital technologies considerably later than its competitors. It could never catch up and eventually had to sell off this business.
Another example comes from ING, which in 2018 became embroiled in a major money laundering scandal. For years, the bank had not taken sufficient measures to detect and prevent suspicious money flows, making money laundering fairly easy. The bank was aware of the flaws in its controls but trusted the good intentions of its customers. Rather than mitigating these risks, it prioritised commercial growth. When it all came out, the bank settled the case with the Dutch authorities for 775 million euros.
Employees in the Philips Research department had already recognised the importance of digital technology and software in the 1980s and 1990s. They made proposals for responding to these new digital possibilities and to the rise of the personal computer and the internet, but their suggestions were disregarded.
It is unclear whether ING employees voiced concerns about money laundering. However, in the aftermath of the scandal, the bank emphasised the steps it had taken to make it easier for employees to report potential wrongdoing.
In almost every such case, an employee had flagged the misconduct or shared an idea for a new product, service or business model beforehand. Organisations can therefore protect themselves from dozing off and sliding into disaster simply by listening to their employees. This does require leaders not to immediately dismiss ideas that seem strange or wild, but to really listen to them and then take action.
Unfortunately, managers often believe things will work out or assume the risks are not that high. They continue to act on autopilot, and the situation goes from bad to worse. So cherish it when people think along with you, and actually do something with what they bring.
With the rise of Artificial Intelligence (AI), the risk of complacency in organisations is increasing as well. AI can add a great deal of value, but it also increases the risk that people think less critically and lose their sharpness.
People tend to question decisions made by automated systems less, the so-called automation bias. Moreover, many AI models, such as deep learning systems, operate as a black box, making their decisions difficult to understand. This lack of transparency leads to even less questioning, because employees assume that the AI ‘will know’.
These days, many organisations focus on efficiency, and AI is introduced to automate processes and cut costs. However, reducing human checks and audits increases the risk that an organisation dozes off and slowly derails without noticing in time. Use AI to your advantage and don’t become too dependent on it, but above all keep thinking critically and be very alert when the ‘computer says no’!
This article was originally published in Dutch on MT/Sprout, the most popular business and management platform in the Netherlands.