
This is the third in a three-part series on decision making. Part 1 and Part 2 are available in the previous two blog posts.


In Part 1 of this series we looked at analytical decision making and saw some flaws in the Risk Assessment Matrix and the DECIDE Model. In Part 2 we visited rapid decision making and saw the same flaws in the 3Ps Method and the OODA Loop. These methods do have some value, and they can be helpful in directing our thoughts and actions toward solving a problem. But it is essential that we “tune up” our tools by recognizing the ways that our unconscious minds, often through cognitive biases, negatively influence our decisions. I know from personal experience that we are all influenced by these factors, but also that by better understanding how our unconscious mind works, we can avoid many traps. I have written extensively on this subject in the past, so I will not revisit all of that here. For a very short primer on three common cognitive biases, check out my YouTube video, “The Bias Bundle bomb.”


Ideally, we could move our decisions from being subjective to being objective. We have not yet figured out a way to do that for all our decisions, but one tool, if properly crafted, will do it for us. It is called the FRAT, or Flight Risk Assessment Tool. The FRAT is used extensively in business aviation. Customized for each kind of airplane, it lists items that are not open to interpretation and assigns a numerical risk value to each. For example, a surface wind greater than 15 knots might be assigned a risk value of 2, while the pilot having flown less than 5 hours in the past 30 days might be assigned a risk value of 3. The assigned risk values are totaled and compared to a previously determined maximum. If the total for the proposed flight exceeds that predetermined maximum, the flight must not go. Sometimes it may be possible to make a change to the flight so that the risk value is reduced. For example, having any part of the flight occur at night might carry a risk value of 5. If the flight can be scheduled and executed so that no night flying is involved, the total risk value would be reduced by 5 and might then fall within the acceptable range. That is not to say that we should be looking for ways to circumvent the FRAT risk score. But if we can find ways to actually reduce the risk, we are making the flight safer.


There might also be some items that, by themselves, make the flight a no-go. That might include the pilot taking any medication containing diphenhydramine (Benadryl, etc.).
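To make the arithmetic concrete, here is a minimal sketch of FRAT-style scoring. The item names, risk values, threshold, and the evaluate_flight function are all hypothetical, built from the examples above; a real FRAT is customized for the specific airplane and operation.

```python
# A minimal, hypothetical sketch of FRAT-style scoring.
# Item names, risk values, and the threshold are illustrative only.

FRAT_ITEMS = [
    # (description, risk value, does it apply to this flight?)
    ("Surface wind greater than 15 knots", 2, True),
    ("Pilot flew less than 5 hours in the past 30 days", 3, True),
    ("Any portion of the flight at night", 5, False),  # rescheduled to daylight
]

NO_GO_ITEMS = [
    # Items that make the flight a no-go all by themselves
    ("Pilot took medication containing diphenhydramine", False),
]

MAX_ACCEPTABLE_SCORE = 9  # decided well before the flight, free of external pressure


def evaluate_flight() -> str:
    # Any single no-go item stops the flight, regardless of the total score.
    if any(applies for _, applies in NO_GO_ITEMS):
        return "NO-GO: disqualifying item present"
    total = sum(value for _, value, applies in FRAT_ITEMS if applies)
    if total > MAX_ACCEPTABLE_SCORE:
        return f"NO-GO: score {total} exceeds {MAX_ACCEPTABLE_SCORE}"
    return f"GO: score {total}"


print(evaluate_flight())  # With the values above: "GO: score 5"
```

The point of the exercise is that the items, the values, and the maximum score are all fixed ahead of time, so the go/no-go answer comes out of arithmetic rather than in-the-moment judgment.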


Nothing contained in a FRAT or any other tool should be construed as permission to violate any regulation. The sample FRAT contains the following in the Aircraft Section: “Has not had an annual inspection in the past 12 months,” with a risk value of 8 assigned. Flying an aircraft that does not have a current annual inspection seems contrary to regulations, but there are circumstances in which it is legal. For example, the FAA can issue a ferry permit allowing a pilot to fly an airplane from its current location to a place where an annual inspection can be performed. The flight is legal, but it carries a higher risk because of the lack of inspection. A few other items have similar conditions.


For the FRAT to be effective, it must be developed well before a flight is scheduled, when there is no external pressure. Then it must be followed to the letter, with no deviation allowed for any reason.

[Image: sample FRAT]

When it comes to rapid and reflexive decision making, there is really no substitute for experience. That does not mean that a pilot must accumulate thousands of hours before becoming safe. In fact, the highly experienced pilot is often more likely to fall into a complacency trap or be unduly influenced by cognitive biases. It is crucial for a pilot to get the right kind of experience. That starts with developing a knowledge of the human factors that cause us to make flawed decisions. Then formal training from competent instructors is extremely valuable. Simulations of all kinds can provide, in just a few hours, experiences that would normally require years of flying to acquire. An added benefit is that the fatality rate in simulations is extremely low!


Finally, much can be learned by studying accidents and how the actions of others led to them. The accidents must be analyzed by looking at underlying causes beyond the NTSB probable cause findings. It is easy to criticize a pilot for making what appears to be a dumb or careless mistake, but the majority of pilots are neither dumb nor careless. Often an external factor led the pilot to make a bad decision. After we read and think about a few of those accidents, we are less likely to fall into the same trap.


So, in summary, we recognized two branches of decision making. In Part 1 we briefly discussed analytical decision making, identified some commonly used tools to help with that part of our aeronautical decision making, and pointed out some potential flaws in their use.


In Part 2, we discussed rapid and reflexive decision making and again identified some tools designed to assist us. Then we showed some potential flaws in their use.


In Part 3, we looked at how the unconscious mind can influence our decisions and we showed the need to move our decisions from the subjective to the objective. We discussed a tool for doing that and we listed some ways to improve our aeronautical decision making through knowledge and experience.


We can all improve our decision making by understanding how the unconscious mind can lead us astray. Being able to make better decisions can help us not only to be better pilots, but to be better in everything we do.