# 39 - Operational Learning and Command (Part 1)
Through Operational Learning, leaders prevented a mass casualty event, even when the fire did something nobody could predict.
## In The Fire’s Path
Just days before the Pagami Creek Fire blew up on Monday, 12 September 2011, almost a hundred members of the public were in the area, relatively close to the fire. They were canoeing, hiking, camping—innocently enjoying summer in the wilderness.
When the blow up came, it was unforeseen and unprecedented.
The best fire behavior predictions, issued just hours before, called for the fire to move in the opposite direction (to the south and west), with some possible growth to the north and east, perhaps approaching the southern edge of Lake Insula (for maps and discussion of fire growth, see Reference Maps in Post #34).
But the fire didn’t care about predictions!
It ripped to the north and east, running more than 16 miles that day. It grew almost as much in a few hours as it had over the prior three weeks combined.
Those hundred or so visitors I mentioned—less than two days before the blow up they were directly in the fire’s future path. Nobody knew it at that time, of course. When the blow up came, it caught everyone off guard, and yet… not a single one of those visitors was injured.
Compare this to what happened during the Great Hinckley Fire of 1894.
## Great Hinckley Fire (1894)
When the Great Hinckley Fire blew up outside of Hinckley, Minnesota in 1894, everyone was caught off guard. Some managed to escape the town only because a train happened to be passing through. The escape was harrowing.
Quoting from Pyne’s (1997) Fire in America:

> [T]he [train] approached Hinckley, unaware of the extent of the fire, the smoke thickened, and it became necessary to use the headlight and slow to a crawl… Panic increased as fleeing refugees from the town flagged down the engine a little over a mile from Hinckley; perhaps 150 boarded the train. The engineer, Jim Root…put the throttle in reverse and began the ride that would propel him and his crowded throng of passengers into the lore of American fire.
>
> The flames at times outraced the engine. Heat, smoke, and even fire enveloped the tiny train from front to rear. Flames came through the ventilators at the top of the passenger cars and through cracks along the windows. When the rear coach took fire, its occupants fled into the next car. One by one the heat blistered the exterior paint of the other cars, then burst them into flame. The stifling heat caused even the interior paints to blister, then run. The interiors were swallowed in darkness, broken only by screams and shouts. The window glass cracked. At last the insufferable heat and congestion drove one man mad. Shrieking, he threw himself out the window and vanished instantly in a cauldron of flame. Another followed his example. Then another still. The remaining passengers all but gave themselves up to complete panic.[^1]
The story gets worse from there.
People’s clothes caught fire.
Bad burns.
It was pandemonium.
And remember… the people on the train were the lucky ones. Back in town, over 400 people lost their lives.
This is an example of what unexpected fire can do. And it happened less than two hundred miles from the Pagami Creek Fire, so it is a locally relevant example.
The Pagami blow up could have gone like the Great Hinckley Fire—urgent evacuation attempts, multiple fatalities.
Pagami could also have been like the Peshtigo Fire, which happened a few hundred miles east of Hinckley. In 1871, that fire took an unknown number of lives—I’ve seen estimates ranging from 1,000 to over 1,500.
In all three cases (1871 Peshtigo, 1894 Hinckley, and 2011 Pagami Creek Fire), the blow up was unforeseen, and people were directly in the path of the fire.
So all three were set up to be mass casualty events. But the Pagami Creek Fire didn’t turn out that way. So, how was disaster averted?
The answer has a lot to do with command—specifically with measures taken by the Incident Commander (IC) and Incident Management Team (IMT) in the days prior to the blow up. Somehow they prepared for an event that they didn’t know was coming. How did they do that?[^2]
We will wrestle that question to the ground in this post.
Many have found fault with the IMT because four teams of firefighters had close calls on Lake Insula on 12 September. And yes, obviously, something went wrong that day—that’s why I was asked to investigate.
And yet … I respectfully insist we look at the full picture. Somehow, the IMT was prepared for an event that was unforeseeable. How prepared were they? Prepared enough to save about a hundred innocent people.
Let’s see how they did that.
## Learning From A Mess
To make sense of what happened on 12 September 2011, we need to know what led up to it:
The Pagami Creek Fire started 18 August 2011 in the Boundary Waters Canoe Area Wilderness (BWCAW) in Northern Minnesota. The fire’s spread was gradual for the next three weeks. Visitors were allowed access to the wilderness, but they were kept out of the immediate vicinity of the fire.
By Friday, 9 September 2011, fire managers recognized the fire might get more aggressive. They had been keeping the public out of the immediate vicinity of the fire, but now their approach changed. They calculated their worst case scenario for how far the fire could spread in a day, and strongly advised people to leave the area.
Saturday, 10 September 2011 (The Mess): the fire spread even more aggressively than expected. The IMT was scrambling. Late in the day, they learned that members of the public were still out on the lakes that were supposed to be cleared. Partly this was due to breakdowns in communication—it turned out there were different ideas of what it meant to “strongly advise” people to leave.
In a word, Saturday was a mess.
That evening, the IC flew over the fire. He could tell the fire was more active than before, but it was unclear how far the fire had actually spread. So that night, the IC and his IMT made a new plan, to be implemented the next morning. They decided to do several things:

- Aggressively expand the closure area, far out in advance of the possible worst case scenario for fire spread.
- Force people off the lakes in the closure area (not just “strongly advise” them to leave).
- Bring in additional forces to help with clearing the lakes.
- Use motorized canoes to help move people out of the area quickly (motors are typically forbidden in the wilderness, but that restriction was waived for the day).
Sunday, 11 September 2011: they executed the plan, and succeeded in moving nearly 100 people out of the area.

So, in response to Saturday’s mess, leaders learned and improved their system. This improvement started within hours of the mess.
## What They Could Have Done
Remember, there is no guarantee that people will learn good lessons from their close calls. After Saturday’s mess, managers could have:

- blamed the fire or weather predictions;
- doubled down on the status quo;
- blamed the public for not clearing out faster;
- blamed the firefighters working in the field.
Humans do stuff like that all the time. Why? Because learning is hard to do. It goes against our natural impulses to protect our reputation and defend our past choices. It’s much easier to insist you were right all along and start pointing fingers. Blame the public, customers, or employees. Find reasons to defend the status quo. Point to policies that keep you from improving.[^3]
These leaders didn’t get stuck in any of those traps.
Instead, they learned.
## Operational Learning
Saturday’s mess showed the IC and other members of the IMT several things:

- The environment had become more dynamic, ambiguous, risky, and complex than it had been.
- Their fire spread predictions were not as reliable as they had been.
- Their soft touch with the public (which had worked before) was too passive for the new circumstances.
Fire leaders started acting on these lessons by Saturday night—just hours after the mess.
The most significant change was aggressively clearing people out of the lakes around the fire, starting Sunday. The new clearance radius went beyond the IMT’s worst case scenario.
This is an example of Operational Learning. The essence of Operational Learning is to take in information and adapt your actions, trying to get better results. This is an ongoing process. It can happen at the level of individuals, teams, organizations, even at the level of a profession. It can be more or less formal, and it can be more or less conscious.
In a prior post (Post #34), we saw Operational Learning at the level of firefighters in the field making split-second decisions, and we saw how learning saved their lives. In today’s post we have an example of Operational Learning at the level of management. We are about to see how it saved the lives of many members of the public.
## How Their Learning Saved Lives
Rain was predicted for Monday, 12 September, and the fire was expected to slow down. The worst had passed, or so the firefighters thought.[^4] Still, they kept expanding the closure area, giving themselves yet more margin (even though they “knew”—based on their predictions—that they would not need it).
But that day, the rain didn’t do what it was supposed to.
Neither did the fire.
And when it ripped with unprecedented intensity — those hundred visitors who just days before were directly in the future path of the fire … they were all safe.
So why exactly were they safe?
They were safe because fire leaders learned from Saturday’s mess.
Due to their Operational Learning, leaders averted what could easily have been a mass casualty event.
I mentioned earlier that learning is hard to do. It can also seem costly. But consider the costs if these leaders had not learned.
## The Takeaway For Leaders
In Post #31, I asked: “A DARC (Dynamic, Ambiguous, Risky, Complex) environment is not predictable … So how do you prepare?”
The Pagami Creek Fire shows us the answer:
To prepare for a DARC future, learn from the past.
[^1]: Pyne, S. (1997). *Fire in America: A Cultural History of Wildland and Rural Fire*. Seattle, Washington: University of Washington Press.

[^2]: Keep in mind, this area had not had extreme fire storms like the Hinckley and Peshtigo fires for over a century. No firefighter in our time had ever witnessed such fire behavior in that area.

[^3]: For narrative, see Pagami Report, pp. 16-19. For maps see: Reference Maps.

[^4]: “It’s natural to … block out the … warning signs that you don’t want to deal with, so you can believe the situation is how you want it to be, how you think it’s supposed to be. I got lost on a hike one time, and my friend and I had a map and a compass. And as we walked deeper and deeper into the brush in the wrong direction, we kept joking about how wrong the map was, how it needed to be updated, how maybe we got the declination wrong on the compass because things weren’t where they were supposed to be. We did everything but face the fact that we were lost. Despite the obvious facts in front of us, we worked hard to believe we were in control. Humans tend to avoid reality and revert to a familiar course of action. You see this all the time in firefighter fatalities.” From Post #29.