Roll for initiative!

I was having some difficulty deciding which articles to read for this post, so I decided to grab a D20 die and roll for it.  (I may be a bit too excited about planning a new D&D campaign once the semester is over.)  The results were honestly better than I suspected they would be!

The first article that random chance decided for me was Kumar and Chakrabarti’s 2012 piece on the Challenger disaster.  Before anything else, they did an excellent job of defining the concept at the heart of this case study: bounded awareness.  This is essentially when individuals making decisions overlook available, and often obvious, information, leading either to a less-than-preferred decision or to something more catastrophic.  Through an examination of the Challenger disaster, they explore the following:

Managers’ dependence upon their existing tacit knowledge and the bounds on their awareness influence each other in a cycle of positive reinforcement. Building on this realization, [the authors] then highlight the path dependent nature of the evolution of tacit knowledge to argue that different decision makers in the organization can experience differing bounds on their awareness towards the same piece of information. Finally, [the authors] supplement all this by arguing that the past history of success and/or failure experienced by the organization and its individuals also play an important role in the relationship between the two constructs. (Kumar & Chakrabarti, 2012, p. 935)

The authors concluded that NASA’s decision to launch, despite information indicating that this was not the best course of action, was not a case of deciding “in-spite-of” tacit knowledge but “because of” it.  Managers’ previous knowledge and experience caused them to overlook important information, bounding their awareness and leading to the wrong decision.  The authors also imply that past experience likely made the available information appear “trivial” (p. 943).  The following figure is their model for this cycle of “positive reinforcement”.

[Figure: Kumar and Chakrabarti’s model of the positive reinforcement cycle between tacit knowledge and bounded awareness]

Kumar and Chakrabarti admit that their analysis does not provide solutions for breaking this cycle, only an explanation of “how such a cycle can come into existence and then amplify” (p. 945).

This is where my lucky new D20 came in handy again.

Normally, rolling a 2 would leave me pretty bummed out, but I’ve read several other blogs by classmates that have mentioned the Chua (2007) discussion of Hurricanes Katrina and Rita as case studies for knowledge management, so I was alright with it this time.  Chua used a variety of accounts of the preparation for and response to these disasters in order to get a clear picture of knowledge management in both situations.  They looked at many different parts of the disaster management process, such as disaster prediction, implementation of disaster plans, evacuations, and cooperation across agencies.  One conclusion they drew from their analysis was that there was a difference in knowledge creation between the two events.  During Katrina, there was a lot of “knowing”, but not much “doing”.  The response to Rita bridged that cognitive gap, illustrating the following:

Knowledge creation is not merely a social process usually presented in the mainstream KM literature, but is strongly laden with political overtones where forces of restoring goodwill and managing expectations are at play. (Chua, 2007, p. 1526)

Chua also determined that recent mistakes were more likely to be “consciously avoided” than those from the more distant past.  They also discuss the importance of knowledge transfer, which requires active parties to trust one another and reciprocate in sharing information.  Some of these conclusions offer insight into breaking the cycle of positive reinforcement that Kumar and Chakrabarti discuss, largely from a political point of view.  Much like the infighting among authority figures in the wake of the hurricanes, NASA management caved to pressure to launch and ignored vital information.  Chua writes that knowledge should be “consciously categorized” and “compiled” (p. 1527).  While the hurricane case study deals with the aftermath of a disaster and the Challenger study focuses on the lead-up to one, the shared emphasis is on putting the information itself above authority squabbles or political motivations.  One of Emily’s posts has a great take on ego within organizational decisions that I kept drawing on as I read these two case studies.

The D20 had already led me in the direction of two case studies implying that a conscious effort to focus on the available information is necessary for success.  By this point, I wanted a more concrete solution, something beyond the obvious: “Don’t ignore available information!  Leave your ego at home when dealing with risky situations like this!”  When I tossed the die one more time, I was hoping for something that would provide a more structured or specific solution to the problems of bounded awareness.  Lucky number 13 didn’t fail me.

Lam and Chua (2009) address knowledge outsourcing (KO) as an alternative strategy for knowledge management.  Using (another!) case study, they took a look at KO in a for-profit education enterprise.  They concluded that, through the use of KO, organizations are “able to gain access to a pool of… expertise on a flexible basis” (p. 39).  While a good portion of this article emphasized the financial benefits of KO, one of their figures caught my eye:

[Figure: Lam and Chua’s model of the knowledge outsourcing (KO) process]

The cyclical portion of this model is what did it for me.  The monitoring of knowledge, which I believe could be extended to the evaluation of knowledge and information, is a vital piece of the KO process.  I imagine applying this outsourcing of knowledge in higher-risk situations than an educational organization faces.  It could be used as a way of implementing checks and balances in decision making.  Sometimes a fresh pair of eyes can help with making realizations or recognizing mistakes.

Thanks, D20.  This was an interesting journey of connecting some dots and trying to apply different concepts to situations I would not normally think to apply them to.  At the very least, it was a fun thought experiment.

Finals week must be hitting me hard if I’m personifying a die.

 

  1. Chua, A. Y. K. (2007). A tale of two hurricanes: Comparing Katrina and Rita through a knowledge management perspective. Journal of the American Society for Information Science and Technology, 58(10), 1518-1528. doi:10.1002/asi.20640
  2. Kumar J, A., & Chakrabarti, A. (2012). Bounded awareness and tacit knowledge: Revisiting Challenger disaster. Journal of Knowledge Management, 16(6), 934-949. doi:10.1108/13673271211276209
  3. Lam, W., & Chua, A. Y. K. (2009). Knowledge outsourcing: An alternative strategy for knowledge management. Journal of Knowledge Management, 13(3), 28-43. doi:10.1108/13673270910962851

 

11 thoughts on “Roll for initiative!”


  1. I really enjoyed this post! I was also really interested in the “Challenger” article and will probably now choose it for my last blog post (watch out for the link!). I’d read a little bit about habits with Pillet and Carillo (2015), but the idea that these habits, or tacit knowledge and prior experience, could actually be a real issue in times of crisis didn’t register until now. I suppose I could have assumed, but ironically I was too much in the habit of seeing tacit knowledge as a good thing to register that thought. (Am I using irony correctly here? I’m up too late.) Not sure if the authors touch on this, or if you just have insight here, but do you think that if they weren’t in such a tight and stressful position with the spacecraft they might not have made the same choices or overlooked the same issues?


    1. I have been reading Truth, Lies, and O-Rings by Allan McDonald, the former director of the Space Shuttle Solid Rocket Motor Project for Morton Thiokol, Inc. His memoir of the event is a pretty comprehensive look at the disaster from his point of view. He worked closely with NASA, and the problems seemed to be numerous. There was a combination of pressure from the higher-ups to make the launch happen, as well as some inflated egos here and there, with a healthy dose of bounded awareness tossed in for good measure. Personally, I don’t think the issue would have been as catastrophic if there weren’t such strict timelines, but from what I’ve read so far, I feel like there would still have been some sort of crisis. It seemed like that cycle of relying on preexisting tacit knowledge was already well in place. If you’re at all curious about it, I really recommend the book! There’s some literal rocket science here and there, but McDonald does a great job of writing so that the reader can get a sense of the emotions during that stretch of time, too.


  2. That sounds like a really interesting book. Thanks for the recommendation! But yeah, the ego thing often seems to rear its ugly head in these types of situations. I forget whose post talked about something that reminded me of bounded awareness. Whoever it was (I’ll try and find it in the morning) discussed distraction. They talked about that example of the crowd being asked to watch the basketball players and count how many passes they made, and not a single person noticed a guy in a bear suit walking right across the middle of the court. Not quite the same thing as bounded awareness, but there are definitely some similarities! Being so caught up in some preoccupation, whether it be a task or your own tacit knowledge, that you miss something clear as day…


  3. You are a genius, Laure. I should use a D20 for decision-making (since you know how indecisive I am).

    I think the Challenger case is worth unpacking to a greater degree, especially with regard to its “in-spite-of” nature. I wish the Kumar and Chakrabarti article had cited some literature covering face management, since I have an inkling that a level of personal bias influenced the decision to launch. In other words, the tacit knowledge was more of the little voice in the head that wanted to shoot for the stars so badly that NASA was willing to place a pretty damning bet on the lives of those who were lost.

    I understand that this is a REALLY cynical perspective, but I think it’s worth considering. How do bias and affect influence our decision-making with regard to tacit knowledge?


    1. Some of you may know this – I don’t make a secret of it – about the Challenger disaster: I was there. I was just starting my space career at the time, and I knew a lot of people involved in the engineering of the system (I worked as the lead QA engineer on the Manned Maneuvering Unit and the Remote Manipulator Arm (depot) at Martin Marietta). So the senior staff were engaged in the testing, the payload integration (me included), the external tank, etc.

      I remember Bill Brittain telling me at the time, before the accident, that reusing parts such as the external boosters was ‘Black Magic’. No one really knows what happens inside those things, and reusing them was definitely a big unknown to the senior practitioners at the time.

      Do bias and personal belief systems affect decision-making? You bet your ass they do. I not only saw it firsthand, I participated in some of it. Not on the external boosters, though. Thank God I don’t have to live with that guilt.


  4. Fantastic premise for a blog post; the D20 die for the win!! I liked Lam & Chua’s proposal for knowledge outsourcing. I think they really point out the importance of considering the KO approach for organizations that either do not wish to undergo the significant burden of building and developing a KMS internally, or that simply do not have the manpower or resources to develop an adequate system on their own. This idea of forming a strategic partnership with an external entity to aid in the development of a KMS is very intriguing and may be an ideal solution for many organizations that are looking to develop an in-house platform.


  5. I loved this post, for the articles you mentioned and the way you structured it. This concept of bounded awareness is genius – it is applicable EVERYWHERE. I did a lot of reading about disasters like Challenger for my final paper in this class, and in many cases you see this idea show up, from oil rig explosions to natural disasters to nuclear meltdowns. We think we know everything we need to know, until it’s too late. I think the Chua article is also really interesting. I think I have mentioned this on another blog post, but Rita was a disaster in its own way. Because it came right on the heels of Katrina, local officials in Houston were desperate not to have another Katrina, so they ordered evacuations for large areas along the coast. However, the evacuation was not well planned out, so evacuees were stuck in miles-long traffic jams. People ran out of gas and were stranded with no water during a sweltering heat wave. Over 100 people died just from this botched evacuation. So while officials may have “consciously avoided” mistakes made during Hurricane Katrina, their bounded awareness led them to fail to consider the ramifications of large-scale evacuations, and this presented its own problems.


  6. This is such a unique method of blog writing that I think worked out very well. The KO process is one that really stuck with me throughout my reading as well. This process is meaningful because of the feedback system built in, which allows for (in my opinion) proper utilization of knowledge in the world. That’s not to say that an internal KMS can’t fit an organization’s needs sufficiently, but the outsourcing model is more versatile for organizations or groups that just can’t get a functioning KMS in place for whatever reason.

