Professionalism/Rodney Rocha and Columbia



Case Overview

The crew of STS-107


The interplay between technological challenges, uncertainty, and cultural factors culminated in the STS-107 Columbia Space Shuttle disaster on February 1, 2003. Seven astronauts' lives were lost as the Columbia disintegrated over Texas upon re-entry in what President George W. Bush called a "great sadness to our country."[1]

Technological Challenges and Uncertainty

Like the Challenger shuttle disaster before it, the Columbia space shuttle incident was caused by a seemingly insignificant technological flaw that proved to be extremely consequential for the seven astronauts aboard the shuttle during the return flight to Earth. Approximately 82 seconds after liftoff, a piece of foam insulation from the External Tank struck the reinforced carbon-carbon panels on the leading edge of the left shuttle wing, breaching the wing's thermal protection system.[2] Foreign object damage and tile damage were common in previous missions, but the extent of this damage was determined only after the accident to be unprecedented. During the remainder of the flight, engineers and managers at NASA investigated the problem but never fully established the extent of the damage.[3] This uncertainty led the Debris Assessment Team, the group responsible for analyzing impact damage on the shuttle, and Mission Control to make flawed judgments about the safety of the flight.


The Debris Assessment Team, co-chaired by Rodney Rocha, spent several days reviewing the video evidence of the impact and performing analyses to determine the potential consequences of the debris strike. To conduct this analysis, engineers used a mathematical model to estimate tile penetration depth after impact; it was later found that this model was not appropriate for use in this context. The model predicted complete tile penetration, but the engineers discounted these results based on their previous experience with tile damage.[3] The assumptions underlying the analysis, as well as the risk and uncertainty associated with its outcomes, were not communicated within NASA, and no effort was made to quantify them. In the end, the engineers determined that high-resolution imaging from the Department of Defense would be necessary to reach any conclusive decision on the safety of the mission. Three successive requests for these images were denied. The team was thus left with only its highly uncertain original models and experiential knowledge from previous shuttle missions.[3]

Cultural Challenges

Uncertainty and failed communication plagued the Columbia mission. During the flight, engineers’ and managers’ opinions of the consequences of the tile damage varied widely, which led to confusion and obscured the complexity of the problem. In the years prior to this flight, NASA leadership had begun stressing efficiency over safety, eroding the agency's original emphasis on safety and health. The combined effects of technical complexity and this cultural decline produced a dire situation for the mission.

Professional Culture

Daniel S. Goldin, NASA administrator

For many years before the Columbia disaster, NASA operated under the safety slogan of "If it's not safe, say so." In April 1992, Dan Goldin was appointed Administrator of NASA. He held this position until November 2001, and NASA underwent many significant changes during his tenure. Pressure from the federal government on NASA to cut costs pushed Goldin to introduce a new culture.[4] In a speech to his employees during his first year, Goldin challenged them by asking, "Tell us how we can implement our missions in a more cost-effective manner. How can we do everything better, faster, cheaper, without compromising safety?"[5] This attitude threatened the success of "If it's not safe, say so."

The new "faster, better, cheaper" (FBC) mantra was criticized by the media and members of Congress for its potential to neglect safety in favor of higher-risk, lower-cost strategies. Senator Kay Bailey Hutchison (R-Texas) bluntly stated, "FBC should be thrown in the waste basket."[6] Goldin repeatedly rejected such criticism, telling an audience at the Jet Propulsion Laboratory in 1994, "When I ask for the budget to be cut, I'm told it's going to impact safety on the Space Shuttle ... I think that's a bunch of crap." Nevertheless, costs were cut significantly in the area of safety assurance, owing to the perception in the 1980s and early 1990s that NASA's safety programs were overly redundant and costly.[3] Criticism of Goldin's management strategy continued throughout the 1990s. Between 1996 and 2000, six of NASA's twenty-five launched missions failed, including the loss of four spacecraft.[3] Although no lives were lost, mission quality was clearly being sacrificed in favor of higher output and lower cost, confirming the fears of the critics.

In March 2000, Goldin accepted responsibility for the recent failures, telling employees and reporters, "I asked these people to do incredibly tough things, to push the limits... and we hit a boundary. They did terrific things and I pushed it too hard."[7] NASA in the 1990s failed to achieve all three of Goldin's goals at once. Though mission frequency increased while operating costs were cut, the mission failures indicated neglect of the "better" prong of his mantra. The "faster, better, cheaper" attitude may be intrinsically flawed and dangerous. It shares similarities with a commonly discussed issue in public healthcare today. The three most important desired qualities of healthcare are quality, low cost, and accessibility; these lie at the vertices of the "iron triangle of healthcare." When the administrators of a healthcare system attempt to improve one of these attributes, sacrifices are always made in one or both of the others: by moving toward one vertex of the triangle, the system distances itself from at least one other. NASA in the 1990s was subjected to an analogous iron triangle, with faster, better, and cheaper at the vertices, as a result of Dan Goldin's management. Unfortunately, the agency appeared to operate furthest from the "better" vertex, favoring low-cost, high-quantity operation instead.

Organizational Structure

Technical issues were a major cause of the accident, but they were not the only one. The culture at NASA in the early 2000s was far from ideal, and the agency's command structure lent itself to flawed communication and organizational silence. Following the disaster, much focus was placed on Rodney Rocha and his decision not to send an email expressing his concerns. While this was a factor in the accident, the problem was much larger than one man; it involved the entire organizational structure of NASA.

Problems at NASA

At NASA, especially a decade ago, there was a single chain of command in place. Engineers could only report to their group managers, who would then report to their project managers. This led to two major problems:

1. Information was diluted as it moved up the chain: it could be distorted, silenced, or lost depending on the opinion of the manager relaying it.
2. The engineers had no real power themselves; they had to report to their superiors to get anything done.

Effects on Columbia Mission

Columbia lifting off on its final mission.

So what went wrong during the Columbia mission? The first problem was with the chain of command. There were three main groups involved in the analysis of the foam strike:

1. Debris Assessment Team: a group of engineers, headed by Rodney Rocha and Pam Madera, created after the foam strike to assess the damage
2. Mission Evaluation Room: the group tasked with evaluating the entire flight from an engineering perspective; the liaison between the Debris Assessment Team and the Mission Management Team
3. Mission Management Team: the group, headed by Linda Ham, in charge of making the key decisions throughout the mission[8]


After analyzing the initial images of the foam collision, the engineers of the Debris Assessment Team determined that they needed more images; they could not make proper calculations and assessments without more information. As co-chair of the Debris Assessment Team, Rodney Rocha had serious doubts about the safety of the flight and notified several others of his concerns. Without sufficient data to substantiate those concerns, however, he had difficulty validating his fears to his superiors. His managers, including Linda Ham and Shuttle Program Manager Ron Dittemore, found the history of previous shuttle impact damage a more compelling benchmark for judging the severity of the problem at hand.[3] Furthermore, because tile damage had occurred so frequently in the past, these managers had become normalized to the situation and quickly deemed this foam strike an "in-family" issue: something NASA had seen before and knew how to deal with, not a pressing concern. The tile damage was subsequently classified as a maintenance issue rather than a "safety of flight" issue. The issue was dismissed by program leaders, and the flight continued without repair to the orbiter.[3]


Engineers on the Debris Assessment Team did not have the power to demand images themselves; they had to get permission from their superiors. When the Mission Management Team showed little interest in helping them, Rocha tried to obtain images through informal chains of command, but the requests were denied for failing to go through the proper channels. The engineers were confused by these decisions,[9] but given the management structure at NASA, they were left with no place to turn.


Things might have turned out differently if Rocha had sent an email he drafted voicing his concerns about the mission. In this email, he outlined his objections to the decisions being made by management and stressed the need for additional images.[10] But Rocha never sent the email, because he did not want to jump the chain of command. At NASA, the boss had the final say; having already brought the problem to his boss, he deferred to management's judgment of the case. Furthermore, although engineers packed the room during the Mission Evaluation Room's presentation to the Mission Management Team about the scenario, not a single one spoke up when the presenter, Don McCormack, said there was no safety-of-flight concern. None of the engineers wanted to risk their jobs by speaking out against their superiors.[8] This single chain of command was thus a major contributing factor in the Columbia disaster.

Safety Reporting System

NASA's implementation of its "If it's not safe, say so" mantra involves a complex sequence of reporting steps. An employee who becomes aware of a safety risk is required to follow the NASA Safety/Hazard Reporting Hierarchy. These guidelines instruct the employee to first report the issue to their immediate supervisor, with a series of complicated contingencies depending on whether the issue is resolved at each step. The system requires that the employee progress up through the chain of command, retaining personal responsibility for reporting the problem until it is resolved.

This system contrasts with the Aviation Safety Reporting System (ASRS), which allows airline employees to report safety issues directly to a central agency. The system's home page links to a report form that can be filed electronically. The employee may either provide contact information in case questions arise or submit the report completely anonymously.

NASA's system requires that a reporter retain personal responsibility, potentially at the cost of social capital in the workplace; the process itself may dissuade an employee from reporting an issue at all. The ASRS, by contrast, allows reporters to remain anonymous, encouraging open reporting of all potential problems.
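The structural difference between the two reporting models can be sketched in code. This is an illustrative model only: the function names, the chain of supervisors, and the simple resolution logic are assumptions for the sketch, not NASA's or the ASRS's actual procedures.

```python
def hierarchical_report(chain, is_resolved_by):
    """NASA-style hierarchy sketch: the reporter escalates one supervisor
    at a time and remains personally responsible until someone resolves
    the issue. Each step depends on that manager's judgment."""
    for supervisor in chain:
        if is_resolved_by(supervisor):
            return f"resolved by {supervisor}"
    return "unresolved; reporter still responsible"


def central_report(issue, anonymous=True):
    """ASRS-style sketch: one direct report to a central agency.
    The reporter may stay anonymous and carries no further burden."""
    reporter = None if anonymous else "contact info"
    return {"issue": issue, "reporter": reporter, "filed_with": "central agency"}


# In the hierarchical model, a concern survives only if every intermediate
# manager passes it along; in the central model, it always reaches the agency.
chain = ["group manager", "project manager", "program manager"]
print(hierarchical_report(chain, lambda s: s == "program manager"))
print(central_report("possible debris damage"))
```

The sketch makes the essay's point concrete: in the hierarchical model the reporter's identity and persistence are load-bearing, while the central model decouples the report from the reporter entirely.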

DuPont Case Study

Although some blame Rodney Rocha for not voicing his qualms about the foam strike forcefully enough to prevent the Columbia disaster, it is clear that many other factors were involved in this case of professional ethics. This raises the question of whether the organization and culture of a company can affect the decisions made and the ethical rules followed by professionals. An alternative case study helps to clarify the problems at NASA in 2003 and lends some justification to Rocha's ultimate decision to remain quiet.

Pierre S. du Pont

Pierre S. du Pont was president of the chemical company DuPont from 1915 to 1919. In those years, he developed a novel management structure that challenged the traditional model of company culture and organization at DuPont. Under this modern system, later termed rational management, rationality and efficiency were preferred over tradition. As president, du Pont was influenced by the ideals of the Enlightenment, which sought to promote truth through honest and open debate, and he hoped to apply these ideals and modern management practices to his family's company. Walter S. Carpenter, president of DuPont from 1940 to 1948, further promoted rational management during his tenure. Under the leadership of these individuals and the presidents who followed, DuPont developed into the world's third-largest chemical company, serving 16 industries ranging from agriculture to health care and medicine.[11]

Rational Management

At its core, the model of rational management emphasized that good ideas rather than personal empires are the key to continued success.[12] This required leaders and managers to ask for, and in fact encourage, constructive criticism from employees. In essence, this created a culture within the company that promoted "devil's advocates" at every level of organization. This ensured that problems were always examined from a range of perspectives, and that the chain of reasoning was always logical and consistent. Through the development of this working environment, any ideas or concerns at DuPont could get a fair hearing, even if senior managers did not favor them.

Organizational Structure

As DuPont grew, the company also implemented a novel organizational technique to support its management ideals. Whenever new divisions or departments were added, the organizational structure was altered so that there were semi-autonomous operating departments, each with its own production, sales, and research divisions. Each department's general manager was held accountable to the Executive Committee for the department's performance.[13] This flattened organizational hierarchy worked well with the existing culture of constructive criticism. DuPont was a pioneer in this type of business organization, which was soon copied by companies around the world. DuPont is now considered one of the best environmental, health, and safety (EHS) organizations in industry, and its structural organization, along with a culture that treats negative feedback as central to sound decision-making, has contributed to this success.[14]

Conclusion

Immediately following the crash of the Columbia, many NASA executives and employees faced pointed criticism for their actions or inactions, as in the case of Rodney Rocha. Although the individuals involved in the chain of command and decision-making process certainly affected the tragic outcome of this case, two major flaws within NASA at the time played a critical role in shaping their decisions. First, NASA was embedded in a culture that encouraged efficiency at the expense of dissent; second, its organizational hierarchy made dissent from the lower ranks exceedingly difficult. Thus, although Rodney Rocha was blamed for his role as an individual professional faced with a tough decision, we propose that the cultural and hierarchical environment at NASA was the major contributor to this tragedy.

References

  1. Bush, G. W. (2003, February 1). Space shuttle Columbia tragedy speech to the nation. Address presented at the White House Cabinet Room.
  2. Dunn, M. (2003, February 1). Columbia's problems began on left wing. The Baltimore Sun. Retrieved May 6, 2012, from baltimoresun.com.
  3. Gehman, H. W., et al. (2003). Columbia Accident Investigation Board report. Washington, D.C.: National Aeronautics and Space Administration, Government Printing Office.
  4. Lambright, W. H. (2007). Leading change at NASA: The case of Dan Goldin. Space Policy, 23(1), 33-43.
  5. Mars Program Independent Assessment Team. (2000). Mars Program Independent Assessment Team report.
  6. Cowing, K. (2003). Farewell to Faster-Better-Cheaper. SpaceRef.
  7. Goldin, D. (2000, March 29). Remarks at the Jet Propulsion Laboratory. Space.com.
  8. Marsen, S. (n.d.). Analysis of case study.
  9. Rocha, R. (2011, January 27). Accidental case study of organizational silence & communication breakdown: Shuttle Columbia, mission STS-107 [PDF document].
  10. Rocha, R. (2003, January 22). RE: STS-107 wing debris impact, request for outside photo-imaging help.
  11. DuPont. http://www2.dupont.com/home/en-us/index.html#
  12. Smith, J. DuPont: The Enlightened Organization.
  13. Smith, J. DuPont: The Enlightened Organization.
  14. MacLean, R. (2004). EHS organizational quality: A DuPont case study. Environmental Quality Management.