NASA Office of Logic Design


A scientific study of the problems of digital engineering for space flight systems,
with a view to their practical solution.


SP-287 What Made Apollo a Success?

3. TESTING TO ENSURE MISSION SUCCESS

 

By Scott H. Simpkinson
Manned Spacecraft Center

[21] All programs, once the overall objectives are fixed, must meet well-defined design goals, or management fails. However, this hard fact does not prevail in the world of test. Theoretically, after developing a new concept, one no longer needs to test hardware or software except occasionally to gather empirical data needed to operate the equipment. Unfortunately, designs are not perfect, materials and processes often do not work as the designers expect, manufacturing techniques sometimes inadvertently alter the design, assembly procedures leave room for mistakes, engineering and development tests do not necessarily provide all the required data, and, finally, substandard workmanship and human error creep in.

All of these factors require attention at the outset of a program. Some factors, such as human error, demand vigilance until delivery of the last item. Experience has shown that only a well-balanced test program can instill confidence in the delivered hardware and software for a space vehicle.

At the beginning of the Apollo Program, high priority went to setting up a program for one-time qualification of a component or system design and to manufacturer execution of the program. All contracts contained specific clauses relating to qualification tests. These tests provided a reasonable margin of safety, taking into account the expected environments the pieces would pass through during storage, transportation and handling, ground-test duty cycles, and two-mission duty cycles. After the early unmanned flight-test program had started, actual measurements of the launch environment led to adjustments in the qualification vibration levels. Equipment already tested at too low an amplitude had to pass an additional (or delta) qualification test program. Rigorous monitoring and careful failure reporting allowed correction of design and process failures.

Even with this exacting qualification program, a number of experienced engineers believed each flight item should have to pass some environmental testing before NASA accepted it for installation and flight. Thus, nearly all functional components and systems underwent acceptance testing. However, the detailed test plans were left in the hands of the individual designers and system engineers. During the early stages of the Apollo Program, most components and systems were limited to a complete functional bench test at room temperature and pressure and a survival test after a brief exposure to random vibration in the axis suspected of being the most sensitive and at the expected flight-power spectral density.

A few vendors, who were experienced on other critical military and NASA programs and who were supplying electronic components for the lunar module (LM), also performed temperature-limit tests at their own discretion during buildup or during the [22] final acceptance tests (or, in some cases, during both). Unfortunately, the expected flight vibration levels were so low in many cases that these early environmental acceptance tests did not reveal errors of workmanship and manufacturing processes. Many system manufacturing and processing errors came to light late in the cycle, delaying the program and wasting manpower.

After the spacecraft fire, NASA launched an extensive review of the Apollo acceptance test practices. Subcontractors and vendors for 33 Apollo spacecraft assemblies, representing a cross section of electrical, electronic, and electromechanical equipment throughout the spacecraft, received 79 detailed questions concerning their individual acceptance test plans and objectives. This survey revealed the inadequacy of environmental acceptance tests or, in many cases, the nonexistence of acceptance tests. Soon, both the command and service module (CSM) and the LM would carry men for the first time. NASA therefore decided to review completely all Apollo spacecraft acceptance, checkout, and pre-launch test plans and procedures.

In general, the review found factory checkout and pre-launch tests at the launch site adequate and, in many cases, governed by overly tight tolerances. Between installation and launch, the equipment passed the same tests a number of times.

The revised overall testing ground rules, which came out of the review, resulted in a more efficient test plan from pre-delivery acceptance tests to launch. However, the results of the environmental acceptance test review carry much more significance for those who will make the decisions for future programs.

The prime contractor for the LM required nearly all subcontractors and vendors to subject their equipment to 1 minute of random vibration in each of three mutually perpendicular axes. However, most of the vibration levels were very low, as table 3-I shows. A decision early in the Apollo Program had set acceptance vibration test levels 6 decibels below those for qualification, or at 0.005 g²/Hz, whichever was greater. (On the average random-vibration table, one cannot practically set up a vibration level lower than 0.005 g²/Hz.) Many of the components did not have to function or pass continuity checks during vibration tests, only before and after.
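The arithmetic of this ground rule can be sketched in a few lines. This is only an illustration of the rule as stated, not an Apollo procedure; the function name and sample levels are hypothetical, and it assumes the 6-decibel reduction is the usual 10·log10 power ratio applied to the power spectral density.

```python
def acceptance_psd(qual_psd_g2_hz, delta_db=6.0, floor_g2_hz=0.005):
    """Acceptance-test PSD: the stated decibels below the qualification
    level, or the 0.005 g^2/Hz shaker-table floor, whichever is greater."""
    reduced = qual_psd_g2_hz * 10 ** (-delta_db / 10.0)  # -6 dB is roughly a factor of 4 in PSD
    return max(reduced, floor_g2_hz)

print(round(acceptance_psd(0.04), 4))  # well above the floor: ~0.0100
print(acceptance_psd(0.008))           # reduction would fall below the floor: 0.005
```

The floor matters in practice: any qualification level below about 0.02 g²/Hz maps to the same 0.005 g²/Hz acceptance level, which is one reason so many acceptance tests clustered at the bottom of the range.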

To determine proper acceptance vibration tests, a study reviewed 20 major aerospace programs under nine different prime contractors. Bearing in mind the true purpose of acceptance testing (to prevent the installation and flight of substandard equipment), NASA combined the results with an understanding of the nature of the failures encountered after acceptance testing. The resulting program fashioned Apollo vibration tests after those for Gemini.

A component would have to withstand the vibration levels shown in figure 3-1 in each of three mutually perpendicular axes for a minimum of 1 minute and a maximum of 5 minutes. In addition, a firm ground rule pegged the minimum qualification vibration test level at 1.66 times the acceptance test level at all frequencies. Testers had to monitor all pilot-safety functions and continuously check all electrical paths for continuity and short circuits during each of the three vibration cycles. They also had to monitor all mission-success functions, if at all feasible within schedule and cost constraints. Of the original acceptance vibration test plans for approximately 150 deliverable LM items, 80 plans were changed significantly.
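The 1.66 ground rule lends itself to a simple point-by-point check of the two spectra. The sketch below is hypothetical (the function and sample spectra are invented for illustration); it only encodes the rule that the qualification level must exceed 1.66 times the acceptance level at every frequency.

```python
def qual_margin_ok(accept_psd, qual_psd, factor=1.66):
    """Ground rule: qualification PSD must be at least `factor` times the
    acceptance PSD at every frequency. Dicts map frequency (Hz) -> g^2/Hz."""
    return all(qual_psd[f] >= factor * level for f, level in accept_psd.items())

accept = {20: 0.01, 100: 0.04, 2000: 0.01}  # hypothetical acceptance spectrum
print(qual_margin_ok(accept, {20: 0.02, 100: 0.07, 2000: 0.02}))  # True
print(qual_margin_ok(accept, {20: 0.02, 100: 0.05, 2000: 0.02}))  # False: 0.05 < 0.0664
```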


[23] TABLE 3-I. LUNAR MODULE ACCEPTANCE VIBRATION TEST IN 1967.

                                      Quantity to   Not          Number vibrated at vibration level, g²/Hz
  Subsystem                           be vibrated   vibrated     <0.01   ≥0.01   ≥0.02   ≥0.03   ≥0.04
                                                                         <0.02   <0.03   <0.04
  --------------------------------------------------------------------------------------------------------
  Communication and data                   35          .            6       6       2       .      21
  Electrical power                         19          .           10       3       1       1       4
  Environmental control                     6          .            5       .       .       .       1
  Guidance, navigation, and control        16          .            4       .       7       1       4
  Propulsion                                7          1            4       .       .       .       2
  Reaction control                          2          .            .       1       .       .       1
  Crew provisions                           5          .            1       2       .       1       1
  Displays and controls                    22          .            4      12       2       2       2
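Tallying table 3-I shows how skewed toward the low levels the 1967 acceptance program was. The row data below are transcribed directly from the table; the totals are simple arithmetic on those figures.

```python
# Rows of table 3-I: (subsystem, quantity to be vibrated, not vibrated,
#                     <0.01, 0.01-0.02, 0.02-0.03, 0.03-0.04, >=0.04 g^2/Hz)
table_3_i = [
    ("Communication and data",            35, 0,  6,  6, 2, 0, 21),
    ("Electrical power",                  19, 0, 10,  3, 1, 1,  4),
    ("Environmental control",              6, 0,  5,  0, 0, 0,  1),
    ("Guidance, navigation, and control", 16, 0,  4,  0, 7, 1,  4),
    ("Propulsion",                         7, 1,  4,  0, 0, 0,  2),
    ("Reaction control",                   2, 0,  0,  1, 0, 0,  1),
    ("Crew provisions",                    5, 0,  1,  2, 0, 1,  1),
    ("Displays and controls",             22, 0,  4, 12, 2, 2,  2),
]

# Sanity check: each quantity equals not-vibrated plus the per-level counts.
assert all(sum(row[2:]) == row[1] for row in table_3_i)

total = sum(row[1] for row in table_3_i)
below_0_02 = sum(row[3] + row[4] for row in table_3_i)
print(total, below_0_02)  # 112 items, 58 of them tested below 0.02 g^2/Hz
```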


A special procedure governed mission-success equipment that could not be removed from LM 3 and LM 4 without serious impact. Similar equipment for later spacecraft underwent the new tests. Some of the test failures indicated the possibility of similar failures on LM 3 and LM 4. Depending on the failure probability and the impact of the failure, we either changed to newly tested equipment or accepted the risk.

A similar review of the CSM again found vibration levels too low to detect workmanship errors. A considerable number of the components experienced only single-axis sine-wave excitation. Of over 200 deliverable items of CSM equipment tabulated in table 3-II, the requirements for 80 were changed. 


Figure 3-1. Revised Apollo acceptance vibration test guidelines.


[24] TABLE 3-II. COMMAND AND SERVICE MODULE ACCEPTANCE VIBRATION TEST IN 1967.

                                      Quantity to   Not          Number vibrated at vibration level, g²/Hz
  Subsystem                           be vibrated   vibrated     <0.01   ≥0.01   ≥0.02   ≥0.03   ≥0.04
                                                                         <0.02   <0.03   <0.04
  --------------------------------------------------------------------------------------------------------
  Communication and data                   33          .           11       3       1      16       2
  Electrical power                         24         10            3       .       .       .      11
  Environmental control                    65         64            .       1       .       .       .
  Guidance, navigation, and control        28          .           10       3       1       3      11
  Propulsion                               11          6            .       .       1       3       1
  Reaction control                          1          .            .       .       .       .       1
  Sequential events control                 7          .            1       2       2       .       2


The need for thermal or thermal vacuum testing as a tool for finding workmanship faults became apparent during the review of components for reacceptance vibration testing. The construction of certain items prevented vibration tests from revealing critical workmanship errors. As a result, some items were deleted from the acceptance vibration test list and required an acceptance thermal test. The review turned up several Apollo components subject to acceptance thermal tests for this reason. However, the review team found the criteria governing acceptance and qualification thermal testing and the relation between the two to be unacceptable.

A joint meeting of North American Rockwell, Grumman Aerospace, Massachusetts Institute of Technology, Boeing, and Manned Spacecraft Center representatives replaced the old standards shown in table 3-III with the new standards graphed in figure 3-2. The new guidelines called for 1-1/2 temperature cycles with a swing of 100 F or more, starting and ending at room temperature. The guidelines specified holding the test article at the two high-temperature and one low-temperature limits for 1 hour after the temperature had stabilized. The equipment should operate throughout the test and undergo continuous monitoring for continuity. It should pass complete functional tests immediately before and after the thermal test, and an adequate functional test after stabilization at the high and low temperatures. Equipment suspected of being adversely affected by temperature gradients should also complete functional tests during the two transitions between the high-temperature and low-temperature limits. An arbitrary decision set the acceptance test limits at 20 F less than the qualification test limits. Equipment normally cooled on a cold plate should be mounted on one during the test with the coolant entering the cold plate externally controlled to between 10 and 15 F cooler than the environment. Of approximately 260 LM and 215 CSM items of deliverable equipment reviewed, 70 LM and 55 CSM items required additional or new acceptance thermal tests to augment or take the place of the acceptance vibration tests.
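As a sketch, the new thermal guidelines can be expressed as a test profile. The function names and the sample room temperature are hypothetical; only the structure (1-1/2 cycles starting and ending at room temperature, 1-hour holds at two high and one low limit, a swing of at least 100 F, and acceptance limits 20 F inside the qualification limits) comes from the guidelines above.

```python
def acceptance_limits_f(qual_low_f, qual_high_f, offset_f=20.0):
    """Acceptance thermal limits sit 20 F inside the qualification limits."""
    return qual_low_f + offset_f, qual_high_f - offset_f

def thermal_profile(room_f, low_f, high_f, min_swing_f=100.0):
    """1-1/2 cycles: room -> high (hold 1 hr) -> low (hold 1 hr) ->
    high (hold 1 hr) -> room, with a swing of at least 100 F."""
    if high_f - low_f < min_swing_f:
        raise ValueError("temperature swing must be at least 100 F")
    return [("start, functional test", room_f),
            ("hold 1 hr, functional test", high_f),
            ("hold 1 hr, functional test", low_f),
            ("hold 1 hr, functional test", high_f),
            ("end, functional test", room_f)]

# Using the qualification environment limits from table 3-III (0 to 160 F):
low, high = acceptance_limits_f(0.0, 160.0)
print(low, high)                              # 20.0 140.0
print(len(thermal_profile(75.0, low, high)))  # 5 profile points
```

Continuous continuity monitoring, and functional tests during the transitions for gradient-sensitive equipment, would ride on top of this profile; they are omitted here to keep the structure visible.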


[25] TABLE 3-III. CRITERIA FOR LUNAR MODULE QUALIFICATION AND ACCEPTANCE THERMAL TESTING IN 1967.

  Parameter                 Cold-plate cooled     Radiation cooled
  -----------------------------------------------------------------
  Qualification
    Pressure, torr          1 × 10⁻⁵              1 × 10⁻⁵
    Temperature, F
      Root of flange        35 to 135             Not controlled
      Environment (a)       0 to 160              0 to 160

  Acceptance
    Pressure, torr          1 × 10⁻⁵              Ambient
    Temperature, F
      Root of flange        35 to 135             Not controlled
      Environment (b)       0 to 160              30 to 130

  (a) Equipment thermally isolated for 24 hours at each level.
  (b) Equipment thermally isolated for 4 hours at each level.


By December 1969, over 15 000 tests had been performed to the revised environmental acceptance test requirements. The results are shown in figures 3-3 and 3-4, and installation of acceptance-tested crew equipment for the command module is shown in figure 3-5. While workmanship errors accounted for the majority of failures, design deficiencies not revealed during the qualification tests caused more failures than expected. Because of the design problems and unavoidable test errors, the startup phase was painful. Individual failure rates often exceeded 25 percent. Some exceeded 50 percent. With many new components entering environmental acceptance test during the first year of the reacceptance test program, design and test error rates came down slowly. However, both finally reached approximately 1.5 percent and have stayed there for the past 9 months. Workmanship errors, on the other hand, have remained relatively constant at approximately 5 percent.


Figure 3-2. Revised Apollo acceptance thermal test guidelines.


[26]

Figure 3-3. Results of acceptance vibration tests for 11 447 tests of 166 different components.


Figure 3-4. Results of acceptance thermal tests for 3685 tests of 127 different components.

In retrospect, several recommendations and points of interest stand out from the test experience gained during this nation's three major manned spacecraft programs over the past 10 years. 

  1. Design and development testing plays an important part in the overall test plan. Perform it as early as possible. Document the results well, and hold the data for future reference. Pay particular attention to what seem minor details, especially for substitute parts and "explained" failures.

  2. Perform qualification testing at the highest possible level of assembly practical, within reasonable cost and schedule constraints.

  3. Before subjecting qualification test specimens to qualification test levels, acceptance test the specimens by following approved test procedures under strict change control. Include applicable environmental acceptance tests.

  4. Qualification test specimens should come from among normal production items manufactured and assembled using final blueprints and processes, under normal quality control, and with production tooling and handling fixtures.

  5. Make qualification tests rigorous and complete, yet realistic. A strong tendency exists to qualify equipment to the designer's desires rather than to the actual requirements. Where flight equipment will never leave controlled clean-room conditions and needs to operate only in outer space after launch, take care not to fall into the classical qualification test programs, which require such things as salt-water immersion, rain and dust, cyclic humidity, fungus, and other environmental extremes.

  6. Carefully document, track, explain, and take necessary corrective action on test failures encountered on all production hardware. Qualification test hardware, by definition and from bitter experience, must count as production hardware. No suspected failure encountered during any test on production hardware should escape from this rule, no matter how insignificant or unrelated the failure may seem at the time. Experience has shown that major failures always receive adequate attention. The minor unreported failure is the one that slips by and shows up late in the vehicle test cycle or, worst of all, in flight.

  7. Perform qualification vibration testing at excitation levels that provide a 50-percent margin of safety over the expected environments, including acceptance vibration testing. Because of weight and volume considerations, power consumption, or [28] other limiting factors, this margin may have to decrease. When this becomes evident, treat each case on an individual basis and make sure all parties involved understand.

  8. Thermal test for qualification to temperature limits at least 20 F outside the expected temperature limits, including acceptance thermal testing.

  9. Monitor the functioning of test specimens during the dynamic phases of qualification and environmental acceptance testing as completely as possible. Test all functions with 100-percent monitoring, including all redundant paths, before and after appropriate phases of the qualification and acceptance tests.

  10. Subject components of critical equipment that must function electrically or mechanically to environmental acceptance tests when the complete deliverable item cannot be visually inspected or functionally tested (or both) for design, manufacturing, assembly, handling, procedural, or workmanship errors.

  11. Determine environmental acceptance test requirements on an individual basis, considering what types and what levels of tests reveal quality or workmanship defects. Examine the failure history of each component during the engineering and development stages and during qualification testing. Look at the failure history of like components using similar manufacturing and assembly techniques, particularly those made by the same vendor, in the same plant, and with the same people.

  12. Carry out environmental acceptance testing at the lowest practical level of assembly. For example, it is much better to find the solder balls in a sealed relay before building it into an assembly than to cope with an intermittent system failure on the launch pad. The earlier one uncovers a problem and eliminates it from downstream hardware, the less its impact on the overall program.

  13. Carefully examine changes of any nature to the hardware for their effect on qualification and acceptance test results. Qualification by similarity can give very misleading results and should take place only with full knowledge of all parties concerned. In the case of acceptance testing, the simple act of removing a cover plate for inspection purposes constitutes potential grounds for a reacceptance test.

  14. Total vehicle environmental acceptance tests are desirable. However, tests of this nature become virtually impossible to perform on manned spacecraft. Thorough qualification of spacecraft components, including wiring and tubing installations, combined with proper environmental acceptance tests of the equipment before installation, has thus far assured mission success.

  15. Always retest after changes to the hardware or software have been made. Set up rigorous controls to assure it.

  16. When possible, test all functions and paths on the installed systems at least once prior to delivery to the launch site. As a general rule, when changes or replacements require retesting, do it at the factory. Pre-launch testing at the launch site should demonstrate total space-vehicle and launch-complex compatibility and readiness, not simply prove the adequacy of a given component or single system.
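The numerical margins in recommendations 7 and 8 can be captured in a short checker. This is an illustrative sketch, not flight software: the function names and sample values are invented, and the thermal example pairs the qualification environment limits from table 3-III with hypothetical expected limits.

```python
def vibration_margin_ok(expected_psd, qual_psd, margin=1.5):
    """Recommendation 7: qualification excitation should carry a 50-percent
    margin over the expected environment at every frequency (g^2/Hz)."""
    return all(qual_psd[f] >= margin * level for f, level in expected_psd.items())

def thermal_margin_ok(exp_low_f, exp_high_f, qual_low_f, qual_high_f, margin_f=20.0):
    """Recommendation 8: qualification temperature limits should sit at
    least 20 F outside the expected limits on both ends."""
    return qual_low_f <= exp_low_f - margin_f and qual_high_f >= exp_high_f + margin_f

print(vibration_margin_ok({100: 0.04}, {100: 0.07}))  # 0.07 >= 0.06 -> True
print(thermal_margin_ok(30.0, 130.0, 0.0, 160.0))     # 0 <= 10 and 160 >= 150 -> True
```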


[27]

Figure 3-5. Installation of the acceptance-tested crew equipment in the Apollo command module at the NASA Manned Spacecraft Center.

[29] At the start of a program, devise a thorough overall integrated test plan that includes all testing (including engineering and development, qualification, reliability and life, pre-delivery environmental acceptance, pre-installation acceptance, installed-system, altitude, pre-launch, and early unmanned flight tests). The plan should include as much testing as necessary to gain confidence in the hardware, the software, the test equipment, the test procedures, the launch procedures, and the flight crew procedures. The plan should provide for deleting unnecessary phases of testing as confidence grows.

We believe these measures have proved themselves in the Apollo Program. By calculating from the design and workmanship failure rates during reacceptance tests, the program corrected or removed before launch approximately 65 potential spacecraft pilot-safety or mission-critical hardware failures per flight. Some faults remained. Each of the first 10 vehicles flown on the first six manned Apollo missions experienced approximately eight hardware failures. But fewer than two failures per vehicle stemmed from workmanship or quality. Of the total flight failures from these two causes, better or more thorough acceptance testing could conceivably have revealed only five. Also, no evidence in the flight-failure history indicates a failure caused by too much testing.

The real effectiveness of the test program comes out in examining the results of hardware failures during the first six manned Apollo flights. None of the flight failures affected pilot safety or mission success.
