2005 MAPLD International Conference
Ronald Reagan Building and International Trade Center
Washington, D.C.
September 6, 2005
Design Integrity
Seminar Leaders:
- John Stone, Southwest Research Institute
- Shahana Aziz, Goddard Space Flight Center
- and a panel of six (more information below)
Summary
This seminar is designed to provide conceptual and concrete techniques for ensuring that aerospace designs have “integrity” -- that is, that they are definable, verifiable, maintainable, and efficient; and most importantly, that they WORK! The seminar comprises three sessions. The first session is in two parts: the first part focuses on specification and design issues, the second on verification and validation (including related techniques). The second session discusses power and signal integrity issues. The third session is a panel discussion on FPGA verification, with the opportunity for audience participation.
Logistics
Location: Hemisphere A
Time: 9:00 am to 5:30 pm.
Seminar Schedule
- Start: 9:00 am
- Session 1, Part 1: 9:00 am to 10:30 am
- Break: 10:30 am to 10:45 am
- Session 1, Part 2: 10:45 am to 12:30 pm
- Lunch: 12:30 pm to 1:30 pm
- Session 2: 1:30 pm to 3:30 pm
- Break: 3:30 pm to 3:45 pm
- Session 3: 3:45 pm to 5:30 pm
- End: 5:30 pm
Session 1, Part 1 – Design Integrity Concepts
- Consistent terminology, consistent results. (Basic Definitions)
- What does it have to do? (Specifying a design)
- Letting constraints work FOR you. (Proportional design)
- More than the sum of its parts. (Partitioning a design)
- You mean we’re still working on it? (Sustaining a design)
Session 1, Part 2 – Design Integrity Concepts (continued)
- What’s the exit strategy? (Verifying a design) – verification early and often, tiered verification techniques
- What do you mean, it doesn’t do what we thought? (Validating a design) – contributing techniques and test set characteristics
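The idea of tiered verification can be sketched with a small executable model. This is a hypothetical illustration only, written in Python rather than an HDL testbench: the same design element (a 2-of-3 majority voter) is checked exhaustively in isolation first, and then again inside the larger redundant-register structure that uses it.

```python
# Hypothetical sketch of tiered verification (Python stand-in for an
# HDL testbench).  The names FifoModel-style details are illustrative.

def majority(a, b, c):
    """2-of-3 majority voter, one bit wide."""
    return (a & b) | (a & c) | (b & c)

def tmr_register(value, upset_copy=None):
    """Triple-redundant 1-bit register; optionally corrupt one copy
    to model a single-event upset."""
    copies = [value, value, value]
    if upset_copy is not None:
        copies[upset_copy] ^= 1  # flip one redundant copy
    return majority(*copies)

# Tier 1: exhaustive unit-level check of the voter (all 8 vectors).
for a in (0, 1):
    for b in (0, 1):
        for c in (0, 1):
            assert majority(a, b, c) == (1 if a + b + c >= 2 else 0)

# Tier 2: system-level check -- any single upset must be masked.
for v in (0, 1):
    for hit in (None, 0, 1, 2):
        assert tmr_register(v, hit) == v
```

The point of the tiers is that the unit-level check is cheap and exhaustive, so the system-level check can concentrate on interactions rather than re-proving the component.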
Session 2 – It's Still an Analog World - Signal and Power Integrity
- Introduction: What is signal and power integrity?
- Common signal integrity considerations
- Common power integrity considerations
- Guaranteeing the design – simulation techniques
- Simulation examples
- Comparison examples
- PCB design methodology
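Two of the quick calculations behind these topics can be illustrated in a few lines. This is a back-of-the-envelope sketch, not part of the seminar material, and the numeric values below are hypothetical examples of the kind a full signal/power integrity simulation would refine.

```python
# Back-of-the-envelope signal and power integrity estimates.
# Example values (50-ohm line, 3.3 V rail, 5% ripple, 2 A transient)
# are hypothetical.

def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient at a load on a Z0 line."""
    return (z_load - z0) / (z_load + z0)

def target_impedance(v_rail, ripple_fraction, i_transient):
    """Maximum power-distribution impedance that keeps rail ripple
    within spec for a given load-current transient."""
    return v_rail * ripple_fraction / i_transient

# An unterminated CMOS input (very high impedance) reflects nearly
# the full incident wave:
print(round(reflection_coefficient(1e6), 3))   # ~1.0
# A matched termination reflects nothing:
print(reflection_coefficient(50.0))            # 0.0
# 3.3 V rail, 5% ripple budget, 2 A transient -> 82.5 milliohms:
print(target_impedance(3.3, 0.05, 2.0))        # 0.0825
```

Numbers like these set the requirements that the simulation techniques in this session then verify against extracted board parasitics.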
Session 3 - FPGA verification approaches
An open discussion with the panelists listed below.
Charlie Howard - Southwest Research Institute
Wesley A. Powell - NASA Goddard Space Flight Center
Ronnie Killough - Southwest Research Institute
Dave Moser - BAE Systems
Jim Westfall - University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP)
Rod Barto - NASA Office of Logic Design
Questions for discussion (as many as time allows):
How do we reproduce FSW-type interactions (in simulations or hardware tests) when the FSW is in flux (even after launch)? (Especially hard in the presence of interrupt service routines.)
Can all flow-control-inducing cases be covered during verification? Can they even be envisioned?
What methodology is used to formulate worst-case scenario tests (e.g., all the relays on at once, what happens if device "A" is reset, etc.)?
Given the increasing size of today's FPGAs and the growing complexity of the designs that must be implemented in them, there is desire to use high-level design tools and IP Cores to streamline the application development process. However, there is uncertainty over the quality of the designs these tools produce. To what extent can the large and complex designs produced by these tools be verified?
What constraints need to be placed on where and how designs developed by these tools should be used?
With tools now available that can automatically instantiate redundancy in a user design, to what extent can and should the output of these tools be verified?
What level of requirements analysis and capture is appropriate to apply to the specification of functionality to be implemented in an FPGA?
What approaches in software design can/should be applied to the design of an FPGA?
What are appropriate levels of testing in the context of an FPGA?
How do you determine what features should be tested in the FPGA and which ones should be tested in simulation?
How have assertion-based verification methods affected your verification processes?
Are you using formal verification with FPGA designs, and if so, what is your experience?
There are a number of programmatic factors that play into when you start testing a new FPGA design at the board level. Start testing too early and the software and test engineers may end up spinning their wheels while the hardware folks are debugging the design. Start too late and they might spin their wheels waiting for any kind of hardware to test. Where is the “sweet spot” between extensive simulation and analysis and extensive board level testing?
Over the last few years, as we’ve made the transition (at LASP) from schematic-based designs to VHDL-based designs, it appears that we are killing fewer FPGAs during the development cycle. Is this experience duplicated at other organizations? Why would this be the case? Are we getting better at hiring good engineers? Does the software development model used in VHDL designs provide better insight and review capability than schematic entry? Are the simulation, verification, and test bench capabilities easier to use or more accurate?
We’ve successfully used Actel’s ProASIC parts to “breadboard” and verify everything from simple logic modules all the way up to complex, multiple FPGA designs where the discrete FPGAs and interfaces are partitioned in the test device. What are the limits to using reprogrammable FPGAs in the verification process? How comfortable are we extending this to any reprogrammable device even if the evaluation development environment differs from the flight development environment? For example, how successful are verifications performed when the breadboard uses Vendor A parts and tools and the flight uses parts and tools from Vendor B?
At one time, the digital design paradigm was to use proven components from data books and build systems upward. Similarly, in the software world, Ada programmers could use the proven Booch components to build software systems. Why has the paradigm of starting with proven components and building upward not carried over to VHDL designs?
Given that a digital design actually creates a three-dimensional object with complex temporal relationships between its constituent parts, why has the dominant design paradigm become a text description in which the details of connectivity and temporal relationships are suppressed?
How can the space electronics community be convinced that reviewability and transferability are important attributes of a good design?
Under what circumstances can we create a design which is correct and verifiable by inspection?
Has simulation become “too” important in verifying high-reliability design implementations?
Should there be an "independent verification and validation team?" Does the answer to this question depend on the size/complexity of the design?
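Several of the questions above (covering flow-control-inducing cases, formulating worst-case tests, deciding what to exercise in simulation) touch on coverage-driven random testing. The sketch below is one illustrative way to frame that idea in Python; the toy FIFO model and all names in it are hypothetical, standing in for a real design under test.

```python
# Sketch of coverage-driven random testing against a toy FIFO model.
# The model and biases are illustrative, not any panelist's method.
import random
from collections import Counter

class FifoModel:
    def __init__(self, depth=4):
        self.depth, self.q = depth, []
    def push(self, x):
        if len(self.q) < self.depth:
            self.q.append(x)
            return True
        return False            # flow control: FIFO full, push refused
    def pop(self):
        return self.q.pop(0) if self.q else None

random.seed(0)                  # reproducible stimulus
dut, coverage = FifoModel(), Counter()
for _ in range(1000):
    if random.random() < 0.6:   # bias toward pushes to reach "full"
        dut.push(random.randrange(256))
    else:
        dut.pop()
    # Functional coverage: record which occupancy levels were exercised.
    coverage[len(dut.q)] += 1

# The flow-control corners (empty and full) should both have been hit;
# if not, the stimulus biases need adjusting.
assert 0 in coverage and dut.depth in coverage
```

The coverage counter makes the "can all cases be envisioned?" question measurable: cases that were never hit show up as holes, prompting either re-biased stimulus or a directed test.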
Charlie Howard is a Senior Research Engineer at Southwest Research Institute and brings a diverse background in defense electronics, access telephony, core (IP) routing hardware, and space-flight avionics design. He has significant expertise in design of both FPGAs and ASICs and is skilled in both Verilog and VHDL.
At SwRI, Mr. Howard has provided multiple FPGA designs for the Orbital Express and Kepler avionics boxes and is the lead engineer for the Command and Telemetry circuitry contained therein. For Orbital Express, he directed the board test team and provided system integration support through delivery of flight hardware. He has experience in all aspects of complex electronics design, including product formulation, implementation, analysis, test, and support.
Wesley Powell is the Associate Head of the Microelectronics and Signal Processing Branch at the NASA Goddard Space Flight Center. He holds a B.S. degree in electrical engineering from the University of Maryland and M.S. degrees in electrical engineering and systems engineering from the Johns Hopkins University. He was lead engineer for the Transponder Remote Services Node (XRSN) on the Microwave Anisotropy Probe (MAP) mission, completing the design of the Uplink and Downlink FPGAs. He also led the development of a reconfigurable computing application to perform onboard signal extraction for the airborne Multi-kiloHertz MicroLaser Altimeter (MMLA). He has also designed a number of smaller FPGAs for ground support equipment.
Mr. Powell's current interests include investigating methodologies for using the Xilinx Virtex-II Pro for spaceflight applications, reconfigurable computing technology and applications, and radiation-tolerant microelectronics.
Ronnie Killough is currently Director of the Communications and Embedded Systems Department at Southwest Research Institute. Mr. Killough has experience in real-time embedded systems, satellite flight software systems, networking and communication interface protocols, open systems architectures, compilers and computer language design, and model integrated computing.
Mr. Killough served as Software Systems Manager for the Imager for Magnetopause to Aurora Global Exploration (IMAGE) satellite and ground systems, and was directly involved with the development of the flight software for the satellite's Central Instrument Data Processor. Mr. Killough also managed the development and verification of flight software for the two narrow-field instruments on the Swift gamma ray burst explorer. Mr. Killough has extensive experience in project management and software and systems engineering, and routinely serves on independent review boards for NASA missions.
Mr. Killough has a Bachelor's degree in Computer Science from Angelo State University, and a Master's degree in Computer Science/Electrical Engineering from Texas A&M University.
Dave Moser is a Senior Principal Engineer at BAE Systems in Manassas, Virginia. He has a background in FPGA and ASIC design, mainly for space applications. His overall experience ranges from system engineering and software development to physical design of ASICs. He was the lead engineer for several ASICs with a wide range of functions, such as signal processing, bus protocol bridging, and microprocessors. He has led several FPGA designs ranging in size from 5k gates to 2 million gates. He has experience in design and verification methodology development, and directed the creation of a methodology for developing ASICs with a parallel FPGA targeting path while minimizing parallel code development. He has also worked on several box-level architectures, including two narrow-channel data switches (LEO1 and MUOS).
Jim Westfall is a Senior Professional Research Assistant at the University of Colorado’s Laboratory for Atmospheric and Space Physics (LASP). He is the Electrical Engineering functional lead, currently overseeing the efforts of about a dozen electrical engineers, the EEE parts group, and an electrical drafter. In addition, Mr. Westfall is the Chief Engineer for the AIM SMEX mission.
Mr. Westfall has been involved in all aspects of instrument design at LASP since 1979. Instrumentation he has developed includes low level signal detection, low and high voltage power supplies, precision mechanism controllers, and instrument interface and control electronics. He was the Spacecraft Manager and Systems Engineer for the Student Nitric Oxide Explorer (SNOE) spacecraft that was developed for the Student Explorer Demonstration Initiative (StEDI) program managed by USRA.
Over the last 10 years, Mr. Westfall has gradually become more involved in electrical engineering management and systems engineering (i.e. more management and less real engineering).
Rod Barto is a consultant with the NASA Office of Logic Design at NASA/GSFC. He received his M.S. in Mathematics from U.T. El Paso and his Ph.D. in Electrical Engineering from U.T. Austin, then began his spacecraft electronics career as a designer on the Galileo ADCS computer, for which he received the NASA Public Service Medal. He has since been a designer and analyst on Magellan, NASA Scatterometer, INTELSAT telecommunications satellites, and numerous other NASA spacecraft, working with ADCS, telemetry and command systems, and science instruments. He has also been a researcher and designer of high performance computer systems including fault-tolerant computer architectures, logic simulation accelerators, and artificial intelligence applications.
Last Revised:
February 03, 2010
Digital Engineering Institute
Web Grunt: Richard Katz