Approach

We have developed a framework called DXF that allows systematic comparison and evaluation of diagnostic algorithms under identical experimental conditions. Its key components are representation languages for describing the physical system, the sensor data, and the diagnosis results; a runtime architecture for executing diagnostic algorithms on diagnostic scenarios; and an evaluation component that computes performance metrics from the results of diagnostic algorithm execution.
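
As a rough sketch of how these components fit together at run time (the figure below shows the actual architecture), the snippet feeds a scenario's sensor snapshots to a diagnostic algorithm, archives its diagnoses, and scores the archive. Every type, function, and field name here is an assumption made for illustration; the framework's real representation languages are XML-based formats, not Python types.

    from dataclasses import dataclass

    # Illustrative stand-ins for the framework's representation languages;
    # the names and fields are assumptions, not the actual formats.
    @dataclass
    class SensorSnapshot:
        timestamp: float
        values: dict        # sensor id -> reading

    @dataclass
    class Diagnosis:
        timestamp: float
        faults: frozenset   # component ids the DA currently suspects

    def run_scenario(da, scenario):
        """Runtime architecture in miniature: drive one DA through one
        scenario (a time-ordered list of snapshots), archiving each
        diagnosis it emits."""
        return [da.on_sensor_data(snapshot) for snapshot in scenario]

    def evaluate(archive, injected_faults):
        """Evaluation component in miniature: one toy metric, the fraction
        of archived diagnoses that exactly match the injected fault set."""
        hits = sum(1 for d in archive if d.faults == injected_faults)
        return hits / len(archive) if archive else 0.0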

[Figure: The architecture of the benchmarking framework]

Setting up the framework to compare and evaluate a selected set of diagnostic algorithms on a specific physical system proceeds as follows:

  1. The system is specified in an XML file called the System Catalog. The catalog includes the system’s components, connections, each component’s operating modes, and a textual description of the component’s behavior in each mode (a hypothetical catalog fragment appears after this list).
  2. The set of sensor points is chosen, and sample data for nominal and fault scenarios are generated.
  3. Diagnostic algorithm (DA) developers use the System Catalog and sample data to implement their algorithms against a predefined Application Programming Interface (API) for receiving sensor data and sending back diagnosis results (a minimal adapter sketch follows this list).
  4. A set of test scenarios (nominal and faulty) is selected to evaluate the DAs.
  5. The run-time architecture is used to execute the DAs on the selected test scenarios in a controlled experimental setting, and the diagnosis results are archived.
  6. Selected metrics are computed by comparing the actual scenarios against the diagnosis results produced by the DAs. These primary metrics are then used to compute secondary metrics (a sketch of two example metric computations also follows the list).
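
Step 1’s System Catalog is an XML file. The fragment below is a hypothetical example of the content that step describes — components, connections, operating modes, and a textual behavior description per mode — read with Python’s standard library. The tag and attribute names are assumptions, not the framework’s actual schema.

    import xml.etree.ElementTree as ET

    # A hypothetical System Catalog fragment; the element and attribute
    # names are illustrative, not the real schema.
    CATALOG = """\
    <systemCatalog name="demo-circuit">
      <component id="breaker1" type="CircuitBreaker">
        <mode id="nominal">Conducts current when commanded closed.</mode>
        <mode id="stuckOpen">Never conducts, regardless of command.</mode>
      </component>
      <component id="voltmeter1" type="VoltageSensor">
        <mode id="nominal">Reports the voltage at its connection point.</mode>
        <mode id="stuck">Output frozen at the last good reading.</mode>
      </component>
      <connection from="breaker1" to="voltmeter1"/>
    </systemCatalog>
    """

    root = ET.fromstring(CATALOG)
    for comp in root.findall("component"):
        modes = ", ".join(m.get("id") for m in comp.findall("mode"))
        print(f"{comp.get('id')} ({comp.get('type')}): {modes}")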
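
Step 3’s API is only named here, not specified, so the adapter below assumes a single callback-style entry point that receives sensor data and returns a diagnosis. The threshold logic is a deliberately trivial stand-in for a real diagnostic algorithm, and none of these names come from the framework itself.

    from dataclasses import dataclass

    @dataclass
    class SensorData:
        timestamp: float
        values: dict              # sensor id -> reading

    @dataclass
    class Diagnosis:
        timestamp: float
        faults: frozenset         # suspected components (empty = nominal)

    class ThresholdDA:
        """A deliberately trivial DA: a sensor reading outside its nominal
        bounds implicates the component that sensor monitors."""

        def __init__(self, bounds, sensor_to_component):
            self.bounds = bounds                      # sensor id -> (low, high)
            self.sensor_to_component = sensor_to_component

        def on_sensor_data(self, data: SensorData) -> Diagnosis:
            suspects = frozenset(
                self.sensor_to_component[sid]
                for sid, value in data.values.items()
                if sid in self.bounds
                and not (self.bounds[sid][0] <= value <= self.bounds[sid][1])
            )
            return Diagnosis(data.timestamp, suspects)

    # Usage: a 12 V bus read by voltmeter1; a 0 V reading implicates breaker1.
    da = ThresholdDA({"voltmeter1": (11.0, 13.0)}, {"voltmeter1": "breaker1"})
    print(da.on_sensor_data(SensorData(0.5, {"voltmeter1": 0.0})))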

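For step 6, the sketch below shows what two primary-metric computations might look like, working from plain (timestamp, fault set) pairs taken from an archived diagnosis stream. The metric names and definitions are illustrative assumptions, not the framework’s actual metric suite.

    def detection_time(injection_time, archive):
        """Delay from fault injection to the first non-empty diagnosis at
        or after the injection, or None if the fault is never detected."""
        for timestamp, faults in archive:
            if faults and timestamp >= injection_time:
                return timestamp - injection_time
        return None

    def isolation_score(true_faults, archive):
        """Jaccard overlap between the injected fault set and the final
        diagnosis: 1.0 means exact isolation, 0.0 means no overlap."""
        final = archive[-1][1] if archive else frozenset()
        if not true_faults and not final:
            return 1.0
        return len(true_faults & final) / len(true_faults | final)

    # Toy scenario: a breaker1 fault injected at t = 2.0, diagnosed at t = 2.5.
    archive = [(1.0, frozenset()), (2.5, frozenset({"breaker1"}))]
    print(detection_time(2.0, archive))                       # 0.5
    print(isolation_score(frozenset({"breaker1"}), archive))  # 1.0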