Date of Completion

6-30-2020

Embargo Period

6-30-2020

Advisors

Prof. George Bollas, Prof. Ranjan Srivastava, Prof. Douglas Cooper

Field of Study

Chemical Engineering

Degree

Master of Science

Open Access

Open Access

Abstract

Methods for Fault Detection and Isolation (FDI) in systems with uncertainty have been studied extensively, owing to the increasing value and complexity of the maintenance and operation of modern Cyber-Physical Systems (CPS). CPS are characterized by nonlinearity, environmental and system uncertainty, fault complexity, and highly nonlinear fault propagation, all of which require advanced fault detection and isolation algorithms. Therefore, modern efforts develop active FDI (methods that require system reconfiguration) based on information theory, designing tests rich in information for fault assessment. Information-based criteria for test design are often deployed as a Frequentist Optimal Experimental Design (FOED) problem, which utilizes the information matrix of the system. D- and Ds-optimality criteria for the information matrix have been used extensively in the literature, since they usually yield more robust test designs that are less susceptible to uncertainty. However, FOED methods provide only locally informative tests, as they find optimal solutions in a neighborhood of anticipated values for the system uncertainty and fault severity. Bayesian Optimal Experimental Design (BOED), on the other hand, overcomes the issue of local optimality by exploring the entire parameter space of the system and can thus provide robust test designs for active FDI. The literature on BOED for FDI is limited and mostly examines the case of normally distributed parameter priors. In some cases, such as newly installed systems for which existing knowledge about the parameters is limited, a more general inference can be derived by using uniform distributions as parameter priors.
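
For reference, the standard alphabetic criteria can be sketched as follows, assuming H(φ) denotes the Fisher information matrix of the system for a test design φ, with the parameters partitioned into faults of interest θ_s and nuisance parameters θ_r; this notation is illustrative and not quoted from the thesis.

```latex
% D-optimality: maximize the determinant of the Fisher information matrix H(\varphi)
\varphi_{D}^{*} = \arg\max_{\varphi} \; \det H(\varphi)

% Ds-optimality: with \theta = [\theta_s, \theta_r] and H partitioned conformably,
% maximize the information on the parameters of interest \theta_s alone
\varphi_{D_s}^{*} = \arg\max_{\varphi} \; \frac{\det H(\varphi)}{\det H_{rr}(\varphi)}
```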

In BOED, an optimal design is found by maximizing an expected utility based on observed data. There is a plethora of utility functions, but the choice of utility function affects both the robustness of the solution and the computational cost of BOED. For instance, BOED based on the Fisher information matrix can use an alphabetic criterion, such as D- or Ds-optimality, as its objective function, but this increases the computational cost of the optimization, since these criteria require sensitivity analysis of the system model. On the other hand, when an observation-based method, such as the Kullback-Leibler divergence from posterior to prior, is used to make inference on the parameters, the expected utility calculation involves nested Monte Carlo estimation, which in turn affects computation time. The challenge in these approaches is to find an adequate but relatively low Monte Carlo sampling rate without introducing significant bias in the result. Theory shows that, for normally distributed parameter priors, the Kullback-Leibler divergence expected utility reduces to Bayesian D-optimality; similarly, Bayesian Ds-optimality can be used when the parameter priors are normally distributed. In this thesis, we verify this theory on a three-tank system, using normally and uniformly distributed parameter priors to compare the Bayesian D-optimal design criterion and the Kullback-Leibler divergence expected utility. Nevertheless, there is no observation-based metric analogous to Bayesian Ds-optimality when the parameter priors are not normally distributed.
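
A minimal sketch of the nested Monte Carlo estimator of the expected Kullback-Leibler utility is given below. It is written for a hypothetical scalar linear-Gaussian measurement model rather than the three-tank system of the thesis; the prior, likelihood, noise level, and design grid are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy problem (illustrative only; not the thesis' three-tank model) ---
# Prior over a scalar parameter theta: uniform on [0.5, 1.5]
def sample_prior(n):
    return rng.uniform(0.5, 1.5, size=n)

# Measurement model: y = theta * d + Gaussian noise with standard deviation SIGMA
SIGMA = 0.1

def likelihood(y, theta, d):
    """p(y | theta, d) for the toy linear-Gaussian measurement model."""
    return np.exp(-0.5 * ((y - theta * d) / SIGMA) ** 2) / (SIGMA * np.sqrt(2 * np.pi))

def expected_kl_utility(d, n_outer=500, n_inner=500):
    """Nested Monte Carlo estimate of the expected KL divergence from posterior
    to prior (expected information gain) for design d:
        U(d) ~ (1/N) * sum_i [ log p(y_i | theta_i, d)
                               - log( (1/M) * sum_j p(y_i | theta_j, d) ) ]
    """
    theta_outer = sample_prior(n_outer)
    y = theta_outer * d + rng.normal(0.0, SIGMA, size=n_outer)  # simulate observations
    theta_inner = sample_prior(n_inner)

    log_lik = np.log(likelihood(y, theta_outer, d))             # log p(y_i | theta_i, d)
    # Inner Monte Carlo loop: marginal likelihood (evidence) estimate for each y_i
    evidence = likelihood(y[:, None], theta_inner[None, :], d).mean(axis=1)
    return np.mean(log_lik - np.log(evidence))

# Evaluate candidate designs and pick the most informative one
designs = np.linspace(0.1, 2.0, 20)
utilities = [expected_kl_utility(d) for d in designs]
print("best design:", designs[int(np.argmax(utilities))])
```

Increasing the outer and inner sample sizes reduces the bias and variance of the estimator at the cost of computation time, which is the trade-off discussed above.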

The main objective of this thesis is to derive an observation-based utility function, analogous to Ds-optimality, that can be used even when the requirement for normally distributed priors is not met. We begin with a formal comparison of FOED and BOED for different objective metrics, focusing on the impact different utility functions have on the optimal design and on their computation time. The value of BOED is illustrated using a variation of the benchmark three-tank system as a case study, for which we also present the deterministic variance of the optimal design under the different utility functions. The performance of the various BOED utility functions and of the corresponding FOED optimal designs is compared in terms of the Hellinger distance. The Hellinger distance is a distribution metric bounded between 0 and 1, where 0 indicates complete overlap of the distributions and 1 indicates that the distributions share no common points. Analysis of the Hellinger distances calculated for the benchmark system shows that BOED designs better separate the distributions of the system measurements and, consequently, can classify the fault scenarios and the no-fault case with less uncertainty. When a uniform distribution is used as a parameter prior, the observation-based utility functions give better designs than FOED and Bayesian D-optimality, which use the Fisher information matrix. The observation-based method analogous to Ds-optimality finds a better design than the observation-based method analogous to D-optimality, but it is computationally more expensive. The computational cost can be lowered by reducing the Monte Carlo sampling, but if the sampling rate is reduced too far, the resulting solution surface becomes uneven, affecting the FDI test design and assessment. Based on the results of this analysis, future research should focus on decreasing the computational cost without compromising the robustness of the test design.
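
For completeness, the standard definition of the Hellinger distance between two measurement densities p and q is sketched below; the notation is assumed, not quoted from the thesis.

```latex
% Squared Hellinger distance between densities p and q (standard definition):
H^{2}(p, q) = \frac{1}{2} \int \left( \sqrt{p(y)} - \sqrt{q(y)} \right)^{2} dy
            = 1 - \int \sqrt{p(y)\, q(y)} \, dy ,
\qquad 0 \le H(p, q) \le 1 .
% H = 0 when the measurement distributions coincide (fault cases indistinguishable);
% H = 1 when they share no common support (fault cases fully separable).
```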

Major Advisor

Prof. George Bollas
