Motivation Decision-making plays a central role in engineering design. The decisions made throughout the design process need to be informed in order to ensure that design requirements are met. Processes such as engineering optimization, multi-disciplinary analysis and optimization, design space exploration, and sensitivity analysis are used to inform these decisions. Applying them directly to the original computer code may become impractical because they require many “online” evaluations of the underlying computer analyses. Surrogate models are enablers for such approaches: they replace the original analysis with a virtually free-to-evaluate mathematical model whose training only incurs a one-time “offline” cost.
Challenges The two main observations that motivated my doctoral research are that 1) computationally expensive computer simulations and 2) high-dimensional independent parameter spaces (more simply put, input spaces) are increasingly common in engineering design. The first observation poses a challenge because it limits the number of analysis observations that may be gathered to build surrogate models under a limited computational budget. The second observation leads to the curse of dimensionality, which, in the context of surrogate modeling, means that increasingly many observations are needed to build accurate surrogate models. While methods exist to address each problem individually, addressing both at once remains a challenge.
Part 1: Fully Bayesian Approach First, I proposed a fully Bayesian approach to approximation by ridge functions. Approximation by ridge functions may be thought of as first projecting inputs onto a low-dimensional subspace of the original input space, and then using these projected inputs to create a mapping. By a fully Bayesian approach, I mean that distributions over all parameters, including the orthogonal projection matrix, are inferred rather than point values of these parameters being optimized. I also investigated ways to determine the dimension of the low-dimensional subspace onto which the inputs are projected. I showed that the proposed approach outperforms existing state-of-the-art methods in terms of predictive accuracy on a set of benchmark engineering datasets.
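The ridge structure itself can be illustrated with a minimal sketch. The example below is purely hypothetical: it assumes the projection direction w is known and uses a simple polynomial for the low-dimensional mapping, whereas the thesis infers a posterior distribution over the projection matrix and uses a Gaussian process for the mapping. The point is only to show the "project, then map" structure of a ridge approximation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 20-dimensional input space whose response
# actually varies along a single ridge direction w (so r = 1).
d, n = 20, 50
w = rng.standard_normal(d)
w /= np.linalg.norm(w)                  # unit-norm projection vector

X = rng.standard_normal((n, d))         # training inputs
y = (X @ w) ** 3 - (X @ w)              # ridge function: y = g(w^T x)

# Approximate g with a cubic polynomial in the projected coordinate z;
# the thesis uses a Gaussian process here instead.
z = X @ w                               # project onto the subspace
coeffs = np.polyfit(z, y, deg=3)

# Predictions use only the 1-D projection of each test point.
X_test = rng.standard_normal((5, d))
y_pred = np.polyval(coeffs, X_test @ w)
y_true = (X_test @ w) ** 3 - (X_test @ w)
```

Because the mapping depends on x only through w^T x, the number of observations needed scales with the subspace dimension rather than with d, which is what makes ridge approximations attractive under the curse of dimensionality.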
Part 2: Multi-Fidelity Extension Second, I extended this approach to the multi-fidelity context using deep multi-fidelity Gaussian processes and showed that, under certain circumstances, it is beneficial to leverage both low- and high-fidelity evaluations when both are available. I also investigated whether the low-dimensional subspaces should be the same, similar, or different for the low- and high-fidelity mappings, and found that the best predictive accuracy is obtained when they are the same.
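A stripped-down sketch of the shared-subspace idea follows. It is not the deep multi-fidelity Gaussian process of the thesis: it uses a simple two-stage scheme (low-fidelity trend plus high-fidelity discrepancy, both polynomial) with an assumed, known projection direction w and made-up fidelity functions f_lo and f_hi. It only illustrates why fitting both fidelity levels in the same low-dimensional subspace lets a few expensive runs correct many cheap ones.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: both fidelities share one ridge direction w;
# low-fidelity runs are plentiful, high-fidelity runs are scarce.
d = 20
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

f_lo = lambda z: z**3 - z               # cheap, approximate model
f_hi = lambda z: z**3 - z + 0.5 * z**2  # expensive model with a discrepancy

X_lo = rng.standard_normal((60, d)); y_lo = f_lo(X_lo @ w)
X_hi = rng.standard_normal((10, d)); y_hi = f_hi(X_hi @ w)

# Stage 1: fit the low-fidelity trend in the shared 1-D subspace.
c_lo = np.polyfit(X_lo @ w, y_lo, deg=3)

# Stage 2: fit the high-fidelity discrepancy in the SAME subspace,
# mirroring the finding that identical subspaces work best.
resid = y_hi - np.polyval(c_lo, X_hi @ w)
c_disc = np.polyfit(X_hi @ w, resid, deg=2)

def predict(X):
    z = X @ w
    return np.polyval(c_lo, z) + np.polyval(c_disc, z)

X_test = rng.standard_normal((5, d))
err = float(np.max(np.abs(predict(X_test) - f_hi(X_test @ w))))
```

Only ten high-fidelity runs are needed here because they are spent on the low-dimensional discrepancy alone, while the 60 cheap runs pin down the overall trend.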
Part 3: Sampling Leveraging the Low-Dim. Space Finally, I studied ways to leverage the uncovered low-dimensional space when making new evaluations of the expensive analysis, for both design of experiments and adaptive sampling. I found that leveraging knowledge of the low-dimensional space when performing a design of experiments improves the resulting model’s predictive accuracy, especially for higher-dimensional input spaces (50 to 100 dimensions) and when few observations are available.
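One simple way to exploit the subspace for a design of experiments is to spread candidate points out in the projected coordinate rather than in the full input space. The sketch below is an illustrative assumption, not the thesis procedure: it assumes a known direction w and uses a greedy maximin selection over a random candidate pool, picking each new point to be as far as possible (in z = w^T x) from those already chosen.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: a 50-dimensional input space with one known
# ridge direction w (in practice w would come from the inferred subspace).
d = 50
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

candidates = rng.standard_normal((500, d))   # cheap candidate pool
z = candidates @ w                           # projected coordinates

# Greedy maximin design in the subspace: start at one extreme of z,
# then repeatedly add the candidate farthest (in z) from the design.
chosen = [int(np.argmin(z))]
for _ in range(7):
    dist = np.min(np.abs(z[:, None] - z[chosen][None, :]), axis=1)
    chosen.append(int(np.argmax(dist)))

design = candidates[chosen]                  # 8 points to evaluate
```

Spreading points along z concentrates the budget on the directions the response actually depends on; a space-filling design in all 50 dimensions would spend most of its points varying coordinates that barely affect the output.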
Conclusion This thesis demonstrated the viability of the proposed fully Bayesian approach to approximation by ridge functions when only a few observations of the underlying process are available. In addition, further developments leveraging the proposed approach were investigated. First, the method was extended to the multi-fidelity context using a modified deep multi-fidelity Gaussian process, and it was shown that, under certain circumstances, it is beneficial to use both low- and high-fidelity evaluations. Finally, ways to use the uncovered low-dimensional subspace to sample the expensive function were explored, and it was found that using the subspace to perform a design of experiments (DOE) leads to models with higher predictive accuracy than if the DOE were performed in the original high-dimensional input space.