Date of Completion: 2015
Keywords: Bayesian model diagnostics, Bayesian model selection, Bregman clustering, Bregman divergence, Gaussian and Diffused-gamma (GD) prior, Iterated Conditional Modes, Posterior consistency, Rank reduction, Sparse high-dimensional data
Major Advisor: Dipak K. Dey
Degree: Doctor of Philosophy
This dissertation focuses on the development of statistical theory, methodology, and applications from a Bayesian perspective using a general class of divergence measures (or loss functions) called Bregman divergence. Bregman divergence has played a key role in recent advances in machine learning, and my goal is to turn the spotlight on Bregman divergence and its applications in Bayesian modeling. Because Bregman divergence includes many well-known loss functions, such as squared error loss, Kullback-Leibler divergence, Itakura-Saito distance, and Mahalanobis distance, the theoretical and methodological developments unify and extend many existing Bayesian methods. The broad applicability of both Bregman divergence and the Bayesian approach allows the methods to handle diverse types of data, including circular, high-dimensional, multivariate, and functional data. Furthermore, the developed methods are flexible enough to be applied to real problems in various scientific fields, including biology, the physical sciences, and engineering.
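As a concrete illustration of why Bregman divergence subsumes the losses named above, the sketch below (not from the dissertation; a minimal NumPy illustration with hypothetical function names) evaluates D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y> for two choices of the convex generator phi: the squared Euclidean norm, which recovers squared error loss, and the negative entropy, which recovers Kullback-Leibler divergence for probability vectors.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    """D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>.

    phi must be a differentiable, strictly convex function;
    grad_phi is its gradient. (Illustrative helper, not from the source.)
    """
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# phi(v) = ||v||^2 recovers squared Euclidean distance (squared error loss)
phi_sq = lambda v: np.dot(v, v)
grad_sq = lambda v: 2.0 * v

# phi(v) = sum v log v (negative entropy) recovers KL divergence
# when x and y are probability vectors
phi_ne = lambda v: np.sum(v * np.log(v))
grad_ne = lambda v: np.log(v) + 1.0

x = np.array([0.2, 0.8])
y = np.array([0.5, 0.5])

d_sq = bregman_divergence(phi_sq, grad_sq, x, y)  # equals ||x - y||^2
d_kl = bregman_divergence(phi_ne, grad_ne, x, y)  # equals KL(x || y)
```

Other generators yield the remaining examples in the abstract: phi(v) = -sum log v gives the Itakura-Saito distance, and a quadratic form v' A v with positive-definite A gives the Mahalanobis distance.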
Goh, Gyuhyeong, "Applications of Bregman Divergence Measures in Bayesian Modeling" (2015). Doctoral Dissertations. 785.