The work titled "Regularized optimization applied to clustering and joint estimation of multiple undirected graphical models" by its creator Georgogiannis Alexandros is made available under a Creative Commons Attribution 4.0 International license.
Bibliographic Citation
Alexandros Georgogiannis, "Regularized optimization applied to clustering and joint estimation of multiple undirected graphical models", Master's Thesis, School of Electronic and Computer Engineering, Technical University of Crete, Chania, Greece, 2014.
https://doi.org/10.26233/heallink.tuc.21011
Since its earliest days as a discipline, machine learning has made use of optimization formulations and algorithms. Likewise, machine learning has contributed to optimization, driving the development of new optimization approaches that address the significant challenges presented by machine learning applications. This influence continues to deepen, producing a growing literature at the intersection of the two fields while attracting leading researchers to the effort. While techniques proposed twenty years ago continue to be refined, the increased complexity, size, and variety of today's machine learning models demand a principled reassessment of existing assumptions and techniques. This thesis makes a small step toward such a reassessment. It describes novel contexts for established frameworks such as convex relaxation, splitting methods, and regularized estimation, and shows how they can be used to solve significant problems in data mining and statistical learning.

The thesis is organised in two parts. In the first part, we present a new clustering algorithm. The task of clustering aims at discovering structure in data. This algorithm is an extension of recently proposed convex relaxations of k-means and hierarchical clustering. In the second part, we present a new algorithm for discovering dependencies among common variables in multiple undirected graphical models. Graphical models are useful for the description and modelling of multivariate systems. In the appendix, we comment on a core problem underlying the whole study and give an alternative solution based on recent advances in convex optimization.
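For orientation, the two problem classes mentioned above admit standard regularized formulations; the following is a sketch of the commonly used forms (the exact objectives developed in the thesis may differ). Convex relaxations of k-means and hierarchical clustering are typically posed as sum-of-norms ("convex clustering") problems over centroid variables $u_i$ attached to data points $x_i$, with weights $w_{ij}$ and a regularization parameter $\lambda$:

```latex
% Convex clustering (sum-of-norms relaxation): each data point x_i
% gets its own centroid u_i; the pairwise penalty fuses centroids
% together as lambda grows, producing a clustering path.
\min_{u_1,\dots,u_n}\;
  \frac{1}{2}\sum_{i=1}^{n} \lVert x_i - u_i \rVert_2^2
  \;+\; \lambda \sum_{i<j} w_{ij}\,\lVert u_i - u_j \rVert_2

% Joint estimation of K undirected Gaussian graphical models:
% penalized log-likelihood over precision matrices Theta^(k),
% with sample covariances S^(k) and a coupling penalty P that
% encourages shared sparsity patterns across the K models.
\min_{\{\Theta^{(k)} \succ 0\}}\;
  \sum_{k=1}^{K}\Bigl(
    \operatorname{tr}\bigl(S^{(k)}\Theta^{(k)}\bigr)
    - \log\det \Theta^{(k)}
  \Bigr)
  \;+\; P\bigl(\Theta^{(1)},\dots,\Theta^{(K)}\bigr)
```

Both objectives are convex, which is what makes splitting methods such as ADMM applicable to them; the coupling penalty $P$ is a placeholder here, as its specific choice is part of the contribution discussed in the thesis.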