Abstract
This presentation introduces our work on enabling inference of deep networks that retain high accuracy at the lowest possible model complexity, with the complexity itself deduced from the data during inference. To this end, we revisit deep networks that comprise competing linear units, as opposed to nonlinear units that entail no form of (local) competition. In this context, our main technical innovation is an inferential setup that leverages solid arguments from Bayesian nonparametrics. We infer both the needed set of connections, or locally competing sets of units, and the floating-point precision required for storing the network parameters. As we show experimentally on benchmark datasets, our approach yields networks with a smaller computational footprint than the state of the art, with no compromise in predictive accuracy.
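To make the notion of competing linear units concrete, the following is a minimal sketch of a local winner-take-all (LWTA) activation, in which units are partitioned into blocks and, within each block, only the maximally activated linear unit passes its value on. The fixed block partitioning and tie-breaking here are illustrative assumptions, not the exact formulation of the presented work, which infers the competing sets via Bayesian nonparametrics.

```python
import numpy as np

def lwta(x, block_size):
    """Local winner-take-all over blocks of competing linear units.

    Within each block of `block_size` units, only the unit with the
    largest activation retains its (linear) value; the rest are zeroed.
    Illustrative sketch only: the block structure is fixed here, whereas
    the presented approach infers it from the data.
    """
    x = np.asarray(x, dtype=float)
    blocks = x.reshape(-1, block_size)               # group units into competing blocks
    winners = np.argmax(blocks, axis=1)              # winner index within each block
    mask = np.zeros_like(blocks)
    mask[np.arange(blocks.shape[0]), winners] = 1.0  # one-hot mask selecting winners
    return (blocks * mask).reshape(x.shape)

# Example: two blocks of two competing units each.
print(lwta([0.3, 1.2, -0.5, -0.1], 2))  # -> [ 0.   1.2  0.  -0.1]
```

Note that, unlike a ReLU, the surviving unit stays linear; sparsity comes from the competition itself, which is what makes the needed number of competing units amenable to data-driven inference.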
Bio
Dr. Sotirios P. Chatzis is an Assistant Professor in the Department of Electrical Engineering, Computer Engineering and Informatics (EECEI) at CUT. He has first-authored more than 60 papers in the most prestigious journals and conferences of his research field, and has published more than 80 papers in total in these venues. He is a PC member of several top-tier venues in Machine Learning, including NIPS, AAAI, ICLR and ICML, and serves as an Associate Editor of the IEEE Transactions on Signal Processing. His interests focus on nonparametric statistics, deep graphical models, and variational inference. He coordinates research projects totalling more than 3M Euro in funding. His current work focuses on Deep Learning for difficult problems, such as Natural Language Understanding and Computer Vision. His research lab at CUT comprises 14 full-time researchers, including 7 PhD students.