Front Cover
Functional Error Correction for Robust Neural Networks
Extracting Robust and Accurate Features via a Robust Information Bottleneck
Physical Layer Communication via Deep Learning
Reliable digital communication is a primary workhorse of the modern information age. The disciplines of communication, coding, and information theory drive this innovation by designing efficient codes that allow transmissions to be decoded robustly and efficiently. Progress toward near-optimal codes has been driven by individual human ingenuity, and breakthroughs have been, befittingly, sporadic and spread over several decades. Deep learning is now part of daily life, and its successes can be attributed to settings that lack a (mathematical) generative model.
Deep Learning Techniques for Inverse Problems in Imaging
Expression of Fractals Through Neural Network Functions
MaxiMin Active Learning in Overparameterized Model Classes
The Information Bottleneck Problem and its Applications in Machine Learning
Inference capabilities of machine learning (ML) systems have skyrocketed in recent years, and they now play a pivotal role in various aspects of society. The goal in statistical learning is to use data to obtain simple algorithms for predicting a random variable Y from a correlated observation X. Since the dimension of X is typically huge, computationally feasible solutions should summarize it into a lower-dimensional feature vector T, from which Y is predicted.
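The trade-off sketched above between compressing X and preserving what it reveals about Y is captured by the classical information bottleneck objective. The LaTeX snippet below is a minimal sketch of that standard formulation, written in the notation of the abstract; it is drawn from the general information bottleneck literature and is not necessarily the exact objective or notation used in the paper itself.

% Classical information bottleneck Lagrangian: choose a stochastic encoder
% p(t|x) that compresses X into T while retaining information about Y.
% The multiplier beta >= 0 trades compression against predictive relevance.
\begin{equation*}
  \min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y),
  \qquad \text{subject to the Markov chain } Y \leftrightarrow X \leftrightarrow T .
\end{equation*}

Small beta favors aggressive compression of X; large beta favors features T that retain nearly all of the information X carries about Y.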