1. Introduction (Miguel Rodrigues, Stark Draper, Waheed Bajwa and Yonina Eldar)
2. An information-theoretic approach to analog-to-digital compression (Alon Kipnis, Yonina Eldar and Andrea Goldsmith)
3. Compressed sensing via compression codes (Shirin Jalali and Vincent Poor)
4. Information-theoretic bounds on sketching (Mert Pilanci)
5. Sample complexity bounds for dictionary learning from vector- and tensor-valued data (Zahra Shakeri, Anand Sarwate and Waheed Bajwa)
6. Uncertainty relations and sparse signal recovery (Erwin Riegler and Helmut Bölcskei)
7. Understanding phase transitions via mutual information and MMSE (Galen Reeves and Henry Pfister)
8. Computing choice: learning distributions over permutations (Devavrat Shah)
9. Universal clustering (Ravi Raman and Lav Varshney)
10. Information-theoretic stability and generalization (Maxim Raginsky, Alexander Rakhlin and Aolin Xu)
11. Information bottleneck and representation learning (Pablo Piantanida and Leonardo Rey Vega)
12. Fundamental limits in model selection for modern data analysis (Jie Ding, Yuhong Yang and Vahid Tarokh)
13. Statistical problems with planted structures: information-theoretic and computational limits (Yihong Wu and Jiaming Xu)
14. Distributed statistical inference with compressed data (Wenwen Zhao and Lifeng Lai)
15. Network functional compression (Soheil Feizi and Muriel Médard)
16. An introductory guide to Fano's inequality with applications in statistical estimation (Jonathan Scarlett and Volkan Cevher)