Da Tang's world

Dream for the future

Short Bio

I am a fifth-year Ph.D. student in the Machine Learning Lab at Columbia University, advised by Prof. Tony Jebara. Previously, I was an undergraduate student at the Institute for Interdisciplinary Information Sciences, Tsinghua University, advised by Prof. Jun Zhu. I am interested in Machine Learning and Artificial Intelligence as well as their applications to related fields such as Computer Vision and Natural Language Processing.

To contact me, please email me at qltedxtc@gmail.com or datang@cs.columbia.edu.

Education Background

2015-2020 (expected): Doctoral Degree, Computer Science, Columbia University
2011-2015: Bachelor's Degree, Computer Science (second major: Pure and Applied Math), Tsinghua University

Work Experiences

1. Research Intern, mentored by Dr. Dawen Liang, Netflix Inc., Los Gatos, California, United States.

2. Research Intern, mentored by Dr. Jianfeng Gao, Dr. Chong Wang, Dr. Lihong Li, and Mr. Xiujun Li, Microsoft Research, Redmond, Washington, United States.

3. Software Engineering Intern, mentored by Dr. Lan Nie and Dr. Sajid Siddiqi, Google Inc., Mountain View, California, United States.

4. Research Intern, mentored by Dr. Tong Zhang, Baidu Inc., Beijing, China.

For more information, please see my Curriculum Vitae.

Publications

Preprints

1. Da Tang, Dawen Liang, Nicholas Ruozzi and Tony Jebara, Learning Correlated Latent Representations with Adaptive Priors, arXiv preprint arXiv:1906.06419, 2019 [arXiv]

2. Da Tang and Tong Zhang, On the Duality Gap Convergence of ADMM Methods, arXiv preprint arXiv:1508.03702, 2015 [arXiv]

Conference Papers

1. Da Tang, Dawen Liang, Tony Jebara and Nicholas Ruozzi, Correlated Variational Auto-Encoders, International Conference on Machine Learning (ICML), 2019 (short oral) [arXiv, code, slides, oral (1:16:16 ~ 1:20:43)]

2. Da Tang and Rajesh Ranganath, The Variational Predictive Natural Gradient, International Conference on Machine Learning (ICML), 2019 (short oral) [arXiv, code, slides, oral (33:18 ~ 37:46)]

3. Da Tang, Xiujun Li, Jianfeng Gao, Chong Wang, Lihong Li and Tony Jebara, Subgoal Discovery for Hierarchical Dialogue Policy Learning, Conference on Empirical Methods in Natural Language Processing (EMNLP), 2018 (long paper, oral) [arXiv, oral]

4. Da Tang and Tony Jebara, Initialization and Coordinate Optimization for Multi-way Matching, International Conference on Artificial Intelligence and Statistics (AISTATS), 2017 [arXiv]

5. Tianlin Shi, Da Tang, Liwen Xu and Thomas Moscibroda, Correlated Compressive Sensing for Networked Data, Conference on Uncertainty in Artificial Intelligence (UAI), 2014 [PDF]

Workshop Papers

1. Jingxi Xu, Da Tang and Tony Jebara, Active Multitask Learning with Committees, Workshop on Adaptive and Multitask Learning: Algorithms and Systems, International Conference on Machine Learning (ICML), 2019 [link]

2. Da Tang, Dawen Liang, Tony Jebara and Nicholas Ruozzi, Correlated Variational Auto-Encoders, Workshop on Deep Generative Models for Highly Structured Data, International Conference on Learning Representations (ICLR), 2019 [link]

3. Da Tang, Dawen Liang and Tony Jebara, Correlated Variational Auto-Encoders, Workshop on Bayesian Deep Learning, Conference on Neural Information Processing Systems (NeurIPS), 2018 [PDF]

4. Giannis Karamanolakis, Kevin Raji Cherian, Ananth Ravi Narayan, Jie Yuan, Da Tang and Tony Jebara, Item Recommendation with Variational Autoencoders and Heterogeneous Priors, Workshop on Deep Learning for Recommender Systems, ACM Conference on Recommender Systems (RecSys), 2018 (oral) [arXiv]

5. Da Tang and Rajesh Ranganath, Natural Gradients via the Variational Predictive Distribution, Workshop on Advances in Approximate Bayesian Inference, Conference on Neural Information Processing Systems (NeurIPS), 2017 (spotlight) [PDF]