My long-term research goal is to address a central computational question: how can we build general problem-solving machines with human-like efficiency and adaptability? My research focuses on developing efficient learning algorithms for deep neural networks. I am also broadly interested in reinforcement learning, natural language processing, and artificial intelligence.

For future students: I will join the Department of Computer Science as an Assistant Professor in mid-2018. Please apply through the department's admissions process.

Short bio: I am completing my PhD under the supervision of Geoffrey Hinton. Both my master's degree (2014) and undergraduate degree (2011) are from the University of Toronto, where I worked with Brendan Frey and Ruslan Salakhutdinov. I was a recipient of the 2016 Facebook Graduate Fellowship in machine learning.

Google Scholar page. Contact me: jba at

Current postdocs

Bradly C. Stadie

Current students

Harris Chan (joint with Sanja Fidler)

Danijar Hafner

Jenny Liu

Silviu Pitis

Tingwu Wang (joint with Sanja Fidler)

Yeming Wen

Denny Wu (joint with Marzyeh Ghassemi)

Michael Zhang

Publications


On the Convergence and Robustness of Training GANs with Regularized Optimal Transport, Sanjabi, M., Ba, J., Razaviyayn M., Lee, J., NIPS 2018.

Reversible Recurrent Neural Networks, MacKay, M., Vicol P., Ba, J., Grosse, R., NIPS 2018.

NerveNet: Learning Structured Policy with Graph Neural Networks, Wang, T., Liao, R., Ba, J. and Fidler, S., ICLR 2018.

Kronecker-factored Curvature Approximations for Recurrent Neural Networks, Martens, J., Ba, J. and Johnson, M., ICLR 2018.

Flipout: Efficient Pseudo-Independent Weight Perturbations on Mini-Batches, Wen, Y., Vicol, P., Ba, J., Tran, D. and Grosse, R., ICLR 2018.

Scalable trust-region method for deep reinforcement learning using Kronecker-factored approximation, Wu, Y., Mansimov, E., Liao, S., Grosse, R. and Ba, J., NIPS 2017.

Automated Analysis of High-content Microscopy Data with Deep Learning, Kraus, O., Grys, B., Ba, J., Chong, Y., Frey, B., Boone, C. and Andrews, B., Molecular Systems Biology, 2017.

Distributed Second-order Optimization using Kronecker-factored Approximations, Ba, J., Grosse, R. and Martens, J., ICLR, 2017.

Layer Normalization, Ba, J., Kiros, J. R. and Hinton, G., arXiv preprint arXiv:1607.06450, 2016.

Using Fast Weights to Attend to the Recent Past, Ba, J., Hinton, G., Mnih, V., Leibo, J. and Ionescu, C., NIPS 2016.

Classifying Microscopy Images Using Convolutional Multiple Instance Learning, Kraus, O., Ba, J. and Frey, B., Bioinformatics 32(12) 2016.

Generating Images From Captions with Attention, Mansimov, E., Parisotto, E., Ba, J. and Salakhutdinov, R., ICLR 2016.

Actor-Mimic: Deep Multitask and Transfer Reinforcement Learning, Parisotto, E., Ba, J. and Salakhutdinov, R., ICLR 2016.

Learning Wake-Sleep Recurrent Attention Models, Ba, J., Grosse, R., Salakhutdinov, R. and Frey, B., NIPS 2015.

Predicting Deep Zero-Shot Convolutional Neural Networks using Textual Descriptions, Ba, J., Swersky, K., Fidler, S. and Salakhutdinov, R., ICCV 2015.

Show, Attend and Tell: Neural Image Caption Generation with Visual Attention, Xu, K., Ba, J., Kiros, R., Cho, K., Courville, A., Salakhutdinov, R., Zemel, R. and Bengio, Y., ICML 2015.

Adam: A Method for Stochastic Optimization, Kingma, D. and Ba, J., ICLR 2015.

Multiple Object Recognition with Visual Attention, Ba, J., Mnih, V. and Kavukcuoglu, K., ICLR 2015.

Do Deep Nets Really Need to be Deep?, Ba, J. and Caruana, R., NIPS 2014.

Adaptive Dropout for Training Deep Neural Networks, Ba, J. and Frey, B., NIPS 2013.