Sungjin Ahn
   Assistant Professor
   CBIM 9, 617 Bowser Rd., Piscataway, NJ, 08854
   sjn.lastname at

About Me
  • I'm an Assistant Professor in the Department of Computer Science at Rutgers University. I received my Ph.D. from the University of California, Irvine, where I studied scalable approximate Bayesian inference under the supervision of Prof. Max Welling. I then did my postdoc at MILA, working on deep learning with Prof. Yoshua Bengio. My research interests include deep learning, Bayesian learning, deep reinforcement learning, and their connections to cognitive science and neuroscience. My long-term goal is to develop these methodologies so that an AI agent can learn like humans in complex environments such as the real world.

  • Our paper, Bayesian Model-Agnostic Meta-Learning, has been accepted to NIPS 2018 as a spotlight!

  • I'm looking for highly motivated Ph.D. students and postdocs.
    • Ph.D. applicants: Please put my name in your application.
    • Postdocs: Please send me your CV.
  • I joined the Department of Computer Science at Rutgers University as a tenure-track Assistant Professor.
    • About Artificial Intelligence Research at Rutgers
      • As of Aug. 2018, Rutgers CS is ranked #17 in Artificial Intelligence, and #12 in AI + ML + CV + Robotics + Data Mining, in the US.

  • Ph.D. students: starting from Fall 2019
  • Postdoc, visitor, or internship positions

Research Focuses
  • I'm currently working on the following problems:
    • Generative Models, Meta-Learning, Representation Learning
  • Using the following methodologies:
    • Deep Learning, Bayesian Learning, Reinforcement Learning
  • Applied to:
    • Agent Learning & Artificial General Intelligence

Publications

  • Bayesian Model-Agnostic Meta-Learning
    T. Kim*, J. Yoon*, O. Dia, S. Kim, Y. Bengio, S. Ahn
    (* First two authors contributed equally)

    [NIPS18] [pdf] Spotlight (top 3.5%)

  • Hierarchical Multiscale Recurrent Neural Networks
    J. Chung, S. Ahn, Y. Bengio
    [ICLR17] [pdf]

  • Denoising Criterion for Variational Auto-Encoding Framework 
    D. Im, S. Ahn, R. Memisevic, Y. Bengio
    [AAAI17] [pdf]

  • SENA: Preserving Social Structure for Network Embedding
    S. Hong, T. Chakraborty, S. Ahn, G. Husari, and N. Park
    ACM Conference on Hypertext and Social Media, 2017

  • A Neural Knowledge Language Model
    S. Ahn, H. Choi, T. Parnamaa, Y. Bengio
    [ArXiv16] [pdf] [dataset]

  • Hierarchical Memory Networks
    S. Chandar, S. Ahn, H. Larochelle, P. Vincent, G. Tesauro, Y. Bengio
    [ArXiv16] [pdf]

  • Learning Latent Multiscale Structure using Recurrent Neural Networks
    J. Chung, S. Ahn, Y. Bengio
    NIPS 2016 Workshop on Neural Abstract Machines & Program Induction (NAMPI)

  • Pointing the Unknown Words 
    C. Gulcehre, S. Ahn, R. Nallapati, B. Zhou, Y. Bengio.
    [ACL16] [pdf]

  • Generating Factoid Questions with Recurrent Neural Networks: The 30M Factoid Question-Answer Corpus
    I. V. Serban*, A. G. Duran*, C. Gulcehre, S. Ahn, S. Chandar, A. Courville, Y. Bengio
    (* Equal contribution)
    [ACL16] [pdf] [dataset]

  • Scalable MCMC for Mixed Membership Stochastic Blockmodels 
    W. Li*, S. Ahn*, and M. Welling (* Equal contribution)
    [AISTATS16] [pdf]

  • Scalable Overlapping Community Detection
    I. El-Helw, R. Hofman, W. Li, S. Ahn, M. Welling, H. Bal
    [ParLearning16] [pdf] Best Paper Award

  • Stochastic Gradient MCMC: Algorithms and Applications
    S. Ahn
    [PhD Dissertation 15] [pdf]

  • Large-Scale Distributed Bayesian Matrix Factorization using Stochastic Gradient MCMC
    S. Ahn, A. Korattikara, N. Liu, S. Rajan, and M. Welling
    [KDD15] [pdf] (Acceptance Rate: 19%)

  • Distributed Stochastic Gradient MCMC
    S. Ahn, B. Shahbaba, and M. Welling
    [ICML14] [pdf]

  • Distributed and Adaptive Darting Monte Carlo through Regenerations
    S. Ahn, Y. Chen, and M. Welling
    [AISTATS13] [pdf]

  • Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring
    S. Ahn, A. Korattikara, and M. Welling
    [ICML12] [pdf] Best Paper Award

  • Proactive Context-Aware Sensor Networks 
    S. Ahn and D. Kim
    [EWSN06] (Acceptance Rate: 15%)