Sungjin Ahn
  Research Scientist 
  Element AI
About Me
  • I'm a Research Scientist at Element AI. Prior to this, I was a postdoc working on deep learning under Prof. Yoshua Bengio at the
    University of Montreal. I received my Ph.D. under the supervision of Prof.
    Max Welling at the University of California, Irvine, working
    on scalable approximate Bayesian inference. Starting in Fall 2018, I will be joining the Department of Computer Science at
    Rutgers University as a tenure-track Assistant Professor.
    My current research interests include deep learning, deep reinforcement
    learning, Bayesian learning, and neuroscience- and psychology-inspired learning algorithms. I apply these methodologies to make
    cognitive agents learn continually in complex environments, as humans do.

News!
  • Our new paper on Bayesian meta-learning is now on arXiv! [pdf]

  • I will be joining the Department of Computer Science at Rutgers University as a tenure-track Assistant Professor starting Fall 2018.
    • In the US, Rutgers CS is ranked #17 in Artificial Intelligence and #13 in (ML + CV + Robotics + Data Mining) by csrankings.org.

  • Openings
    • Full-time Summer Research Internship @ Element AI
    • Ph.D. @ Rutgers
      • Starting from Fall 2019
    • Postdoc, visitor, or internship @ Rutgers

Research Focuses
  • I'm currently working on the following problems:
    • Meta-Learning + Representation Learning + Generative Models
  • Using the following methodologies:
    • Deep Learning + Reinforcement Learning + Bayesian Learning
  • To achieve / apply these to:
    • Agent Learning & Artificial General Intelligence

Contact
  • sjn.lastname at gmail.com

Publications
  • Bayesian Model-Agnostic Meta-Learning
    T. Kim*, J. Yoon*, O. Dia, S. Kim, Y. Bengio, S. Ahn
    [*first two authors contributed equally]

    [ArXiv18] [pdf]

  • Hierarchical Multiscale Recurrent Neural Networks
    J. Chung, S. Ahn, Y. Bengio
    [ICLR17] [pdf]

  • Denoising Criterion for Variational Auto-Encoding Framework 
    D. Im, S. Ahn, R. Memisevic, Y. Bengio
    [AAAI17] [pdf]

  • SENA: Preserving Social Structure for Network Embedding
    S. Hong, T. Chakraborty, S. Ahn, G. Husari, and N. Park
    ACM Conference on Hypertext and Social Media, 2017

  • A Neural Knowledge Language Model
    S. Ahn, H. Choi, T. Parnamaa, Y. Bengio
    [ArXiv16] [pdf] [dataset]

  • Hierarchical Memory Networks
    S. Chandar, S. Ahn, H. Larochelle, P. Vincent, G. Tesauro, Y. Bengio
    [ArXiv16] [pdf]

  • Learning Latent Multiscale Structure using Recurrent Neural Networks
    J. Chung, S. Ahn, Y. Bengio
    NIPS 2016 Workshop on Neural Abstract Machines & Program Induction (NAMPI)

  • Pointing the Unknown Words 
    C. Gulcehre, S. Ahn, R. Nallapati, B. Zhou, Y. Bengio.
    [ACL16] [pdf]

  • Generating Factoid Questions with Recurrent Neural Networks: The 30M Factoid Question-Answer Corpus
    I. V. Serban*, A. G. Duran*, C. Gulcehre, S. Ahn, S. Chandar, A. Courville, Y. Bengio (* Equal contribution)
    [ACL16] [pdf] [dataset]

  • Scalable MCMC for Mixed Membership Stochastic Blockmodels 
    W. Li*, S. Ahn*, and M. Welling (* Equal contribution)
    [AISTATS16] [pdf]

  • Scalable Overlapping Community Detection
    I. El-Helw, R. Hofman, W. Li, S. Ahn, M. Welling, H. Bal
    [ParLearning16] [pdf] Best Paper Award

  • Stochastic Gradient MCMC: Algorithms and Applications
    S. Ahn
    [PhD Dissertation 15] [pdf]

  • Large-Scale Distributed Bayesian Matrix Factorization using Stochastic Gradient MCMC
    S. Ahn, A. Korattikara, N. Liu, S. Rajan, and M. Welling
    [KDD15] [pdf] (Acceptance Rate: 19%)

  • Distributed Stochastic Gradient MCMC
    S. Ahn, B. Shahbaba, and M. Welling
    [ICML14] [pdf]

  • Distributed and Adaptive Darting Monte Carlo through Regenerations
    S. Ahn, Y. Chen, and M. Welling
    [AISTATS13] [pdf]

  • Bayesian Posterior Sampling via Stochastic Gradient Fisher Scoring
    S. Ahn, A. Korattikara, and M. Welling
    [ICML12] [pdf] Best Paper Award

  • Proactive Context-Aware Sensor Networks 
    S. Ahn and D. Kim
    [EWSN06] (Acceptance Rate: 15%)