Current Topics in Natural Language Processing (WS 2017-2018)


Deep learning is a branch of machine learning in which neural networks consisting of multiple layers have shown strong generalization capabilities. The seminar will look at advances both in general deep learning approaches and in the specific case of Neural Machine Translation (NMT). NMT is a new paradigm in data-driven machine translation: the entire translation process is posed as an end-to-end supervised classification problem, where the training data consists of pairs of sentences and the full sequence-to-sequence task is handled by a single model.
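The sequence-to-sequence framing above can be sketched in a few lines: the source sentence is encoded into a vector, and each decoding step is a supervised classification over the target vocabulary. This is only an illustrative toy (the bag-of-embeddings "encoder", all sizes, and all names are assumptions for the sketch), not a real NMT architecture:

```python
# Toy sketch of the seq2seq-as-classification idea.
# All names, sizes, and the averaging "encoder" are illustrative
# assumptions, not a real NMT system.
import numpy as np

rng = np.random.default_rng(0)

SRC_VOCAB, TGT_VOCAB, DIM = 10, 12, 8

# Parameters: source embeddings, target embeddings, output projection.
E_src = rng.normal(size=(SRC_VOCAB, DIM))
E_tgt = rng.normal(size=(TGT_VOCAB, DIM))
W_out = rng.normal(size=(DIM, TGT_VOCAB))

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def encode(src_ids):
    # Toy "encoder": average the source embeddings into one vector.
    return E_src[src_ids].mean(axis=0)

def decode_step(state, prev_tgt_id):
    # One decoding step: combine the encoder state with the previous
    # target embedding, then classify over the target vocabulary.
    h = np.tanh(state + E_tgt[prev_tgt_id])
    return softmax(h @ W_out)

# A training pair is simply (source sentence, target sentence) as token ids.
src, tgt = [1, 4, 7], [2, 5, 9]
state = encode(src)
probs = decode_step(state, prev_tgt_id=0)  # 0 = start-of-sentence id here
```

In a real system the encoder and decoder would be recurrent or attention-based networks trained by cross-entropy over these per-step classifications, but the supervised end-to-end structure is the same.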

Here is a link to last semester's seminar.

There is a Munich interest group for Deep Learning with an associated mailing list (initially organized by David Kaumanns); the paper announcements are sent out on this list. See the link here:


Alexander Fraser


CIS, LMU Munich

Hinrich Schütze

CIS, LMU Munich


Thursdays 14:30 (s.t.), in the CIS Meeting Room (C105).

Click here for directions to CIS.

New attendees are welcome. Read the paper and bring a paper or electronic copy with you; you will need to refer to it during the discussion.

If this page appears to be out of date, use the refresh button of your browser.

Date Paper Links Discussion Leader
Thursday, October 12th Philipp Koehn and Rebecca Knowles. Six Challenges for Neural Machine Translation. Workshop on Neural Machine Translation 2017. paper Alex Fraser
Thursday, October 19th Recent conference papers no paper Ben Roth
Thursday, October 26th Long Duong, Hiroshi Kanayama, Tengfei Ma, Steven Bird, Trevor Cohn (2016). Learning Crosslingual Word Embeddings without Bilingual Corpora. EMNLP paper Fabienne Braune
Thursday, November 9th Tao Shen, Tianyi Zhou, Guodong Long, Jing Jiang, Shirui Pan, Chengqi Zhang (2017). DiSAN: Directional Self-Attention Network for RNN/CNN-free Language Understanding. arXiv. paper Dario Stojanovski
Thursday, November 16th Rajarshi Das, Manzil Zaheer, Siva Reddy, Andrew McCallum (2017). Question Answering on Knowledge Bases and Text using Universal Schema and Memory Networks. ACL. paper Ben Roth
Thursday, November 23rd Guillaume Lample, Ludovic Denoyer, Marc'Aurelio Ranzato (2017). Unsupervised Machine Translation Using Monolingual Corpora Only. ICLR 2018. paper (outdated) arxiv Helmut Schmid
Thursday, November 30th Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou (2017). Word Translation Without Parallel Data. arXiv. (updated) paper Sebastian Wagner
Thursday, December 7th Zhaopeng Tu, Yang Liu, Shuming Shi, Tong Zhang (2018). Learning to Remember Translation History with a Continuous Cache. TACL. paper Matthias Huck
Thursday, January 11th Sara Sabour, Nicholas Frosst, Geoffrey Hinton (2017). Dynamic Routing Between Capsules. NIPS. paper Ben Roth
Thursday, January 18th James Bradbury, Stephen Merity, Caiming Xiong, Richard Socher. Quasi-Recurrent Neural Networks. ICLR 2017 paper (arxiv is outdated) Dario Stojanovski
Thursday, January 25th Peters et al. (was Anonymous). Deep contextualized word representations. NAACL 2018. paper Hinrich Schütze
Thursday, February 1st Holger Schwenk and Matthijs Douze. Learning Joint Multilingual Sentence Representations with Neural Machine Translation. RepL4NLP Workshop 2017. paper Matthias Huck
Thursday, February 8th David Alvarez-Melis, Tommi S. Jaakkola (2017). A causal framework for explaining the predictions of black-box sequence-to-sequence models. EMNLP paper Nina Pörner
Thursday, March 1st Yonatan Belinkov, Yonatan Bisk (2018). Synthetic and Natural Noise Both Break Neural Machine Translation. ICLR 2018 paper Alex Fraser
Thursday, March 8th Chao Qiao, Bo Huang, et al. (2018). A new method of region embedding for text classification. ICLR 2018 paper Philipp Dufter
Thursday, March 29th Jason D. Williams, Kavosh Asadi, Geoffrey Zweig (2017). Hybrid Code Networks: practical and efficient end-to-end dialog control with supervised and reinforcement learning. ACL 2017. paper Alena Moiseeva

Further literature:

Please click here for an NMT reading list, and also see the more general RNN reading list here (scroll down). You can also browse previous semesters via the link near the top of the page.