Deep Learning is an interesting new branch of machine learning in which neural networks consisting of multiple layers have shown new generalization capabilities. The seminar will look at advances both in general deep learning approaches and in the specific case of Neural Machine Translation (NMT). NMT is a new paradigm in data-driven machine translation. In Neural Machine Translation, the entire translation process is posed as an end-to-end supervised learning problem, where the training data consists of pairs of sentences and the full sequence-to-sequence task is handled by a single model.
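To make the framing concrete, here is a minimal illustrative sketch (not a trained system; the vocabularies, dimensions, and random parameters are all hypothetical): an NMT model encodes the source sentence into a vector and then, at each decoding step, performs a classification over the target vocabulary to predict the next target word.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vocabularies (hypothetical, for illustration only).
SRC_VOCAB = ["<s>", "das", "haus", "</s>"]
TGT_VOCAB = ["<s>", "the", "house", "</s>"]
d = 8  # toy hidden dimension

# Random parameters stand in for what training would learn.
E_src = rng.normal(size=(len(SRC_VOCAB), d))  # source word embeddings
W_out = rng.normal(size=(d, len(TGT_VOCAB)))  # output projection

def encode(src_ids):
    # Drastically simplified "encoder": average the source embeddings
    # (real NMT systems use recurrent, convolutional, or attention layers).
    return E_src[src_ids].mean(axis=0)

def decode_step(state):
    # One decoding step: a softmax classification over the target vocabulary.
    logits = state @ W_out
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

src = [SRC_VOCAB.index(w) for w in ["<s>", "das", "haus", "</s>"]]
p = decode_step(encode(src))
print(p.shape)  # one probability per target word
```

Training adjusts the parameters so that, for each sentence pair in the training data, the correct next target word receives high probability at every step; decoding repeats this classification until the end-of-sentence symbol is produced.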
Here is a link to last semester's seminar.
There is a Munich interest group for Deep Learning, which has an associated mailing list (initially organized by David Kaumanns). See the link here: http://www.cis.uni-muenchen.de/~davidk/deep-munich/
Email Address: SubstituteLastName@cis.uni-muenchen.de
CIS, LMU Munich
Thursdays 14:30 s.t., *** NEW *** location is room 133 (this is to the right when standing in front of the CIS corridor).
Click here for directions to CIS.
New attendees are welcome. Read the paper and bring a paper or electronic copy with you; you will need to refer to it during the discussion.
If this page appears to be out of date, use the refresh button of your browser.
| Date | Topic | Link | Presenter |
| --- | --- | --- | --- |
| Thursday, April 27th | Chiyuan Zhang, Samy Bengio, Moritz Hardt, Benjamin Recht, Oriol Vinyals. Understanding deep learning requires rethinking generalization. ICLR 2017 | paper | Alex Fraser |
| Thursday, May 11th | Roee Aharoni, Yoav Goldberg. Towards String-to-Tree Neural Machine Translation. ACL 2017 | paper | Matthias Huck |
| Thursday, May 18th | Kawakami, Dyer and Blunsom. Learning to Create and Reuse Words in Open-Vocabulary Neural Language Modeling. ACL 2017 | paper | Yadollah Yaghoobzadeh and Hinrich Schütze |
| Thursday, June 1st | Hot new trends in deep learning and NLP (discussion) | | Yadollah, Ben, Wenpeng, and your ideas |
| Tuesday, June 13th at 12:15 ***ROOM CHANGE L155*** | Talk: An Analysis of Neural Machine Translation and Combination with Statistical Machine Translation | | Jan Niehues, Karlsruhe |
| Tuesday, June 20th at 12:15 ***ROOM CHANGE 131*** | Talk: Paraphrasing with NMT, ngram embeddings and interactivity (industrial collaboration) | | Mark Fishel, Tartu |
| Thursday, June 22nd | Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, Yann N. Dauphin. Convolutional Sequence to Sequence Learning. ACL 2017 | paper | Alex Fraser |
| Thursday, June 29th | Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin. Attention Is All You Need. arXiv 2017 | paper | Helmut Schmid |
| Thursday, July 6th | Lijun Wu, Yingce Xia, Li Zhao, Fei Tian, Tao Qin, Jianhuang Lai, Tie-Yan Liu. Adversarial Neural Machine Translation. arXiv 2017 (new version from June 24th!) | paper | Dario Stojanovski |
| Thursday, July 13th | Philip Häusser, Alexander Mordvintsev, Daniel Cremers. Learning by Association - A versatile semi-supervised training method for neural networks. CVPR 2017 | paper | Martin Schmitt |
| Thursday, July 20th | Omer Levy, Anders Søgaard, Yoav Goldberg. A Strong Baseline for Learning Cross-Lingual Word Embeddings from Sentence Alignments. EACL 2017 | paper | Fabienne Braune |
| Thursday, July 27th | Natural Language Computing Group, Microsoft Research Asia. R-net: Machine reading comprehension with self-matching networks. (Technical Report version, newer than the ACL 2017 paper) | paper | Wenpeng Yin |
| Thursday, August 31st | Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu. Exploiting Cross-Sentence Context for Neural Machine Translation. arXiv 2017 | paper | Dario Stojanovski |
Please click here for an NMT reading list, but also see the more general RNN reading list here (scroll down).