On the causality view of “Context-Aware Learning for Neural Machine Translation”

[Notice: what unfortunate timing! This post is definitely NOT an April Fools' joke.] Sebastien Jean and I had a paper titled <Context-Aware Learning for Neural Machine Translation> rejected from NAACL'19, perhaps understandably because we did not report any substantial gain in the BLEU score. As I finally found some time to read Pearl's <Book of Why> due to a personal reason (yes, personal reasons sometimes can help), I thought I would write a short note on how the idea in this paper was originally motivated. As I was never educated in causal inference or learning, I was scared of using a term

Best Paper Award at ISMIR 2017

Keunwoo Choi, George Fazekas, Mark Sandler and I have received the Best Paper Award at the 18th Annual International Society for Music Information Retrieval Conference (ISMIR). The paper is <Transfer Learning for Music Classification and Regression Tasks>, which investigates different ways to exploit the knowledge captured within a deep convolutional network trained to tag songs, applied to other relevant tasks. The main idea is to use not only the final hidden activation vector (as has been usual in computer vision) but the activations from all the layers, as some target tasks may require low-level details. Check it out

CIFAR Azrieli Global Scholar 2017

I’m happy to announce that I’ve been selected as a CIFAR Azrieli Global Scholar this year (announcement). This appointment is for two years, and I will be able to attend awesome CIFAR meetings to hang out with and hear from awesome CIFAR fellows. I am especially looking forward to meetings organized by the Learning in Machines & Brains (LMB) Programme of CIFAR, co-directed by Yoshua Bengio and Yann LeCun, in addition to interacting with other Azrieli Global Scholars and a broader set of fellows from other programmes at CIFAR. It is my understanding that this is not intended to award my

Lecture note “Brief Introduction to Machine Learning without Deep Learning”

This past spring (2017), I taught the undergrad <Intro to Machine Learning> course. This was not only my first time teaching <Intro to Machine Learning> but also my first time teaching an undergrad course (!) This course was taught a year earlier by David Sontag, who has since moved to MIT. Naturally, I thought about re-using David’s materials as they were, which you can find at http://cs.nyu.edu/~dsontag/courses/ml16/. These materials are really great, and the coverage of various topics in ML is simply amazing. I highly recommend all the materials on this web page. All the things

Google Faculty Award: 2016

My research proposal on <A Trainable Decoding Algorithm for Neural Machine Translation> has been selected for a Google Research Award 2016 (it’s a bit confusing whether it’s 2016 or 2017; the deadline was in 2016 but the decision came in 2017.) I’d like to thank Google for this award, which will greatly help my research. Gotta go buy a few more GPUs! For more info, see https://research.googleblog.com/2017/02/google-research-awards-2016.html.
