Recent NLP models have outpaced the benchmarks used to test them. This post provides an overview of challenges and opportunities for NLP benchmarks.
7000+ languages are spoken around the world, but NLP research has mostly focused on English. This post outlines why you should work on languages other than English.
This post outlines 10 things that I did during my PhD and found particularly helpful in the long run.
This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. It highlights key insights and takeaways and provides updates based on recent work.
Big changes are underway in the world of NLP. The long reign of word vectors as NLP's core representation technique has seen an exciting new line of challengers emerge. These approaches demonstrated that pretrained language models can achieve state-of-the-art results and herald a …
This is the second post based on the Frontiers of NLP session at the Deep Learning Indaba 2018. It discusses 4 major open problems in NLP.
Gradient descent is the preferred way to optimize neural networks and many other machine learning algorithms but is often used as a black box. This post explores how many of the most popular gradient-based optimization algorithms such as Momentum, Adagrad, and Adam actually work.
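To make the difference between these update rules concrete, here is a minimal sketch (not taken from the post itself) of vanilla gradient descent versus Momentum on a toy objective, f(x) = x²; the function names, learning rate, and momentum coefficient are illustrative choices, not the post's implementation.

```python
def grad(x):
    """Gradient of the toy objective f(x) = x**2."""
    return 2 * x

def gradient_descent(x, lr=0.1, steps=100):
    # Vanilla gradient descent: step in the direction opposite the gradient.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def momentum(x, lr=0.1, gamma=0.9, steps=300):
    # Momentum: accumulate an exponentially decaying velocity term,
    # which damps oscillations and speeds progress along shallow directions.
    v = 0.0
    for _ in range(steps):
        v = gamma * v + lr * grad(x)
        x -= v
    return x

print(gradient_descent(5.0))  # approaches the minimum at 0
print(momentum(5.0))          # also approaches 0
```

Adagrad and Adam extend this idea by additionally adapting the learning rate per parameter from a running estimate of squared gradients.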
This post gathers 10 ideas that I found exciting and impactful this year—and that we'll likely see more of in the future. For each idea, it highlights 1-2 papers that execute it well.
Monolingual word embeddings are pervasive in NLP. To represent meaning and transfer knowledge across different languages, cross-lingual word embeddings can be used. Such methods learn representations of words in a joint embedding space.
This blog post gives an overview of transfer learning, outlines why it is important, and presents applications and practical methods.
This post presents a new resource to track the progress in NLP, including the datasets and the current state-of-the-art for the most common NLP tasks.
This blog post looks at variants of gradient descent and the algorithms that are commonly used to optimize them.