• ACL 2020: My Highlights

    ACL 2020 was very special to me, since it was the first conference I ever attended. I found the virtual version very nice (although my testimony is somewhat undermined by the fact that I have never experienced an in-person conference, so I cannot really compare the virtual version to the real one). In any case, I found the discussions, Q&A sessions, chat rooms, and live talks very engaging and interesting!

  • Current Issues with Transfer Learning in NLP

    Natural Language Processing (NLP) has recently witnessed dramatic progress, with state-of-the-art results being published every few days. Leaderboard madness is driving the most common NLP benchmarks, such as GLUE and SuperGLUE, with scores getting closer and closer to human-level performance. Most of these results are driven by transfer learning from large-scale datasets through very large models (billions of parameters). My aim in this article is to point out the issues and challenges facing transfer learning, and to suggest some possible solutions to these problems.

  • Lightweight and Dynamic Convolutions Explained

    Self-attention models suffer from time complexity that is quadratic in the input size. We discuss a paper that proposes a variant of the convolution operation, named Lightweight Convolutions, that scales linearly with the input size while performing comparably to state-of-the-art self-attention models.
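    The core idea can be sketched in plain NumPy: a depthwise convolution whose kernel weights are softmax-normalized and shared across groups of channels (heads). The shapes, padding, and head assignment below are illustrative assumptions for a single sequence, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def lightweight_conv(x, w, num_heads):
    """Lightweight convolution sketch.

    x: (seq_len, channels) input sequence.
    w: (num_heads, kernel_size) raw kernel weights, shared by all
       channels assigned to the same head.
    """
    seq_len, channels = x.shape
    k = w.shape[1]
    wn = softmax(w, axis=-1)          # normalize weights over the kernel
    pad = k // 2                      # same-length output (odd k assumed)
    xp = np.pad(x, ((pad, pad), (0, 0)))
    per_head = channels // num_heads  # tie channels into heads
    out = np.zeros_like(x)
    for c in range(channels):
        h = c // per_head
        for t in range(seq_len):
            # one dot product per position: linear in seq_len
            out[t, c] = np.dot(wn[h], xp[t:t + k, c])
    return out
```

    Because each output position only looks at a fixed-size window, the cost grows linearly with sequence length, in contrast to self-attention's all-pairs comparison.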

  • Paper Discussion: Area Attention

    Discussion of a recent pre-print using a variant of attention called Area Attention: instead of considering single items for attention, why not consider an aggregate over an area of adjacent items? (The paper was rejected from ICLR 2019, but I thought the idea was worth exploring nonetheless.)
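    A minimal NumPy sketch of the idea: enumerate all contiguous spans ("areas") up to a maximum size, pool each one into a single key/value, and attend over the pooled set. Mean pooling and the unscaled dot product below are simplifying assumptions of mine, not the paper's exact formulation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def area_attention(query, keys, values, max_area=2):
    """Attend over all contiguous areas of up to max_area adjacent items.

    Each area's key and value are the means of its members (one simple
    aggregation choice). max_area=1 reduces to standard attention.
    """
    n = len(keys)
    area_keys, area_vals = [], []
    for size in range(1, max_area + 1):
        for start in range(n - size + 1):
            area_keys.append(keys[start:start + size].mean(axis=0))
            area_vals.append(values[start:start + size].mean(axis=0))
    area_keys = np.stack(area_keys)
    area_vals = np.stack(area_vals)
    scores = softmax(area_keys @ query)  # scaling factor omitted for brevity
    return scores @ area_vals
```

    With `max_area=2` and four items, the model attends over seven candidates: the four single items plus the three adjacent pairs.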

  • Paper Discussion: Discrete Generative Models for Sentence Compression

    I will discuss the 2016 paper Language as a Latent Variable: Discrete Generative Models for Sentence Compression. I chose this paper because it combines many important ideas and concepts, such as variational autoencoders, semi-supervised learning, and reinforcement learning.

  • Step-by-Step Text Classification Tutorial using Tensorflow

    In this post, I will walk you through using TensorFlow to classify news articles. Before you begin, you should have TensorFlow, NumPy, and scikit-learn installed.
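    To give a feel for the task before the full TensorFlow walkthrough, here is a tiny baseline using scikit-learn (one of the listed prerequisites): TF-IDF features plus logistic regression. The four toy documents and two labels are stand-ins I made up; the post itself works with real news articles:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-in for a news dataset (labels and texts are illustrative).
texts = [
    "stocks fell sharply on wall street",
    "the team won the championship game",
    "markets rallied after the fed meeting",
    "the striker scored twice in the final",
]
labels = ["business", "sports", "business", "sports"]

# Vectorize with TF-IDF, then fit a linear classifier.
clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(texts, labels)

pred = clf.predict(["the index closed higher on earnings"])
```

    The TensorFlow model in the post replaces this linear classifier with a neural network, but the overall pipeline (vectorize text, fit, predict) has the same shape.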

  • Predicting Movie Genre from Movie Title

    In this post we will attempt an interesting classification problem: predicting a movie's genre from its title alone. Being able to make such a prediction would be very useful; for instance, it could be used to cluster movies by genre. It is also a great way to explore various classification techniques, as well as the very famous word embeddings.
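    One common way to turn a title into a feature vector for such a classifier is to average the word embeddings of its tokens. The tiny two-dimensional embedding table below is a made-up illustration; a real setup would load pretrained vectors such as GloVe or word2vec:

```python
import numpy as np

# Hypothetical 2-d embeddings; real embeddings are 100-300 dimensions.
emb = {
    "space": np.array([1.0, 0.0]),
    "wars":  np.array([0.9, 0.1]),
    "love":  np.array([0.0, 1.0]),
    "story": np.array([0.1, 0.9]),
}

def title_vector(title):
    """Average the embeddings of the known words in a title."""
    vecs = [emb[w] for w in title.lower().split() if w in emb]
    return np.mean(vecs, axis=0) if vecs else np.zeros(2)
```

    The resulting fixed-size vector can then be fed to any off-the-shelf classifier, regardless of how many words the title has.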