Episodes
-
In this episode, we talk about Word2Vec
References:
CS224n: Natural Language Processing with Deep Learning
Efficient Estimation of Word Representations in Vector Space
Topics: CBOW, Skip-Gram, Word Vectors, Word Embeddings
-
In this episode, Abhishek and Shubhi discuss Language Models and how they've evolved from their early stages to the state-of-the-art model BERT.
Link to BERT: 1810.04805.pdf (arxiv.org)
Other Resources:
What is BERT | BERT For Text Classification (analyticsvidhya.com)
Language Model In NLP | Build Language Model in Python (analyticsvidhya.com)
Understanding Word Embeddings: From Word2Vec to Count Vectors (analyticsvidhya.com)
-
In this episode, we talk about the intuition behind Question Answering and go over our favourite papers in this domain from 2020.
Links to the papers:
Unsupervised Question Decomposition for Question Answering
Fluent Response Generation for Conversational Question Answering
-
In this episode, we cover the papers we liked most from ACL 2020 and EMNLP 2020 on the topic of Text Summarisation.
Links to the papers covered here:
What Have We Achieved on Text Summarization?
From Arguments to Key Points: Towards Automatic Argument Summarization
Discourse-Aware Neural Extractive Text Summarization
Towards Zero-Shot Conditional Summarization with Adaptive Multi-Task Fine-Tuning
-
Abhishek and Shubhi talk about 2020 and how difficult it is to find good content on NLP. In this age of information overload, they decide to cut through the clutter and surface some gems of NLP.