In upcoming tutorials on this blog, I will show how to create deep learning models that predict text sequences. Before we get to that point, however, we need to cover some key Natural Language Processing (NLP) ideas. One of the most important is how to efficiently convert words into numeric vectors that can then be "fed into" various machine learning models to perform predictions. The current standard technique for doing this is called "Word2Vec", and it is what this tutorial covers. After discussing the relevant background material, we will implement Word2Vec embeddings using TensorFlow (which makes our lives a lot easier). To get up to speed in TensorFlow, check out my TensorFlow tutorial.
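To make the "words to numeric vectors" idea concrete, here is a minimal sketch using NumPy. The vocabulary, the 3-dimensional embedding size, and the random vectors are all illustrative placeholders; in Word2Vec these vectors are *learned* from data, which is what the rest of this tutorial is about.

```python
import numpy as np

# Toy vocabulary: each word maps to an integer index (these words are
# arbitrary examples, not from any real training corpus).
vocab = {"the": 0, "cat": 1, "sat": 2}

# A hypothetical embedding matrix: one row per word, 3 numbers per row.
# Word2Vec would learn these values; here they are just random placeholders.
rng = np.random.default_rng(0)
embeddings = rng.standard_normal((len(vocab), 3))

def word_to_vector(word):
    """Look up a word's numeric vector via its integer index."""
    return embeddings[vocab[word]]

# Convert a sentence into a stack of vectors that a model could consume.
sentence = ["the", "cat", "sat"]
vectors = np.stack([word_to_vector(w) for w in sentence])
print(vectors.shape)  # one 3-d vector per word -> (3, 3)
```

The point is simply that once every word has a fixed-length numeric vector, any downstream model can operate on those vectors instead of raw strings; Word2Vec's contribution is learning vectors whose geometry reflects word meaning.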
Recommended online course: If you are more of a video course learner, check out this inexpensive Udemy course: Natural Language Processing with Deep Learning in Python