This course is aimed at Machine Learning Engineers.
In this course, students will learn about the most popular architectures, including Recurrent Neural Networks and Hidden Markov Models.
To attend this course, students must have completed the Basic Machine Learning in Tensorflow/Keras module.
It is useful for participants to have the following knowledge:
Basic Deep Learning
● Neurons
● Types of Layers
● Networks
● Loss Functions
● Optimizers
● Overfitting
● Tensorflow
Basic Natural Language Processing
● Tokenization
● Bag of words
● tf-idf
● Stemming
● Lemmatization
● Language models
● Sentiment analysis
Module 1: NLP applications
Module 2: Word vectors
- What are vectors?
- Word analogies
- TF-IDF and t-SNE
- NLTK
- GloVe
- word2vec
- Text classification using word vectors
Hands-on Lab: Perform a basic text classification using multiple word vector models. Improve it by applying basic text processing and language models to prepare the data for machine learning.
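The sketch below is one possible starting point for this lab, assuming pre-trained GloVe vectors loaded through gensim and a scikit-learn classifier; the toy texts, labels, and model choices are illustrative, not the official solution.

import numpy as np
import gensim.downloader as api
from sklearn.linear_model import LogisticRegression

wv = api.load("glove-wiki-gigaword-50")  # pre-trained GloVe vectors (assumed choice)

def doc_vector(text):
    # Average the vectors of in-vocabulary tokens; fall back to zeros if none are known.
    tokens = [t for t in text.lower().split() if t in wv]
    return np.mean([wv[t] for t in tokens], axis=0) if tokens else np.zeros(wv.vector_size)

texts = ["the movie was great", "terrible plot and bad acting"]  # toy training data
labels = [1, 0]                                                   # 1 = positive, 0 = negative

X = np.vstack([doc_vector(t) for t in texts])
clf = LogisticRegression().fit(X, labels)
print(clf.predict([doc_vector("great acting")]))

Comparing embedding models (GloVe vs. word2vec) then amounts to loading a different set of vectors while keeping the same averaging-and-classify pipeline.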
Module 3: Language modeling
- Bigrams
- Language models
- Neural Network Bigram Model
Hands-on Lab: Perform text classification using neural networks based on language models. Understand the probabilistic modeling behind language models, how to widen the context of a word, how synonyms can be generated, and how basic neural networks produce powerful language models.
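As a small illustration of the probabilistic idea behind this module, the snippet below estimates bigram probabilities by relative frequency on a toy corpus; the neural bigram model covered in class replaces these raw counts with learned parameters.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the cat ran .".split()

bigram_counts = defaultdict(Counter)
for w1, w2 in zip(corpus[:-1], corpus[1:]):
    bigram_counts[w1][w2] += 1

def prob(w2, w1):
    # P(w2 | w1) estimated by relative frequency (no smoothing).
    total = sum(bigram_counts[w1].values())
    return bigram_counts[w1][w2] / total if total else 0.0

print(prob("cat", "the"))  # 2/3 on this toy corpus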
Module 4: Word Embeddings
- CBOW
- Skip-Gram
- Negative Sampling
Hands-on Lab: Understand advanced language modeling techniques such as Skip-Gram and Negative Sampling by implementing them, and learn to predict the most likely next word in a conversation.
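A minimal sketch of Skip-Gram with negative sampling, assuming gensim's Word2Vec implementation; the toy sentences and hyperparameters are placeholders rather than the lab's actual settings.

from gensim.models import Word2Vec

sentences = [
    ["deep", "learning", "models", "need", "data"],
    ["language", "models", "predict", "the", "next", "word"],
    ["neural", "networks", "learn", "word", "embeddings"],
]

model = Word2Vec(sentences, vector_size=50, window=2,
                 sg=1,        # 1 selects Skip-Gram (0 would be CBOW)
                 negative=5,  # negative sampling with 5 noise words
                 min_count=1, epochs=50)

# Rough "most likely next word" given some context words:
print(model.predict_output_word(["language", "models"], topn=3))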
Module 5: NLP techniques
- What is POS Tagging?
- POS Tagging Recurrent Neural Network
- POS Tagging Hidden Markov Model (HMM)
- Named Entity Recognition (NER)
- POS vs. NER
Hands-on Lab: Use NLTK and SciPy to improve your classification using grammar rules and POS tagging, then use NER to highlight the most valuable content of a phrase; afterwards, implement summarization.
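The snippet below is only a small orientation for the NLTK part of this lab: off-the-shelf POS tagging and NER on a single sentence. The example sentence is made up, and the resource names downloaded here can differ slightly between NLTK versions.

import nltk

# Resource names may vary slightly across NLTK versions.
for pkg in ["punkt", "averaged_perceptron_tagger", "maxent_ne_chunker", "words"]:
    nltk.download(pkg, quiet=True)

sentence = "Apple opened a new office in Bucharest last year."
tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)   # POS tags, e.g. ('Apple', 'NNP')
tree = nltk.ne_chunk(tagged)    # named-entity chunks built on top of the POS tags

print(tagged)
print([(" ".join(w for w, t in chunk), chunk.label())
       for chunk in tree if hasattr(chunk, "label")])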
Module 6: Recurrent Neural Networks
- LSTM
- GRU
- Text Generation
Hands-on Lab: Implement in Keras a basic RNN architecture for word prediction, using the previously studied word embeddings. Benchmark the performance of LSTM against GRU and BiLSTM.
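One possible shape for the lab's baseline model, assuming tf.keras; the vocabulary size, sequence length and layer widths are illustrative, and the dummy arrays only demonstrate the expected input/output shapes.

import numpy as np
import tensorflow as tf

vocab_size, seq_len, embed_dim = 1000, 10, 64

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.LSTM(128),   # swap in GRU or Bidirectional(LSTM(...)) to benchmark variants
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# Dummy data, only to show the shapes: sequences of word ids -> id of the next word.
X = np.random.randint(0, vocab_size, size=(32, seq_len))
y = np.random.randint(0, vocab_size, size=(32,))
model.fit(X, y, epochs=1, verbose=0)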
Module 7: Generative Neural Networks
Hands-on Lab: Implement in Keras your own generative model that generates lyrics in the style of Shakespeare. Learn to apply Transfer Learning to text.
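A hedged, character-level sketch of the generative idea, assuming tf.keras; the one-line inline text stands in for a real Shakespeare corpus, and the model size and training length are deliberately tiny.

import numpy as np
import tensorflow as tf

text = "shall i compare thee to a summers day thou art more lovely and more temperate "
chars = sorted(set(text))
c2i = {c: i for i, c in enumerate(chars)}
seq_len = 10

# Build (input sequence, next character) training pairs from the toy text.
X = np.array([[c2i[c] for c in text[i:i + seq_len]] for i in range(len(text) - seq_len)])
y = np.array([c2i[text[i + seq_len]] for i in range(len(text) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 16),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=20, verbose=0)

# Sample new text one character at a time from the predicted distribution.
generated = text[:seq_len]
for _ in range(60):
    probs = model.predict(np.array([[c2i[c] for c in generated[-seq_len:]]]), verbose=0)[0]
    probs = probs.astype("float64") / probs.sum()   # renormalize for np.random.choice
    generated += chars[np.random.choice(len(chars), p=probs)]
print(generated)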