SUSTech 2024 Spring CS310 Natural Language Processing by Prof. Yang Xu.
All labs and assignments are based on PyTorch. A GPU is recommended; it can save a great deal of training time.
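A minimal sketch for verifying your PyTorch setup before starting the labs (assuming PyTorch is installed; falls back to CPU when no GPU is found):

```python
import torch

# Pick the GPU if one is visible to PyTorch, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Allocate a small tensor on the chosen device as a smoke test.
x = torch.randn(2, 3, device=device)
print(device, x.shape)
```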
- Python/Neural Network basics & NLP intro
- Language models
- Word2vec
- RNN/LSTM
- Sequence labeling (part-of-speech tagging, named entity recognition)
- Context-free grammar and parsing
- Dependency parsing
- Transformer
- BERT (Pretraining)
- Natural language generation (decoding)
- Instruction tuning
- Prompting and Parameter Efficient Fine-Tuning (PEFT)
- Question answering
- Cognitive science basics
- Neural text classification
- Word2vec with skip-gram + negative sampling
- RNN language model + Bi-LSTM NER (named entity recognition)
- Neural dependency parsing
- BERT pretraining (masked language modeling & next sentence prediction)
- Assignment 6 was planned but canceled.
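To illustrate the skip-gram + negative sampling objective from the Word2vec assignment, here is a minimal PyTorch sketch (illustrative only, not the official starter code; all class and variable names are hypothetical):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkipGramNS(nn.Module):
    """Skip-gram with negative sampling (illustrative sketch)."""

    def __init__(self, vocab_size, dim):
        super().__init__()
        self.center = nn.Embedding(vocab_size, dim)   # center-word vectors
        self.context = nn.Embedding(vocab_size, dim)  # context-word vectors

    def forward(self, center_ids, pos_ids, neg_ids):
        # center_ids: (B,), pos_ids: (B,), neg_ids: (B, K)
        v = self.center(center_ids)       # (B, D)
        u_pos = self.context(pos_ids)     # (B, D)
        u_neg = self.context(neg_ids)     # (B, K, D)
        pos_score = (v * u_pos).sum(-1)   # (B,) dot products with true context
        neg_score = torch.bmm(u_neg, v.unsqueeze(-1)).squeeze(-1)  # (B, K)
        # Maximize log sigmoid(pos) + sum_k log sigmoid(-neg_k),
        # i.e. minimize the negated objective below.
        loss = -(F.logsigmoid(pos_score) + F.logsigmoid(-neg_score).sum(-1))
        return loss.mean()

model = SkipGramNS(vocab_size=100, dim=16)
loss = model(torch.tensor([1, 2]),              # center words
             torch.tensor([3, 4]),              # observed context words
             torch.randint(0, 100, (2, 5)))     # 5 negative samples each
```

The two embedding tables mirror the standard formulation: one matrix for a word as center, one for a word as context.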
minBERT and Downstream Multitasking
Check out our repository here.
SLP3: Speech and Language Processing (3rd edition) by Daniel Jurafsky and James H. Martin