Seminar on Math of NLP
multigrid.org • #NLP #seminar
Stage 1
- Introduction to NLP, Benyou Wang (CUHK-Shenzhen)
- Word Vectors and Word Window Classification
- Dependency Parsing
- Recurrent Neural Networks and Language Models
- Vanishing Gradients, Fancy RNNs, Seq2Seq
- Machine Translation, Attention, Subword Models
- Transformers (a minimal attention sketch follows this list)
- More about Transformers and Pretraining
- Pretrained models: GPT, Llama, …
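For reference, here is a minimal NumPy sketch of the scaled dot-product attention that the Transformers sessions build on; the function name, toy shapes, and random inputs are illustrative choices, not seminar code.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention over (seq_len, d_k)/(seq_len, d_v) arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # pairwise similarities, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted average of values

# Toy usage: self-attention over 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```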
Stage 2
- Natural Language Generation
- Integrating knowledge in language models
- Bias, toxicity, and fairness
- Retrieval-Augmented Models and Knowledge
- ConvNets, Tree Recursive Neural Networks, and Constituency Parsing
- Scaling laws for large models (a formula sketch follows this list)
- Editing Neural Networks
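As a pointer to the kind of result the scaling-laws session discusses, one commonly cited parametric form (the Chinchilla fit of Hoffmann et al., 2022) writes the loss as a function of parameter count N and training tokens D; the symbols below follow that paper, not seminar notes.

```latex
% Chinchilla-style scaling law (Hoffmann et al., 2022):
% predicted loss as a function of model parameters N and training tokens D.
\[
  L(N, D) \;=\; E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}
\]
% E is the irreducible loss; A, B, \alpha, \beta are constants fitted to training runs.
```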