S-IT0068 - Python and PyTorch: Hands-On AI for Language Models
Target Audience
This seminar is designed for intermediate Python programmers and AI practitioners who have basic machine learning knowledge and want to deepen their skills in NLP and PyTorch-based model development.
Objectives
- Understand the Basics of Language Models
- Build and Fine-Tune Custom Models
- Apply Techniques to Solve Real-World NLP Problems
Content
- Introduction to Language Models and Transformers
- Fundamentals of NLP and language models (LMs), covering key concepts like tokenization, embeddings, and attention mechanisms (a short attention sketch follows after this list).
- Overview of Transformer architectures and popular models like BERT and GPT.
- Building a Language Model from Scratch with Python and PyTorch
- Step-by-step guide to implementing a simple language model in PyTorch, covering layers, tokenization, and loss functions.
- Training and evaluating the model on a small dataset to understand performance and limitations (a minimal model-and-training sketch follows after this list).
- Fine-Tuning Pre-trained Models with Python and PyTorch
- Introduction to Hugging Face Transformers and PyTorch Lightning.
- Hands-on exercise: fine-tuning a pre-trained model for a specific task, such as sentiment analysis or text classification (a minimal fine-tuning sketch follows after this list).
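To give a flavor of the attention mechanism covered in the introductory topic block, here is a minimal sketch of scaled dot-product attention in PyTorch. The tensor shapes and random inputs are illustrative assumptions, not seminar material.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value):
    """Scaled dot-product attention as used inside Transformer blocks."""
    d_k = query.size(-1)
    # Similarity scores between every query and key position.
    scores = query @ key.transpose(-2, -1) / d_k**0.5
    # Normalize scores into attention weights.
    weights = F.softmax(scores, dim=-1)
    # Weighted sum of the value vectors.
    return weights @ value, weights

# Illustrative shapes: batch of 2 sequences, 5 tokens, 16-dim embeddings.
x = torch.randn(2, 5, 16)
out, attn = scaled_dot_product_attention(x, x, x)
print(out.shape, attn.shape)  # torch.Size([2, 5, 16]) torch.Size([2, 5, 5])
```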
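For the from-scratch session, here is a minimal sketch of the kind of model and training loop involved: a tiny next-character language model with an embedding layer, a recurrent layer, and a cross-entropy loss. The character-level tokenizer, the GRU layer, and the toy corpus are illustrative assumptions; the seminar may use a different architecture and dataset.

```python
import torch
import torch.nn as nn

# Toy corpus and character-level "tokenizer" (illustrative only).
text = "hello world. hello pytorch."
vocab = sorted(set(text))
stoi = {ch: i for i, ch in enumerate(vocab)}
ids = torch.tensor([stoi[ch] for ch in text])

class TinyLM(nn.Module):
    """A deliberately small next-character language model."""
    def __init__(self, vocab_size, emb_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.head(h)  # logits over the vocabulary at every position

model = TinyLM(len(vocab))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Inputs are all characters except the last; targets are shifted by one.
inputs, targets = ids[:-1].unsqueeze(0), ids[1:].unsqueeze(0)

for step in range(200):
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, len(vocab)), targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.3f}")
```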
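For the fine-tuning session, here is a minimal sketch of the Hugging Face Transformers workflow, framed as sentiment analysis. The model checkpoint (distilbert-base-uncased), the two-sentence toy dataset, and the plain PyTorch training loop are illustrative assumptions; the seminar exercise may instead use PyTorch Lightning or the Trainer API.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Model choice and toy data are illustrative assumptions, not seminar specifics.
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["What a great seminar!", "This was a waste of time."]
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

model.train()
for epoch in range(3):
    outputs = model(**batch, labels=labels)  # loss computed internally
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"epoch {epoch}: loss = {outputs.loss.item():.3f}")

# Quick sanity check on a new sentence.
model.eval()
with torch.no_grad():
    logits = model(**tokenizer("Loved it!", return_tensors="pt")).logits
print("predicted label:", logits.argmax(dim=-1).item())
```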
Prerequisites
Basic programming experience and confidence in writing Python code are required.