Course Schedule

13 Sessions | September - December 2025

Course starts Sep 12, 2025

The course follows a progressive structure from foundational statistical models to cutting-edge neural architectures. Each session builds on previous concepts through hands-on coding exercises and theoretical deep dives. Presentation sessions are mandatory.

Course Structure: Weeks 1-5 cover foundations (N-grams to LSTMs), Weeks 6-9 explore transformers and advanced architectures, Weeks 10-13 focus on practical applications and optimization.

Timeline

12.09.25
Shakespeare and N-Grams
Our Own Shakespeare Sonnet
Understand probabilistic language models and build a text generator using n-gram statistics
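A minimal bigram sketch of the session's idea (the toy corpus and function names are illustrative, not course material): count which word follows which, then generate text by walking those counts.

```python
import random
from collections import defaultdict

def build_bigram_model(text):
    """Map each word to the list of words observed after it (raw counts)."""
    words = text.split()
    model = defaultdict(list)
    for w1, w2 in zip(words, words[1:]):
        model[w1].append(w2)
    return model

def generate(model, start, length=5, seed=0):
    """Generate text by repeatedly sampling a successor of the last word."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        successors = model.get(out[-1])
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "shall i compare thee to a summer day thou art more lovely"
model = build_bigram_model(corpus)
print(generate(model, "shall"))
```

With a real corpus, duplicate successors in the lists make frequent continuations proportionally more likely, which is exactly the n-gram probability estimate.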
19.09.25
Word Embeddings
Introduction to Word Embeddings
Learn how words can be represented as vectors and explore semantic relationships in embedding space
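The semantic relationships mentioned above can be previewed with cosine similarity over hand-made vectors (the 3-d embeddings below are invented for illustration; real embeddings come from training):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 3-d embeddings, made up so the analogy works out
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.5, 0.9, 0.0]),
    "woman": np.array([0.5, 0.1, 0.9]),
}

# The classic analogy: king - man + woman should land near queen
target = emb["king"] - emb["man"] + emb["woman"]
best = max(emb, key=lambda w: cosine(emb[w], target))
print(best)  # queen
```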
26.09.25
Neural Networks
Function Approximation
Master neural network fundamentals: forward propagation, backpropagation, and universal function approximation
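A sketch of forward propagation and backpropagation for a tiny two-layer network (shapes and names are my own choices): the analytic gradient from the chain rule is checked against a finite-difference estimate.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One hidden layer, one output: y = sigmoid(W2 @ sigmoid(W1 @ x))
rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(3, 2)), rng.normal(size=(1, 3))
x, t = np.array([1.0, 0.5]), 1.0   # input and target

def loss(W1, W2):
    h = sigmoid(W1 @ x)
    y = sigmoid(W2 @ h)[0]
    return 0.5 * (y - t) ** 2

# Backprop: apply the chain rule layer by layer, output to input
h = sigmoid(W1 @ x)
y = sigmoid(W2 @ h)[0]
dy = (y - t) * y * (1 - y)          # dL/d(output pre-activation)
gW2 = dy * h[None, :]               # dL/dW2
dh = dy * W2[0] * h * (1 - h)       # dL/d(hidden pre-activation)
gW1 = dh[:, None] * x[None, :]      # dL/dW1

# Verify one entry against a finite-difference estimate
eps = 1e-6
W1p = W1.copy(); W1p[0, 0] += eps
numeric = (loss(W1p, W2) - loss(W1, W2)) / eps
print(abs(numeric - gW1[0, 0]) < 1e-5)
```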
03.10.25
RNNs
Recurrent Neural Networks
Discover how recurrent architectures process sequential data and handle variable-length inputs
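Variable-length handling follows from weight sharing, which a minimal recurrence makes concrete (a sketch with assumed sizes, not a course implementation):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, bh):
    """Process a sequence step by step, carrying a fixed-size hidden state."""
    h = np.zeros(Whh.shape[0])
    for x in xs:
        # The same weights are applied at every step, so the loop
        # works for sequences of any length.
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    return h

rng = np.random.default_rng(1)
Wxh, Whh, bh = rng.normal(size=(4, 3)), rng.normal(size=(4, 4)), np.zeros(4)
seq_short = [rng.normal(size=3) for _ in range(2)]
seq_long = seq_short + [rng.normal(size=3) for _ in range(5)]
# Both sequences yield a hidden state of the same fixed size
print(rnn_forward(seq_short, Wxh, Whh, bh).shape,
      rnn_forward(seq_long, Wxh, Whh, bh).shape)
```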
17.10.25
LSTMs
Long Short-Term Memory
Understand the vanishing gradient problem and how LSTM gates enable long-term memory
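The gating described above can be sketched in a few lines (a single fused weight matrix is assumed for brevity; real implementations add biases and separate matrices):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM step: gates decide what to forget, write, and expose."""
    z = W @ np.concatenate([x, h])
    f, i, o, g = np.split(z, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)   # gates in (0, 1)
    g = np.tanh(g)                                  # candidate cell update
    c = f * c + i * g     # additive cell update: gradients flow through f,
                          # mitigating the vanishing gradient problem
    h = o * np.tanh(c)    # hidden state is a gated view of the cell
    return h, c

rng = np.random.default_rng(0)
n = 4                                   # hidden size
W = rng.normal(size=(4 * n, n + n))     # input size == hidden size here
h, c = np.zeros(n), np.zeros(n)
for _ in range(3):
    h, c = lstm_step(rng.normal(size=n), h, c, W)
print(h.shape, c.shape)
```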
24.10.25
Midterm Presentation
10 min + 10 min Q&A
31.10.25
Sequence-to-Sequence
Predicting the next sentence
Build encoder-decoder models for translation and learn attention mechanisms
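The attention mechanism at the heart of encoder-decoder models can be sketched as dot-product attention over encoder states (the toy keys and values below are hand-picked so the behavior is easy to see):

```python
import numpy as np

def attention(query, keys, values):
    """Dot-product attention: weight encoder states by relevance to the query."""
    scores = keys @ query                      # one score per encoder step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over the sequence
    return weights @ values, weights           # context vector, weights

keys = np.array([[1.0, 0.0], [0.0, 1.0], [0.7, 0.7]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([0.0, 2.0])                   # most similar to the second key
context, w = attention(query, keys, values)
print(int(w.argmax()))                         # attention peaks at step 1
```

The decoder recomputes this context at every output step, letting it look back at different parts of the input sentence instead of compressing everything into one fixed vector.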
07.11.25
Transformers
The Transformer revolution
Master self-attention, multi-head attention, and the transformer architecture
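Scaled dot-product self-attention, the core operation of the transformer, fits in a short sketch (matrix names and sizes are assumptions for illustration):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: every token attends to every token."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)              # (seq, seq) pairwise scores
    scores -= scores.max(axis=-1, keepdims=True)
    A = np.exp(scores)
    A /= A.sum(axis=-1, keepdims=True)         # row-wise softmax
    return A @ V, A

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))                    # 6 tokens, model dim 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
print(out.shape, A.shape)                      # (6, 4) (6, 6)
```

Multi-head attention simply runs several such maps in parallel with independent weight matrices and concatenates the results.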
14.11.25
Multi-Agent LLMs
Multi-Agent Systems
Explore how multiple language models can collaborate to solve complex tasks
21.11.25
Short Presentation, Option 1
5 min + 5 min Q&A
28.11.25
Short Presentation, Option 2
5 min + 5 min Q&A
05.12.25
Decoding Strategies
Text Generation Methods
Compare greedy, beam search, sampling, and nucleus sampling for text generation
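Two of the strategies named above, sketched over a toy logit vector (the numbers are invented so the nucleus set is small): greedy always takes the argmax, while nucleus (top-p) sampling draws only from the smallest set of tokens whose cumulative probability exceeds p.

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

def greedy(logits):
    """Always pick the single most likely token."""
    return int(np.argmax(logits))

def nucleus(logits, p=0.9, rng=None):
    """Top-p sampling: keep the most likely tokens whose cumulative
    probability first exceeds p, renormalize, then sample."""
    if rng is None:
        rng = np.random.default_rng()
    probs = softmax(logits)
    order = np.argsort(probs)[::-1]            # tokens by descending prob
    cum = np.cumsum(probs[order])
    keep = order[: int(np.searchsorted(cum, p)) + 1]
    kept = probs[keep] / probs[keep].sum()
    return int(rng.choice(keep, p=kept))

logits = np.array([3.0, 1.0, 0.5, -2.0])
print(greedy(logits))                          # 0
# With p=0.9 only tokens 0 and 1 survive the cutoff here
picks = {nucleus(logits, 0.9, np.random.default_rng(s)) for s in range(50)}
print(picks)
```

Beam search, in contrast, keeps the k highest-scoring partial sequences at each step rather than committing to one token at a time.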
12.12.25
Final Presentation
15-20 min + 5-10 min Q&A