LLM Basics
Understand the fundamental building blocks of Generative AI. This module covers how LLMs work under the hood, from tokenization to the Transformer architecture.
Module Contents
1. What are LLMs?
Understand LLMs as probabilistic engines. Includes a Next Token Prediction simulator.
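Next-token prediction can be sketched in a few lines: the model assigns a raw score (logit) to every token in its vocabulary, a softmax turns those scores into a probability distribution, and decoding either picks the most likely token (greedy) or samples from the distribution. The prompt and logit values below are invented purely for illustration:

```python
import math
import random

def softmax(logits):
    # Convert raw scores into a probability distribution.
    # Subtracting the max is a standard trick for numerical stability.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: v / total for tok, v in exps.items()}

# Hypothetical logits a model might assign after the prompt "The cat sat on the"
logits = {"mat": 4.0, "sofa": 2.5, "moon": 0.5, "dog": 1.0}
probs = softmax(logits)

# Greedy decoding: always pick the most probable next token.
greedy = max(probs, key=probs.get)

# Sampling: draw a token proportionally to its probability,
# which is what a nonzero "temperature" setting enables in practice.
sampled = random.choices(list(probs), weights=list(probs.values()))[0]
```

A real LLM does exactly this, only over a vocabulary of tens of thousands of tokens, with logits produced by the Transformer rather than hard-coded.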
2. Tokenization
How text becomes numbers. Explore BPE vs Character tokenization with an Interactive Tokenizer Playground.
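The contrast between the two schemes can be sketched directly: character tokenization splits text into single characters, while BPE (Byte Pair Encoding) repeatedly merges the most frequent adjacent pair of tokens into a new, longer token. The loop below is a toy simplification of BPE training (real tokenizers also handle byte-level input, word boundaries, and a fixed vocabulary size):

```python
from collections import Counter

def char_tokenize(text):
    # Character tokenization: every character is its own token.
    return list(text)

def bpe_merge_once(tokens):
    # One BPE training step: find the most frequent adjacent pair
    # and merge every occurrence of it into a single token.
    pairs = Counter(zip(tokens, tokens[1:]))
    if not pairs:
        return tokens
    (a, b), _ = pairs.most_common(1)[0]
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and tokens[i] == a and tokens[i + 1] == b:
            merged.append(a + b)   # e.g. "l" + "o" -> "lo"
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

tokens = char_tokenize("low lower lowest")
for _ in range(3):   # apply a few merge steps
    tokens = bpe_merge_once(tokens)
```

After a few merges the frequent substring "low" becomes a single token, which is exactly why BPE vocabularies compress common words and word fragments so well.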
3. The Transformer Architecture
Dive into the “Attention Is All You Need” paper. Visualize Self-Attention weights interactively.
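The core computation visualized in this chapter, scaled dot-product attention, fits in a short function: each token's query vector is dotted with every key vector, the scores are divided by the square root of the key dimension and softmaxed into weights, and those weights mix the value vectors. A minimal pure-Python sketch with toy 2-dimensional embeddings (Q = K = V, as in self-attention):

```python
import math

def softmax(xs):
    # Turn a list of scores into weights that sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(Q, K, V):
    # Scaled dot-product attention from "Attention Is All You Need":
    #   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)  # one row of the attention matrix
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy input: 3 tokens, 2-dimensional embeddings (values chosen arbitrarily).
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = self_attention(x, x, x)
```

Each output row is a weighted average of the value vectors, so every coordinate stays within the range of the inputs; the `weights` list is exactly what an attention-weight visualization displays.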
Review: Flashcards & Cheat Sheet
Test your knowledge with interactive flashcards and a quick reference guide.
Module Chapters
Chapter 01
What are LLMs?
Chapter 02
Tokenization
Chapter 03
The Transformer Architecture
Chapter 04
Module Review: LLM Basics