Context Window
The maximum number of tokens a language model can process in a single interaction, covering both the input prompt and the generated output. Larger context windows let models handle longer documents and conversations. Context windows have grown from roughly 4K tokens in early GPT models to 200K+ in models such as Claude.
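The budgeting described above can be sketched in code. This is a minimal illustration, not a real tokenizer: it assumes a rough heuristic of about 4 characters per token for English text (actual BPE tokenizers vary), and the `reserved_for_output` headroom value is an arbitrary placeholder.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # Real tokenizers (e.g. BPE) vary widely; this is an approximation only.
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, context_window: int,
                    reserved_for_output: int = 1000) -> bool:
    # The window must hold both the prompt and the model's reply,
    # so reserve some headroom for the generated output.
    return estimate_tokens(prompt) + reserved_for_output <= context_window

prompt = "Summarize this document." * 600  # ~3,600 estimated tokens
print(fits_in_context(prompt, context_window=4_096))    # early-model window
print(fits_in_context(prompt, context_window=200_000))  # large modern window
```

Under these assumptions, the same prompt overflows a 4K window once output headroom is reserved, but fits comfortably in a 200K window.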