Generative Pre-Trained Transformer
Author: Nicolas Sacotte • created on October 22, 2025
A Generative Pre-Trained Transformer (GPT) is an AI model for natural language processing that generates human-like text by predicting one token at a time. It is pre-trained on large, diverse text datasets, which lets it capture context and semantics before being applied to specific tasks. GPT models have many applications, including content creation and customer support.
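As a minimal sketch of how such a model is typically used in practice, the snippet below generates text with a pre-trained GPT-style model through the Hugging Face `transformers` library. The library choice, the "gpt2" checkpoint, and the prompt are illustrative assumptions, not details taken from this article.

```python
# Minimal sketch: text generation with a GPT-style model.
# Assumes the Hugging Face `transformers` library and the small "gpt2"
# checkpoint; both are illustrative choices, not prescribed by the article.
from transformers import pipeline

# Load a pre-trained GPT model wrapped in a text-generation pipeline.
generator = pipeline("text-generation", model="gpt2")

# The model continues the prompt by predicting one token at a time.
result = generator(
    "Customer support example: How do I reset my password?",
    max_new_tokens=50,
    num_return_sequences=1,
)

print(result[0]["generated_text"])
```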