Generative pre-trained transformer

A family of machine learning models based on the transformer architecture that are pre-trained through self-supervised learning on large datasets of unlabelled text. This is currently the predominant architecture for large language models.
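"Self-supervised" here means the training signal comes from the text itself rather than from human-written labels: the model is typically trained to predict each next token given the tokens before it. The sketch below is a minimal illustration of that data setup, assuming the standard next-token-prediction objective; the function name and window size are hypothetical, not taken from any particular implementation.

# Minimal sketch (assumption: next-token prediction, the usual GPT
# self-supervised objective). Unlabelled text yields (input, target)
# pairs with no human annotation: the text supplies its own targets.

def make_training_pairs(token_ids: list[int], context_len: int = 4):
    """Slide a fixed-size window over a token stream; each window's
    target is the same window shifted one position to the right."""
    pairs = []
    for i in range(len(token_ids) - context_len):
        inputs = token_ids[i : i + context_len]
        targets = token_ids[i + 1 : i + context_len + 1]  # shifted by one
        pairs.append((inputs, targets))
    return pairs

# Toy "corpus" already mapped to integer token ids.
corpus = [7, 2, 9, 4, 11, 4, 2]
for x, y in make_training_pairs(corpus):
    print(x, "->", y)

At scale, the same shifted-by-one construction is applied over very large token streams, and the transformer is trained to minimise the cross-entropy between its predicted next-token distribution and the shifted targets.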


Source: NIST AI 100-2e2025