Source: wikibot/foundation-models
= Foundation models
{wiki=Foundation_models}
Foundation models are large-scale machine learning models trained on broad, diverse data that can be adapted to a wide range of downstream tasks, often with little or no task-specific fine-tuning. Models such as GPT (Generative Pre-trained Transformer) and BERT (Bidirectional Encoder Representations from Transformers) serve as a base upon which more specialized models can be built.
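The pretrain-then-adapt pattern described above can be sketched with a deliberately tiny, self-contained toy: a frozen "pretrained" feature extractor (standing in for a real foundation model, which in practice would be a large neural network) is reused unchanged, and only a small task-specific head is trained on top of it. All names here are illustrative, not part of any real library.

```python
def pretrained_features(text):
    """Stand-in for a frozen foundation model: maps text to a fixed
    feature vector (here, simple surface statistics plus a bias term)."""
    n = max(len(text), 1)
    letters = sum(c.isalpha() for c in text)
    digits = sum(c.isdigit() for c in text)
    return [letters / n, digits / n, 1.0]

def train_head(examples, epochs=20, lr=0.5):
    """Adapt to a downstream task by training only a lightweight head
    (a perceptron) on labeled data; the extractor stays untouched."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for text, label in examples:
            x = pretrained_features(text)
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
            err = label - pred
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

def classify(w, text):
    x = pretrained_features(text)
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Downstream task: is a string mostly digits (1) or mostly letters (0)?
data = [("hello", 0), ("world", 0), ("12345", 1), ("90210", 1)]
w = train_head(data)
print(classify(w, "abcdef"), classify(w, "555123"))  # → 0 1
```

In real systems the same division of labor appears at much larger scale: the expensive pretraining is done once on broad data, and adaptation reuses those learned representations, training only a small amount of task-specific machinery (or none at all, in the case of prompting).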