AI Blog

by Michele Laurelli

Foundation Model

/faʊnˈdeɪʃən ˈmɒdəl/
Concept
Definition

Large-scale models trained on broad data that can be adapted to a wide range of downstream tasks.

Foundation models such as GPT, BERT, and CLIP are pre-trained on massive datasets and then fine-tuned for specific applications. They mark a shift in AI development: instead of training a separate model for each task, a single general-purpose model is adapted to many.
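The pre-train-then-adapt pattern can be sketched with a deliberately tiny toy example, purely illustrative (no real learning happens here, and all class names are hypothetical): one shared "backbone" stands in for the expensive pre-trained model, and several lightweight task-specific heads reuse its features.

```python
# Toy sketch of the foundation-model pattern: one shared backbone,
# many cheap task-specific heads. Not a real model; names are invented
# for illustration only.

class Backbone:
    """Stands in for a large pre-trained model mapping raw text to features."""
    def encode(self, text: str) -> list[int]:
        # A real foundation model learns rich features from broad data;
        # here we fake them with trivial surface statistics.
        return [len(text), text.count(" ") + 1]

class CharCountHead:
    """Small head 'fine-tuned' for one downstream task (task A)."""
    def __init__(self, backbone: Backbone):
        self.backbone = backbone  # reuses the shared backbone, not retrained
    def predict(self, text: str) -> int:
        return self.backbone.encode(text)[0]

class WordCountHead:
    """Another head adapted for a different downstream task (task B)."""
    def __init__(self, backbone: Backbone):
        self.backbone = backbone
    def predict(self, text: str) -> int:
        return self.backbone.encode(text)[1]

backbone = Backbone()             # "pre-trained" once, on broad data
chars = CharCountHead(backbone)   # adapted for task A
words = WordCountHead(backbone)   # adapted for task B
```

The design point is that the expensive component is built once and shared, while adaptation per task stays cheap; this is the economics that makes the foundation-model paradigm attractive.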

Examples

1. GPT-4 as a foundation for many applications
2. BERT for NLP tasks
3. SAM (Segment Anything Model) for image segmentation

Michele Laurelli - AI Research & Engineering