AI Blog

by Michele Laurelli

GPT (Generative Pre-trained Transformer)

/dʒiː piː tiː/
Model
Definition

A family of large language models developed by OpenAI that use the transformer architecture for text generation.

GPT models are pre-trained on massive text corpora with self-supervised next-token prediction, then fine-tuned for specific tasks. They excel at text generation, translation, summarization, and question answering. GPT-4, released by OpenAI in 2023, is among the most capable models in the family.
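At inference time, models in this family generate text autoregressively: each new token is predicted from the tokens so far and appended to the context. A minimal sketch of that decoding loop, using a hypothetical bigram lookup table in place of a real transformer (the table, function name, and prompt are illustrative assumptions, not part of any OpenAI API):

```python
# Toy sketch of autoregressive (next-token) generation, the loop
# GPT-style models run at inference time. The "model" here is a
# hypothetical bigram lookup table, not a real transformer: it maps
# the last token to the next one, whereas a real model conditions
# on the whole context window and samples from a distribution.

def generate(model, prompt, max_new_tokens=5):
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        context = tokens[-1]          # real models use the full context
        next_token = model.get(context)
        if next_token is None:        # no known continuation: stop
            break
        tokens.append(next_token)     # append and feed back in
    return " ".join(tokens)

# Hypothetical illustrative "model"
bigram_model = {
    "large": "language",
    "language": "models",
    "models": "generate",
    "generate": "text",
}

print(generate(bigram_model, "large"))  # large language models generate text
```

The key idea the sketch preserves is the feedback loop: each prediction becomes part of the input for the next step, which is why these models are called autoregressive.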

Examples

1. ChatGPT for conversational AI
2. Code generation with GitHub Copilot
3. Creative writing assistance

Michele Laurelli - AI Research & Engineering