- Generative pre-trained transformer - Wikipedia
A generative pre-trained transformer (GPT) is a type of large language model (LLM)[1][2][3] and a prominent framework for generative artificial intelligence.[4][5] It is an artificial neural network that is used in natural language processing.[6]
- GPT-3 - Wikipedia
Generative Pre-trained Transformer 3 (GPT-3) is a large language model released by OpenAI in 2020. Like its predecessor, GPT-2, it is a decoder-only[2] transformer deep neural network, which supersedes recurrence- and convolution-based architectures with a technique known as "attention".[3]
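The "attention" technique named in the snippet above can be sketched in a few lines. This is a minimal NumPy illustration of scaled dot-product attention with a causal mask, the core operation of a decoder-only transformer; the shapes and random inputs are purely illustrative, not taken from any GPT model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(Q, K, V):
    """Scaled dot-product attention with a causal mask,
    as used in decoder-only transformers."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)            # (T, T) pairwise similarities
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -np.inf                     # block attention to future tokens
    return softmax(scores) @ V                 # weighted sum of value vectors

# Toy sequence of T=4 tokens with d=8-dimensional representations.
T, d = 4, 8
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))
out = causal_self_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because of the causal mask, the first token can only attend to itself, so the first output row equals the first value vector; later rows mix information from all earlier positions.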
- ChatGPT - Wikipedia
Generative Pre-trained Transformer 4 is a multimodal large language model trained and created by OpenAI, the fourth in its series of GPT foundation models.[110] It was launched on March 14, 2023,[110] and made publicly available via the paid chatbot product ChatGPT Plus until being replaced in 2025, via OpenAI's API, and via the free …
- Generativer vortrainierter Transformer – Wikipedia
In artificial intelligence (AI), a generative pre-trained transformer (German: generativer vortrainierter Transformer; English: generative pre-trained transformer, GPT) is a large language model (LLM).
- Transformador generativo preentrenado - Wikipedia, la enciclopedia libre
Generative pre-trained transformers (GPT) are a type of large language model (LLM)[1][2][3] and a prominent framework for generative artificial intelligence.[4][5] The first GPT was introduced in 2018 by OpenAI.[6] GPT models are artificial neural networks based on the transformer architecture, pre-tr…
- Transformeur génératif préentraîné — Wikipédia
A generative pre-trained transformer[1] (GPT) is a type of large language model based on the transformer architecture. "Pre-training" consists of predicting the next word in a sequence of text.
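The next-word-prediction objective described above can be made concrete with a toy example. This sketch assumes a tiny made-up vocabulary and hand-picked logits standing in for a model's output; pre-training minimizes the cross-entropy loss of the true next token, as shown at the end.

```python
import math

# Hypothetical five-word vocabulary (illustrative, not from a real tokenizer).
vocab = ["the", "cat", "sat", "on", "mat"]

def softmax(logits):
    # Convert raw scores into a probability distribution over the vocabulary.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Suppose the model, given the prefix "the cat", emits these scores
# over the vocabulary (made-up numbers favoring "sat").
logits = [0.1, 0.2, 2.5, 0.3, 0.1]
probs = softmax(logits)

# Pre-training minimizes the cross-entropy of the actual next token ("sat").
target = vocab.index("sat")
loss = -math.log(probs[target])
print(loss)
```

Training adjusts the model's weights so that, across a large text corpus, losses like this one are driven down, which is equivalent to making the model assign high probability to the words that actually follow each prefix.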
- What is GPT (generative pre-trained transformer)? | IBM
Generative pretrained transformers (GPTs) are a family of large language models (LLMs) based on a transformer deep learning architecture. Developed by OpenAI, these foundation models power ChatGPT and other generative AI applications capable of simulating human-created output.
- Generative pre-trained transformer – Wikipedie
Generative pre-trained transformer, abbreviated GPT (Czech: generativní předtrénovaný transformátor), is a type of large language model that functions as an artificial neural network based on the transformer architecture.