Pre-Trained Transformer
An AI model that has been pre-trained on a large dataset and can be fine-tuned for specific tasks.
Generative Pre-trained Transformer (GPT) is a type of AI model that uses deep learning to generate human-like text based on given input.
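The generation step works autoregressively: the model repeatedly predicts a probability distribution over the next token given the text so far, samples a token, and appends it. The sketch below illustrates only that loop; the hand-written bigram table is a toy stand-in (an assumption for illustration) for the trained transformer a real GPT would use.

```python
import random

# Toy bigram "model" standing in for a trained transformer.
# Keys are the current token; values are next-token probabilities.
BIGRAMS = {
    "the": {"cat": 0.5, "dog": 0.5},
    "cat": {"sat": 1.0},
    "dog": {"ran": 1.0},
    "sat": {"<end>": 1.0},
    "ran": {"<end>": 1.0},
}

def generate(prompt, max_tokens=10, seed=0):
    """Autoregressive sampling loop: predict, sample, append, repeat."""
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(max_tokens):
        dist = BIGRAMS.get(tokens[-1])
        if dist is None:          # unknown context: stop generating
            break
        words = list(dist)
        nxt = rng.choices(words, weights=[dist[w] for w in words])[0]
        if nxt == "<end>":        # sampled an end-of-sequence marker
            break
        tokens.append(nxt)
    return " ".join(tokens)
```

Swapping the bigram lookup for a neural network's next-token distribution recovers the actual GPT generation procedure.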
A learning method that involves teaching a concept to a novice to identify gaps in understanding and reinforce knowledge.
A model architecture built around self-attention, used primarily in natural language processing tasks and known for its efficiency and scalability.
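The core operation of this architecture is scaled dot-product attention: each query scores every key, the scores are normalized with a softmax, and the output is the corresponding weighted average of the values. A minimal pure-Python sketch (vectors as lists, no deep-learning library assumed):

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention over lists of vectors.

    For each query: score all keys, softmax the scores,
    and return the weighted average of the values.
    """
    d = len(K[0])  # key dimension, used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out
```

In practice this runs as batched matrix multiplications, which is where the architecture's efficiency and scalability on parallel hardware come from.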
A cognitive approach that involves meaningful analysis of information, leading to better understanding and retention.
A cognitive architecture model that explains how humans learn and adapt to new tasks.
A theory that suggests the depth of processing (shallow to deep) affects how well information is remembered.
The process of self-examination and adaptation in AI systems, where models evaluate and improve their own outputs or behaviors based on feedback.
A type of artificial intelligence capable of generating new content, such as text, images, and music, by learning from existing data.