Proportionality Bias
The tendency to believe that large or significant events must have large or significant causes. Important for understanding cognitive biases in decision-making and designing systems that present accurate causal relationships.
A cognitive bias where people place too much importance on one aspect of an event, causing errors in judgment. Important for understanding decision-making and designing interfaces that provide balanced information.
A cognitive bias where people focus on the most noticeable or prominent information while ignoring less conspicuous details. Important for understanding user decision-making and ensuring balanced presentation of information.
The tendency to overvalue new innovations and technologies while undervaluing existing or traditional approaches. Important for balanced decision-making and avoiding unnecessary risks in adopting new technologies.
A behavioral economics model that explains decision-making as a conflict between a present-oriented "doer" and a future-oriented "planner". Useful for understanding user decision-making and designing interventions that balance short-term and long-term goals.
A decision-making strategy that involves choosing an option that meets the minimum requirements rather than seeking the optimal solution, balancing effort and outcome. Important for designing user experiences that accommodate decision-making under constraints.
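A minimal sketch of the contrast between satisficing and optimizing, assuming a hypothetical list of options scored on a single quality dimension: the satisficer stops at the first option that clears a threshold, while the optimizer scans everything before deciding.

```typescript
interface Option {
  name: string;
  quality: number; // hypothetical 0-100 quality score
}

// Satisficing: accept the first option that meets the minimum requirement.
function satisfice(options: Option[], threshold: number): Option | undefined {
  return options.find(o => o.quality >= threshold);
}

// Optimizing: examine every option and pick the best one.
function optimize(options: Option[]): Option | undefined {
  return options.reduce<Option | undefined>(
    (best, o) => (best === undefined || o.quality > best.quality ? o : best),
    undefined
  );
}

const flights: Option[] = [
  { name: "Flight A", quality: 62 },
  { name: "Flight B", quality: 71 },
  { name: "Flight C", quality: 95 },
];

console.log(satisfice(flights, 60)?.name); // "Flight A" -- good enough, found quickly
console.log(optimize(flights)?.name);      // "Flight C" -- best, but requires a full search
```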
A cognitive bias where people overestimate the importance of information that is readily available. Essential for designers to understand and mitigate how easily accessible information can disproportionately influence decisions.
A cognitive bias that causes people to overestimate the likelihood of negative outcomes. Important for understanding user risk perception and designing systems that address irrational pessimism.
The tendency to search for, interpret, and remember information in a way that confirms one's preexisting beliefs or hypotheses. Crucial for understanding cognitive biases that affect user decision-making and designing interventions to mitigate them.
A cognitive bias where one negative trait of a person or thing influences the perception of other traits. Important for designing experiences that counteract or mitigate negative biases in user perception.
A cognitive bias where people overestimate the probability of success for difficult tasks and underestimate it for easy tasks. Useful for designers to understand user confidence and design experiences that set realistic expectations.
A cognitive bias where people judge the likelihood of an event based on its relative size rather than absolute probability. Important for understanding user decision-making biases and designing systems that present information accurately.
A cognitive bias where people disproportionately prefer smaller, immediate rewards over larger, later rewards. Important for understanding and designing around user decision-making and reward structures.
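As an illustration, this kind of discounting is commonly modeled as V = A / (1 + kD), where A is the reward amount, D the delay, and k an individual discount rate. The sketch below, using an assumed illustrative value of k, shows how a smaller, sooner reward can feel more valuable than a larger, later one.

```typescript
// Hyperbolic discounting: V = A / (1 + k * D)
// k is an assumed, illustrative discount rate; delayDays is the delay D.
function hyperbolicValue(amount: number, delayDays: number, k = 0.1): number {
  return amount / (1 + k * delayDays);
}

const soon = hyperbolicValue(50, 1);    // $50 tomorrow   -> ~45.5
const later = hyperbolicValue(100, 30); // $100 in 30 days -> 25.0

console.log(soon > later); // true: the smaller, sooner reward is subjectively worth more
```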
A cognitive bias that causes people to believe they are less likely to experience negative events and more likely to experience positive events than others. Crucial for understanding user risk perception and designing systems that account for unrealistic optimism.
The tendency to give more weight to negative experiences or information than positive ones. Crucial for understanding user behavior and designing systems that balance positive and negative feedback.
A cognitive bias where people's decisions are influenced by how information is presented rather than just the information itself. Crucial for designers to minimize bias in how information is presented to users.
A cognitive bias where individuals perceive potential losses more acutely than equivalent gains and therefore tend to avoid risk. Important for understanding decision-making behavior in users and designing systems that mitigate excessive risk aversion.
A cognitive bias where people allow themselves to indulge after doing something positive, believing they have earned it. Important for understanding user behavior and designing systems that account for self-regulation.
The tendency for individuals to favor information that aligns with their existing beliefs and to avoid information that contradicts them. Crucial for understanding how users engage with content and designing systems that present balanced perspectives.
A theory that emphasizes the role of emotions in risk perception and decision-making, where feelings about risk often diverge from cognitive assessments. Important for designing systems that account for emotional responses to risk and improve decision-making.
A design technique that involves showing only essential information initially, revealing additional details as needed to prevent information overload. Crucial for creating user-friendly interfaces that enhance usability and reduce cognitive load.
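A minimal sketch of progressive disclosure in plain DOM code, assuming a hypothetical details panel that stays hidden until the user explicitly asks for more information.

```typescript
// Progressive disclosure: show a summary first; reveal details only on request.
function createDisclosureWidget(summaryText: string, detailText: string): HTMLElement {
  const container = document.createElement("div");

  const summary = document.createElement("p");
  summary.textContent = summaryText;

  const details = document.createElement("p");
  details.textContent = detailText;
  details.hidden = true; // essential information only, by default

  const toggle = document.createElement("button");
  toggle.textContent = "Show details";
  toggle.addEventListener("click", () => {
    details.hidden = !details.hidden;
    toggle.textContent = details.hidden ? "Show details" : "Hide details";
  });

  container.append(summary, toggle, details);
  return container;
}

// Usage sketch: document.body.append(createDisclosureWidget("Summary text", "Full detail text"));
```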
A theory that suggests there is an optimal level of arousal for peak performance, and too much or too little arousal can negatively impact performance. Important for designing experiences that keep users engaged without overwhelming them.
A cognitive bias where individuals overlook or underestimate the cost of opportunities they forgo when making decisions. Crucial for understanding user decision-making behavior and designing systems that highlight opportunity costs.
Information Visualization (InfoVis) is the study and practice of visual representations of abstract data to reinforce human cognition. Crucial for transforming complex data into intuitive visual formats, enabling faster insights and better decision-making.
The tendency to forget information that can be easily found online, also known as digital amnesia. Important for understanding how access to information impacts memory and designing experiences accordingly.
The phenomenon where having too many options leads to anxiety and difficulty making a decision, reducing overall satisfaction. Important for designing user experiences that balance choice and simplicity to enhance satisfaction.
A cognitive bias where people judge harmful actions as worse, or less moral, than equally harmful omissions (inactions). Important for understanding user decision-making and designing systems that mitigate this bias.
A cognitive bias where people prefer familiar things over unfamiliar ones, even if the unfamiliar options are objectively better. Useful for designing interfaces and products that leverage familiar elements to enhance user comfort.
A cognitive bias where people rely too heavily on their own perspective and experiences when making decisions. Important for designers to recognize so they can keep their own perspectives from unduly influencing design decisions.
A cognitive bias where individuals overestimate their own abilities, qualities, or performance relative to others. Important for understanding user self-perception and designing systems that account for inflated self-assessments.
The tendency for people to overestimate their ability to control events. Important for understanding user behavior and designing experiences that manage expectations.
Anchoring (also known as Focalism) is a cognitive bias where individuals rely heavily on the first piece of information (the "anchor") when making decisions. Crucial for understanding and mitigating initial information's impact on user decision-making processes.
The principle stating that there is a limit to the amount of complexity that users can handle, and if designers don't manage complexity, users will. Crucial for designing user-friendly systems that manage complexity effectively.
A cognitive bias that occurs when conclusions are drawn from a non-representative sample, focusing only on successful cases and ignoring failures. Crucial for making accurate assessments and designing systems that consider both successes and failures.
A cognitive bias where individuals overestimate the likelihood of extreme events regressing to the mean. Crucial for understanding decision-making and judgment under uncertainty.
A decision-making strategy where individuals allocate resources proportionally to the probability of an outcome occurring, rather than optimizing the most likely outcome. Important for understanding decision-making behaviors and designing systems that guide better resource allocation.
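A small simulation, under assumed payoff probabilities, contrasting probability matching with always choosing the most likely outcome (maximizing); over many trials the maximizer wins more often, which is what makes matching a behavior worth designing around.

```typescript
// Two outcomes: "A" occurs 70% of the time, "B" 30% of the time (assumed, illustrative).
const P_A = 0.7;

function outcome(): "A" | "B" {
  return Math.random() < P_A ? "A" : "B";
}

function simulate(trials: number): { matching: number; maximizing: number } {
  let matching = 0;
  let maximizing = 0;
  for (let i = 0; i < trials; i++) {
    const result = outcome();
    // Probability matching: choose "A" 70% of the time, "B" 30% of the time.
    const matchChoice = Math.random() < P_A ? "A" : "B";
    if (matchChoice === result) matching++;
    // Maximizing: always choose the most likely outcome, "A".
    if (result === "A") maximizing++;
  }
  return { matching, maximizing };
}

console.log(simulate(10_000));
// Expected hit rates: matching ~58% (0.7*0.7 + 0.3*0.3), maximizing ~70%.
```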
The tendency for negative information to have a greater impact on one's psychological state and processes than neutral or positive information. Important for understanding and mitigating the impact of negative information.
The error of making decisions based solely on quantitative observations and ignoring all other factors. Important for ensuring a holistic approach to decision-making.
A cognitive bias where repeated statements are more likely to be perceived as true, regardless of their actual accuracy. Crucial for understanding how repetition influences beliefs and designing communication strategies for users.
The theory that people adjust their behavior in response to the perceived level of risk, often taking more risks when they feel more protected. Important for designing safety features and understanding behavior changes in response to risk perception.
The study of how people make choices about what and how much to do at various points in time, often involving trade-offs between costs and benefits occurring at different times. Crucial for designing systems that account for delayed gratification and long-term planning.
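For contrast with the hyperbolic sketch above, a standard exponential discounted-utility calculation, with an assumed per-period discount factor, shows how a stream of costs and benefits occurring at different times can be compared on a single present-value scale.

```typescript
// Exponential discounting: the present value of a payoff t periods away is amount * delta^t.
// delta is an assumed, illustrative per-period discount factor (0 < delta < 1).
function presentValue(cashflows: number[], delta = 0.95): number {
  return cashflows.reduce((total, amount, t) => total + amount * Math.pow(delta, t), 0);
}

// Option 1: a small benefit now. Option 2: an upfront cost followed by larger later benefits.
const takeItNow = presentValue([100, 0, 0, 0]);      // 100.0
const investFirst = presentValue([-50, 60, 60, 60]); // -50 + 57 + 54.15 + 51.44 ≈ 112.6

console.log(takeItNow, investFirst); // the delayed option wins once both streams are discounted
```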
A design approach that prioritizes the practical purpose and usability of digital products over purely aesthetic considerations. Important for creating efficient, user-centered designs that effectively fulfill their intended functions.
A strategic framework that designs user experiences to guide behavior and decisions towards desired outcomes. Crucial for creating effective and ethical influence in digital interfaces.