1
Chow JYL, Goldwater MB, Colagiuri B, Livesey EJ. Instruction on the Scientific Method Provides (Some) Protection Against Illusions of Causality. Open Mind (Camb) 2024;8:639-665. [PMID: 38828432] [PMCID: PMC11142631] [DOI: 10.1162/opmi_a_00141]
Abstract
People tend to overestimate the efficacy of an ineffective treatment when they experience the treatment and its supposed outcome co-occurring frequently. This is referred to as the outcome density effect. Here, we attempted to improve the accuracy of participants' assessments of an ineffective treatment by instructing them about the scientific practice of comparing treatment effects against a relevant base-rate, i.e., the outcome rate when no treatment is delivered. The effect of these instructions was assessed both in a trial-by-trial contingency learning task, in which cue administration was either decided by the participant (Experiments 1 & 2) or pre-determined by the experimenter (Experiment 3), and in a summary format, in which all information was presented on a single screen (Experiment 4). Overall, we found two means by which base-rate instructions influence efficacy ratings for the ineffective treatment: (1) when information was presented sequentially, the benefit of base-rate instructions on illusory belief was mediated by reduced sampling of cue-present trials, and (2) when information was presented in summary format, base-rate instruction directly reduced the causal illusion. Together, these findings suggest that simple instructions on the scientific method decreased participants' (over-)weighting of cue-outcome coincidences when making causal judgements, as well as their tendency to over-sample cue-present events. However, the correction of illusory beliefs was incomplete: participants still showed illusory causal judgements when the probability of the outcome occurring was high. Thus, simple textual information about assessing causal relationships is partially effective in influencing people's judgements of treatment efficacy, suggesting an important role for scientific instruction in debiasing cognitive errors.
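The null-contingency designs running through this and several later entries rest on the standard ΔP contingency index. As a minimal illustrative sketch (cell labels follow the field's convention; the numbers are invented, not taken from any study here), ΔP can be computed from a 2×2 table of trial counts:

```python
# Illustrative sketch (not from any study in this list): the standard Delta-P
# contingency index, computed from a 2x2 table of trial counts.
# Conventional cell labels:
#   a = cue present, outcome present    b = cue present, outcome absent
#   c = cue absent,  outcome present    d = cue absent,  outcome absent

def delta_p(a: int, b: int, c: int, d: int) -> float:
    """Delta-P = P(outcome | cue present) - P(outcome | cue absent)."""
    return a / (a + b) - c / (c + d)

# A null-contingency condition with a high outcome base-rate, the kind of
# setting that produces the outcome density effect: the outcome occurs on
# 75% of trials whether or not the treatment is given.
print(delta_p(30, 10, 30, 10))  # prints 0.0
```

In such a condition the objectively correct efficacy judgment is zero, yet participants' ratings typically track the high outcome base-rate instead.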
Affiliation(s)
- Julie Y. L. Chow
- School of Psychology, University of New South Wales, Sydney
- School of Psychology, The University of Sydney, Sydney
- Ben Colagiuri
- School of Psychology, The University of Sydney, Sydney
2
Zhang Y, Rottman BM. Causal learning with delays up to 21 hours. Psychon Bull Rev 2024;31:312-324. [PMID: 37580453] [DOI: 10.3758/s13423-023-02342-x]
Abstract
Considerable delays between causes and effects are common in real life. However, previous studies have investigated only how well people learn probabilistic relations with delays on the order of seconds. In the current study, we tested whether people can learn a cause-effect relation with delays of 0, 3, 9, or 21 hours over a study lasting 16 days. Learning was slower with longer delays, but by the end of the 16 days participants had learned the cause-effect relation about equally well in all four conditions. This suggests that in real-world situations people may still be fairly accurate at inferring cause-effect relations across delays, provided they have enough experience. We also discuss ways that delays may interact with other real-world factors that could complicate learning.
3
Bona SD, Vicovaro M. Does perceptual disfluency affect the illusion of causality? Q J Exp Psychol (Hove) 2024:17470218231220928. [PMID: 38053312] [DOI: 10.1177/17470218231220928]
Abstract
When a subjective experience of difficulty is associated with a mental task, people tend to engage in systematic and deliberative reasoning, which can reduce the usage of intuitive and effortless thinking that gives rise to cognitive biases. One such bias is the illusion of causality, where people perceive a causal link between two unrelated events. In 2019, Díaz-Lago and Matute found that a superficial perceptual feature of the task could modulate the magnitude of the illusion (i.e., a hard-to-read font led to a decrease in the magnitude of the illusion). The present study explored the generalisability of the idea that perceptual disfluency can lead to a decrease in the magnitude of the illusion. In the first experiment, we tested whether a physical-perceptual manipulation of the stimuli, specifically the contrast between the written text and the background, could modulate the illusion in a contingency learning task. The results of the online experiment (N = 200) showed no effect of contrast on the magnitude of the illusion, despite our manipulation having successfully induced task fluency or disfluency. Building upon this null result, our second experiment (N = 100) focused on manipulating the font type, in an attempt to replicate the results obtained by Díaz-Lago and Matute. In contrast to their findings, we found no discernible effect of font type on the magnitude of the illusion, even though this manipulation also effectively induced variations in task fluency or disfluency. These results underscore the notion that not all categories of (dis)fluency in cognitive processing wield a modulatory influence on cognitive biases, and they call for a re-evaluation and a more precise delineation of the (dis)fluency construct.
Affiliation(s)
- Michele Vicovaro
- Department of General Psychology, University of Padova, Padova, Italy
4
Béghin G, Markovits H. Interpretation of ambiguous trials along with reasoning strategy is related to causal judgements in zero-contingency learning. Q J Exp Psychol (Hove) 2023;76:2704-2717. [PMID: 36718805] [PMCID: PMC10663643] [DOI: 10.1177/17470218231155897]
Abstract
The dual strategy model suggests that people can use either a Statistical or a Counterexample reasoning strategy, which reflects two qualitatively different ways of processing information. This model has been shown to capture individual differences in a wide array of tasks, such as contingency learning. Here, we examined whether this extends to individual differences in the interpretation of contingency information where effects are ambiguous. Previous studies, using perceptually complex stimuli, have shown that the way in which participants interpret ambiguous effects predicts causal judgements. In two studies, we attempted to replicate this effect using a small number of clearly identifiable cues. Results show that the interpretation of ambiguous effects as effect present is related to final contingency judgements. In addition, results showed that Statistical reasoners had a stronger tendency to interpret ambiguous effects as effect present than Counterexample reasoners, which mediates the difference in contingency judgements.
Affiliation(s)
- Gaëtan Béghin
- Université du Québec à Montréal, Montreal, QC, Canada
5
Higuchi K, Oyo K, Takahashi T. Causal intuition in the indefinite world: Meta-analysis and simulations. Biosystems 2023;225:104842. [PMID: 36716912] [DOI: 10.1016/j.biosystems.2023.104842]
Abstract
Modeling our causal intuition can contribute to understanding our behavior. In this paper, we introduce a causal induction model called proportion of assumed-to-be rare instances (pARIs) and examine its adaptive properties. We employ the two-stage theory of causal induction proposed by Hattori and Oaksford in 2007, which divides causal induction into two stages: first, observed events are sifted and likely candidates are extracted; second, each of them is verified through intervention. Here, we focus on the first stage. We conducted a meta-analysis and computer simulations in a similar way to Hattori and Oaksford (2007) but with some corrections and improvements. We added two experiments and excluded one in our reconstructed meta-analysis and augmented the simulations by correcting two problems. Our meta-analysis results show that pARIs outperforms more than 40 existing models in terms of fit to data from human causal induction experiments while being simpler. Additionally, our simulation results show that pARIs outperforms the dual-factor heuristic (DFH) of Hattori and Oaksford in terms of population covariation detection, especially under small sample sizes and rarity of events. Overall, pARIs qualifies as one of the best models for the first stage of causal induction. These findings may enable a deeper understanding of our cognitive biases. The first stage can now be considered a causal discovery stage where the topology of causal models is to be hypothesized.
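The abstract does not state the pARIs formula. In the authors' related work it corresponds to the proportion a/(a+b+c) over the three cells in which the cause or the effect occurs; treat that exact form, and the example counts below, as assumptions of this sketch:

```python
# Sketch of the pARIs index, assuming the form a / (a + b + c) from the
# authors' related work (the abstract itself does not give the formula).
# Cell labels for the 2x2 cause-effect table:
#   a = cause & effect, b = cause only, c = effect only, d = neither.

def paris(a: int, b: int, c: int, d: int) -> float:
    """pARIs ignores the d cell (neither cause nor effect), which keeps the
    index cheap to evaluate when both events are rare."""
    return a / (a + b + c)

# Under rarity, the frequent-but-uninformative d cell does not dilute the score.
print(round(paris(8, 2, 2, 188), 3))  # prints 0.667
```

Ignoring the d cell is what the "rarity of events" advantage in the abstract refers to: when both cause and effect are rare, d dominates the table yet carries little evidential value.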
Affiliation(s)
- Kohki Higuchi
- School of Science and Engineering, Tokyo Denki University, Ishizaka, Hatoyama, Hiki, Saitama 350-0394, Japan
- Kuratomo Oyo
- School of Policy Studies, Kwansei Gakuin University, 1 Gakuen Uegahara, Sanda, Hyogo 669-1330, Japan
- Tatsuji Takahashi
- School of Science and Engineering, Tokyo Denki University, Ishizaka, Hatoyama, Hiki, Saitama 350-0394, Japan
6
Reasoning strategies and prior knowledge effects in contingency learning. Mem Cognit 2022;50:1269-1283. [DOI: 10.3758/s13421-022-01319-w]
7
A cognitive model of delusion propensity through dysregulated correlation detection. Schizophr Res 2021;237:93-100. [PMID: 34509105] [DOI: 10.1016/j.schres.2021.08.025]
Abstract
Background: We present a novel account of delusion propensity that integrates the roles of working memory (WM), decision criteria, and information gathering biases. This framework emphasises the role of aberrant correlation detection, which leads to the spurious perception of relationships between one's experiences. The frequency of such outcomes is moderated by the scaling of one's decision criteria which, for reasons discussed, must also account for WM capacity. The proposed dysregulated correlation detection account posits that propensity for delusional ideation is influenced by disturbances in this mechanism.
Methods: Hypotheses were tested using a novel task that required participants (N = 92) to identify correlation between binary manipulations of simple shapes, presented as sequential pairs. Decision criteria and correlation detection were assessed under a Signal Detection Theory framework, while WM capacity was assessed through the Automated Operation Span Task and delusion propensity was measured using the Peters Delusion Inventory. Structural equation modeling was conducted to evaluate the proposed model.
Results: Consistent with the central hypothesis, an interaction between decision criteria and WM was found to contribute significantly to delusion propensity through its effect on correlation detection accuracy. Greater delusion propensity was observed among participants with more liberal decision criteria, which was also in accordance with hypotheses. At the same time, the total effect of WM on delusion propensity was not found to be significant.
Conclusions: These findings provide preliminary support for the proposed dysregulated correlation detection account of propensity for delusional ideation.
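The decision-criterion and detection measures in this abstract come from standard equal-variance Signal Detection Theory. As a generic sketch (textbook formulas, not the paper's code; the hit/false-alarm rates below are invented), sensitivity d' and criterion c can be computed from hit and false-alarm rates:

```python
from statistics import NormalDist

def sdt_measures(hit_rate: float, fa_rate: float) -> tuple[float, float]:
    """Standard equal-variance SDT: sensitivity d' and criterion c.
    A negative c is a liberal criterion (more 'correlation present'
    responses), the pattern the abstract links to delusion propensity."""
    z = NormalDist().inv_cdf  # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical observer who says "correlated" readily: high hits, high false alarms.
d, c = sdt_measures(hit_rate=0.9, fa_rate=0.3)
print(round(d, 2), round(c, 2))  # c comes out negative, i.e. liberal
```

The criterion here is the quantity the account treats as dysregulated: shifting c, rather than changing d', changes how often spurious correlations are "detected".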
8
Anderson JR, Betts S, Bothell D, Lebiere C. Discovering skill. Cogn Psychol 2021;129:101410. [PMID: 34246846] [DOI: 10.1016/j.cogpsych.2021.101410]
Abstract
This paper shows how identical skills can emerge either from instruction or discovery when both result in an understanding of the causal structure of the task domain. The paper focuses on the discovery process, extending the skill acquisition model of Anderson et al. (2019) to address learning by discovery. The discovery process involves exploring the environment and developing associations between discontinuities in the task and events that precede them. The growth of associative strength in ACT-R serves to identify potential causal connections. The model can derive operators from these discovered causal relations just as it does with the instructed causal information. Subjects were given the task of learning to play a video game either with a description of the game's causal structure (Instruction) or without one (Discovery). The Instruction subjects learned faster, but successful Discovery subjects caught up. After 20 3-minute games the behavior of the successful subjects in the two groups was largely indistinguishable. The play of these Discovery subjects jumped in the same discrete way as did the behavior of simulated subjects in the model. These results show how implicit processes (associative learning, control tuning) and explicit processes (causal inference, planning) can combine to produce human learning in complex environments.
Affiliation(s)
- Shawn Betts
- Department of Psychology, Carnegie Mellon, United States
- Daniel Bothell
- Department of Psychology, Carnegie Mellon, United States
9
Béghin G, Gagnon-St-Pierre É, Markovits H. A dual strategy account of individual differences in information processing in contingency judgments. Journal of Cognitive Psychology 2021. [DOI: 10.1080/20445911.2021.1900200]
10
Moreno-Fernández MM, Blanco F, Matute H. The tendency to stop collecting information is linked to illusions of causality. Sci Rep 2021;11:3942. [PMID: 33594129] [PMCID: PMC7887230] [DOI: 10.1038/s41598-021-82075-w]
Abstract
Previous research proposed that cognitive biases contribute to producing and maintaining the symptoms exhibited by deluded patients. Specifically, the tendency to jump to conclusions (i.e., to stop collecting evidence early, before making a decision) has been claimed to contribute to delusion formation. Additionally, deluded patients show an abnormal understanding of cause-effect relationships, often leading to causal illusions (i.e., the belief that two events are causally connected when they are not). Both types of bias appear in psychotic disorders, but also in healthy individuals. In two studies, we test the hypothesis that the two biases (jumping to conclusions and causal illusions) appear in the general population and correlate with each other. The rationale is based on current theories of associative learning that explain causal illusions as the result of a learning bias that tends to wear off as additional information is incorporated. We propose that participants with a higher tendency to jump to conclusions will stop collecting information sooner in a causal learning study than those with a lower tendency to jump to conclusions, which means that the former will not reach the learning asymptote, leading to biased judgments. The studies provide evidence that the two biases are correlated, but suggest that the proposed mechanism is not responsible for this association.
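The asymptote argument can be illustrated with a generic two-cue delta-rule (Rescorla-Wagner-style) learner. This is a sketch of the associative account the abstract alludes to, not the authors' model; the trial sequence, learning rate, and base-rate are invented:

```python
# Generic two-cue delta-rule (Rescorla-Wagner-style) sketch of the
# associative account mentioned in the abstract -- not the authors' model.
# A target cue and an always-present context share a common prediction
# error; the cue's strength rises early and only wears off toward zero
# (the true contingency) as further trials let the context absorb the
# outcome base-rate. Stopping early therefore yields an inflated estimate.

def rw_cue_strength(trials, alpha=0.2):
    """trials: list of (cue_present, outcome) pairs; returns the cue's
    associative strength after each trial."""
    v_cue, v_ctx, history = 0.0, 0.0, []
    for cue, outcome in trials:
        error = outcome - (v_ctx + (v_cue if cue else 0.0))
        v_ctx += alpha * error          # context updated on every trial
        if cue:
            v_cue += alpha * error      # cue updated only when present
        history.append(v_cue)
    return history

# Null contingency, 75% outcome base-rate: P(outcome) = .75 with or without the cue.
block = [(1, 1), (0, 1), (1, 1), (0, 0), (1, 1), (0, 1), (1, 0), (0, 1)]
h = rw_cue_strength(block * 10)
print(round(h[4], 2), round(h[-1], 2))  # early (inflated) vs. late estimate
```

A participant who "jumps to conclusions" reads off something like the early value; only extended sampling lets the estimate settle back toward the true null contingency.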
Affiliation(s)
- María Manuela Moreno-Fernández
- Department of Developmental and Educational Psychology, Faculty of Psychology, University of Granada, Granada, Spain
- Department of Methods and Experimental Psychology, Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
- Fernando Blanco
- Department of Methods and Experimental Psychology, Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
- Department of Social Psychology, Faculty of Psychology, University of Granada, Granada, Spain
- Helena Matute
- Department of Methods and Experimental Psychology, Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
11
Blanco F, Moreno-Fernández MM, Matute H. When Success Is Not Enough: The Symptom Base-Rate Can Influence Judgments of Effectiveness of a Successful Treatment. Front Psychol 2020;11:560273. [PMID: 33192826] [PMCID: PMC7644667] [DOI: 10.3389/fpsyg.2020.560273]
Abstract
Patients’ beliefs about the effectiveness of their treatments are key to the success of any intervention. However, since these beliefs are usually formed by sequentially accumulating evidence in the form of the covariation between the treatment use and the symptoms, it is not always easy to detect when a treatment is actually working. In Experiments 1 and 2, we presented participants with a contingency learning task in which a fictitious treatment was actually effective to reduce the symptoms of fictitious patients. However, the base-rate of the symptoms was manipulated so that, for half of participants, the symptoms were very frequent before the treatment, whereas for the rest of participants, the symptoms were less frequently observed. Although the treatment was equally effective in all cases according to the objective contingency between the treatment and healings, the participants’ beliefs on the effectiveness of the treatment were influenced by the base-rate of the symptoms, so that those who observed frequent symptoms before the treatment tended to produce lower judgments of effectiveness. Experiment 3 showed that participants were probably basing their judgments on an estimate of effectiveness relative to the symptom base-rate, rather than on contingency in absolute terms. Data, materials, and R scripts to reproduce the figures are publicly available at the Open Science Framework: https://osf.io/emzbj/.
Affiliation(s)
- Fernando Blanco
- Faculty of Psychology, University of Granada, Granada, Spain
- Helena Matute
- Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
12
Moreno-Fernández MM, Matute H. Biased Sampling and Causal Estimation of Health-Related Information: Laboratory-Based Experimental Research. J Med Internet Res 2020;22:e17502. [PMID: 32706735] [PMCID: PMC7414405] [DOI: 10.2196/17502]
Abstract
Background: The internet is a relevant source of health-related information. The huge amount of information available on the internet forces users to engage in an active process of information selection. Previous research conducted in the field of experimental psychology showed that information selection itself may promote the development of erroneous beliefs, even if the information collected does not.
Objective: The aim of this study was to assess the relationship between information searching strategy (ie, which cues are used to guide information retrieval) and causal inferences about health while controlling for the effect of additional information features.
Methods: We adapted a standard laboratory task that has previously been used in research on contingency learning to mimic an information searching situation. Participants (N=193) were asked to gather information to determine whether a fictitious drug caused an allergic reaction. They collected individual pieces of evidence to support or reject the causal relationship between the two events by inspecting individual cases in which the drug was or was not used, or in which the allergic reaction appeared or not. Thus, one group (cause group, n=105) was allowed to sample information based on the potential cause, whereas a second group (effect group, n=88) was allowed to sample information based on the effect. Although participants could select which medical records they wanted to check—cases in which the medicine was used or not (in the cause group) or cases in which the effect appeared or not (in the effect group)—they all received similar evidence that indicated the absence of a causal link between the drug and the reaction. After observing 40 cases, they estimated the drug–allergic reaction causal relationship.
Results: Participants used different strategies for collecting information. In some cases, participants displayed a biased sampling strategy compatible with positive testing; that is, they required a high proportion of evidence in which the drug was administered (in the cause group) or in which the allergic reaction appeared (in the effect group). Biased strategies produced an overrepresentation of certain pieces of evidence to the detriment of others, which was associated with the accuracy of causal inferences. Thus, how the information was collected (sampling strategy) had a significant effect on causal inferences (F(1,185)=32.53, P<.001, ηp²=0.15), suggesting that inferences of the causal relationship between events are related to how the information is gathered.
Conclusions: Mistaken beliefs about health may arise from accurate pieces of information, partially because of the way in which information is collected. Patient or person autonomy in gathering health information through the internet, for instance, may contribute to the development of false beliefs from accurate pieces of information because search strategies can be biased.
Affiliation(s)
- María Manuela Moreno-Fernández
- Departamento de Fundamentos y Métodos de la Psicología, Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
- Helena Matute
- Departamento de Fundamentos y Métodos de la Psicología, Faculty of Psychology and Education, University of Deusto, Bilbao, Spain
13
Are the symptoms really remitting? How the subjective interpretation of outcomes can produce an illusion of causality. Judgment and Decision Making 2020. [DOI: 10.1017/s1930297500007506]
Abstract
Judgments of a treatment’s effectiveness are usually biased by the probability with which the outcome (e.g., symptom relief) appears: even when the treatment is completely ineffective (i.e., there is a null contingency between cause and outcome), judgments tend to be higher when outcomes appear with high probability. In this research, we present ambiguous stimuli, expecting to find individual differences in the tendency to interpret them as outcomes. In Experiment 1, judgments of effectiveness of a completely ineffective treatment increased with the spontaneous tendency of participants to interpret ambiguous stimuli as outcome occurrences (i.e., healings). In Experiment 2, this interpretation bias was affected by the overall treatment-outcome contingency, suggesting that the tendency to interpret ambiguous stimuli as outcomes is learned and context-dependent. In conclusion, we show that, to understand how judgments of effectiveness are affected by outcome probability, we need to also take into account the variable tendency of people to interpret ambiguous information as outcome occurrences.
14
Gierth L, Bromme R. Beware of vested interests: Epistemic vigilance improves reasoning about scientific evidence (for some people). PLoS One 2020;15:e0231387. [PMID: 32294109] [PMCID: PMC7159212] [DOI: 10.1371/journal.pone.0231387]
Abstract
In public disputes, stakeholders sometimes misrepresent statistics or other types of scientific evidence to support their claims. One of the reasons this is problematic is that citizens often have neither the motivation nor the cognitive skills to accurately judge the meaning of statistics, and thus run the risk of being misinformed. This study reports an experiment investigating the conditions under which people become vigilant towards a source’s claim and thus reason more carefully about the supporting evidence. For this, participants were presented with a claim by a vested-interest or a neutral source and with statistical evidence which was cited by the source as being in support of the claim. However, this statistical evidence actually contradicted the source’s claim but was presented as a contingency table, a format that is typically difficult for people to interpret correctly. When the source was a lobbyist arguing for his company’s product, people were better at interpreting the evidence compared to when the same source argued against the product. This was not the case for a different vested-interest source, nor for the neutral source. Further, while all sources were rated as less trustworthy when participants realized that the source had misrepresented the evidence, only for the lobbyist source was this seen as a deliberate attempt at deception. Implications for research on epistemic trust, source credibility effects and science communication are discussed.
Affiliation(s)
- Lukas Gierth
- Institute of Psychology, University of Münster, Münster, North Rhine-Westphalia, Germany
- Rainer Bromme
- Institute of Psychology, University of Münster, Münster, North Rhine-Westphalia, Germany
15
A hard to read font reduces the causality bias. Judgment and Decision Making 2019. [DOI: 10.1017/s1930297500004848]
Abstract
Previous studies have demonstrated that fluency affects judgment and decision-making. The purpose of the present research was to investigate the effect of perceptual fluency in a causal learning task that usually induces an illusion of causality in non-contingent conditions. We predicted that a reduction of fluency could improve accuracy in the detection of non-contingency and, therefore, could be used to debias illusory perceptions of causality. Participants were randomly assigned to either an easy-to-read or a hard-to-read condition. Our results showed a strong bias (i.e., overestimation) of causality in those participants who performed the non-contingent task in the easy-to-read font, which replicated the standard causality bias effect. This effect was reduced when the same task was presented in a hard-to-read font. Overall, our results provide evidence for a reduction of the causality bias when presenting the problem in a hard-to-read font. This suggests that perceptual fluency affects causal judgments.
16
Wang M, Zhu M. The Preference for Joint Attributions Over Contrast-Factor Attributions in Causal Contrast Situations. Front Psychol 2019;10:1881. [PMID: 31507479] [PMCID: PMC6716399] [DOI: 10.3389/fpsyg.2019.01881]
Abstract
A current issue in causal attribution is whether people make simple contrast-factor attributions or complex joint attributions in contrast situations. For example, a stone does not dissolve in water and a piece of salt dissolves in water. That the piece of salt dissolves in water is due to: (A) the influence of the piece of salt; (B) the influence of the water; (C) the joint influence of the piece of salt and the water. We propose a mechanism-based sufficiency account for such questions. It argues that causal attributions are guided by mechanism-based explanatory sufficiency, and that people prefer mechanism-based attributions that are explanatorily sufficient. This account predicts the sufficient joint attribution (the C option), whereas the conventional covariation approach predicts the contrast-factor attribution (the A option). Two experiments investigated whether contrast situations affect causal attributions for compound causation with explicit mechanism information and simple causation without explicit mechanism information, respectively. Both experiments found that in both the presence and absence of contrast situations, the majority of participants preferred sufficient joint attributions to simple contrast-factor attributions regardless of whether explicit mechanism information was present, and contrast situations did not affect causal attributions. These findings favor the mechanism-based sufficiency account over the covariation approach and the complexity account. In contrast situations, the predominance of joint attributions implies that explanatory complexity affects causal attributions by modulating explanatory sufficiency, and that people prefer mechanism-based joint attributions that provide sufficient explanations for effects. The present findings go beyond the existing approaches to causal attribution.
Affiliation(s)
- Moyun Wang
- Shaanxi Key Laboratory of Behavior and Cognitive Neuroscience, School of Psychology, Shaanxi Normal University, Xi'an, China
- Mingyi Zhu
- Shaanxi Key Laboratory of Behavior and Cognitive Neuroscience, School of Psychology, Shaanxi Normal University, Xi'an, China
17
Morris A, Phillips J, Gerstenberg T, Cushman F. Quantitative causal selection patterns in token causation. PLoS One 2019;14:e0219704. [PMID: 31369584] [PMCID: PMC6675094] [DOI: 10.1371/journal.pone.0219704]
Abstract
When many events contributed to an outcome, people consistently judge some more causal than others, based in part on the prior probabilities of those events. For instance, when a tree bursts into flames, people judge the lightning strike more of a cause than the presence of oxygen in the air—in part because oxygen is so common, and lightning strikes are so rare. These effects, which play a major role in several prominent theories of token causation, have largely been studied through qualitative manipulations of the prior probabilities. Yet, there is good reason to think that people’s causal judgments are on a continuum—and relatively little is known about how these judgments vary quantitatively as the prior probabilities change. In this paper, we measure people’s causal judgment across parametric manipulations of the prior probabilities of antecedent events. Our experiments replicate previous qualitative findings, and also reveal several novel patterns that are not well-described by existing theories.
Affiliation(s)
- Adam Morris, Psychology Department, Harvard University, Cambridge, MA, United States of America
- Jonathan Phillips, Psychology Department, Harvard University, Cambridge, MA, United States of America
- Tobias Gerstenberg, Psychology Department, Stanford University, Stanford, CA, United States of America
- Fiery Cushman, Psychology Department, Harvard University, Cambridge, MA, United States of America
18
Blanco F, Matute H. Base-rate expectations modulate the causal illusion. PLoS One 2019; 14:e0212615. [PMID: 30835775 PMCID: PMC6400408 DOI: 10.1371/journal.pone.0212615]
Abstract
Previous research revealed that people's judgments of causality between a target cause and an outcome in null contingency settings can be biased by various factors, leading to causal illusions (i.e., incorrectly reporting a causal relationship where there is none). In two experiments, we examined whether this causal illusion is sensitive to prior expectations about base-rates. Thus, we pretrained participants to expect either a high outcome base-rate (Experiment 1) or a low outcome base-rate (Experiment 2). This pretraining was followed by a standard contingency task in which the target cause and the outcome were not contingent with each other (i.e., there was no causal relation between them). Subsequent causal judgments were affected by the pretraining: When the outcome base-rate was expected to be high, the causal illusion was reduced, and the opposite was observed when the outcome base-rate was expected to be low. The results are discussed in the light of several explanatory accounts (associative and computational). A rational account of contingency learning based on the evidential value of information can predict our findings.
Affiliation(s)
- Fernando Blanco, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
- Helena Matute, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
19
Barberia I, Vadillo MA, Rodríguez-Ferreiro J. Persistence of Causal Illusions After Extensive Training. Front Psychol 2019; 10:24. [PMID: 30733692 PMCID: PMC6353834 DOI: 10.3389/fpsyg.2019.00024]
Abstract
We carried out an experiment using a conventional causal learning task but extending the number of learning trials participants were exposed to. Participants in the standard training group were exposed to 48 learning trials before being asked about the potential causal relationship under examination, whereas for participants in the long training group the length of training was extended to 288 trials. In both groups, the event acting as the potential cause had zero correlation with the occurrence of the outcome, but both the outcome density and the cause density were high, therefore providing a breeding ground for the emergence of a causal illusion. In contradiction to the predictions of associative models such as the Rescorla-Wagner model, we found moderate evidence against the hypothesis that extending the learning phase alters the causal illusion. However, assessing causal impressions recurrently did weaken participants' causal illusions.
Affiliation(s)
- Itxaso Barberia, Departament de Cognició, Desenvolupament y Psicologia de la Educació, Universitat de Barcelona, Barcelona, Spain
- Miguel A Vadillo, Departamento de Psicología Básica, Universidad Autónoma de Madrid, Madrid, Spain
- Javier Rodríguez-Ferreiro, Departament de Cognició, Desenvolupament y Psicologia de la Educació, Universitat de Barcelona, Barcelona, Spain; Institut de Neurociències, Universitat de Barcelona, Barcelona, Spain
20
Wang M, Yin P. What Is Transferred in Causal Generalization Across Contexts? Exp Psychol 2018; 65:306-313. [PMID: 30232939 DOI: 10.1027/1618-3169/a000413]
Abstract
The covariation and causal power accounts of causal induction make different predictions about what is transferred in causal generalization across contexts. Two experiments tested these predictions using hypothetical scenarios in which the effect of an intervention was evaluated between (Experiment 1) or within (Experiment 2) groups. Each experiment contained a manipulation of ΔP, causal power, and their combination. Both experiments found that causal transfer was determined by ΔP rather than causal power. The overall transfer pattern supports the ΔP transfer account over the other transfer accounts. Causal transfers based on ΔP are irrational, violating the coherence criterion of the causal power framework. The ΔP transfer is consistent with previous findings that ΔP is a primary, non-normative mental measure of causal strength in causal induction.
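The two causal-strength measures contrasted in this abstract have standard definitions over a 2×2 contingency table. A minimal sketch in Python; the cell counts below are invented for illustration and are not taken from the experiments:

```python
def delta_p(a, b, c, d):
    """ΔP = P(e|c) - P(e|~c) from 2x2 cell counts:
    a: cause & effect, b: cause only, c: effect only, d: neither."""
    return a / (a + b) - c / (c + d)

def causal_power(a, b, c, d):
    """Cheng's generative causal power: ΔP scaled by the headroom left
    by the outcome base rate, power = ΔP / (1 - P(e|~c))."""
    return delta_p(a, b, c, d) / (1 - c / (c + d))

# Equal ΔP but different base rates: the two accounts dissociate, which is
# what lets transfer experiments ask which quantity people carry across contexts.
low_base = (6, 4, 2, 8)    # P(e|c) = 0.6, P(e|~c) = 0.2
high_base = (9, 1, 5, 5)   # P(e|c) = 0.9, P(e|~c) = 0.5
print(round(delta_p(*low_base), 3), round(causal_power(*low_base), 3))    # 0.4 0.5
print(round(delta_p(*high_base), 3), round(causal_power(*high_base), 3))  # 0.4 0.8
```

With ΔP held at 0.4 in both conditions, causal power still differs (0.5 vs 0.8), so transfer patterns can distinguish the accounts.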
Affiliation(s)
- Moyun Wang, School of Psychology, Shaanxi Normal University, Xi'an, China
- Pengfei Yin, School of Psychology, Shaanxi Normal University, Xi'an, China
21
Blanco F, Gómez-Fortes B, Matute H. Causal Illusions in the Service of Political Attitudes in Spain and the United Kingdom. Front Psychol 2018; 9:1033. [PMID: 30002636 PMCID: PMC6032155 DOI: 10.3389/fpsyg.2018.01033]
Abstract
The causal illusion is a cognitive bias that results in the perception of causality where there is no supporting evidence. We show that people selectively exhibit the bias, especially in those situations where it favors their current worldview as revealed by their political orientation. In our two experiments (one conducted in Spain and one conducted in the United Kingdom), participants who positioned themselves on the ideological left formed the illusion that a left-wing ruling party was more successful in improving city indicators than a right-wing party, while participants on the ideological right tended to show the opposite pattern. In sum, despite the fact that the same information was presented to all participants, people developed the causal illusion bias selectively, providing very different interpretations that aligned with their previous attitudes. This result occurs in situations where participants inspect the relationship between the government's actions and positive outcomes (improving city indicators) but not when the outcomes are negative (worsening city indicators).
Affiliation(s)
- Fernando Blanco, Departamento de Fundamentos y Métodos de la Psicología, University of Deusto, Bilbao, Spain
- Helena Matute, Departamento de Fundamentos y Métodos de la Psicología, University of Deusto, Bilbao, Spain
22
A comparator-hypothesis account of biased contingency detection. Behav Processes 2018; 154:45-51. [PMID: 29447853 DOI: 10.1016/j.beproc.2018.02.009]
Abstract
Our ability to detect statistical dependencies between different events in the environment is strongly biased by the number of coincidences between them. Even when there is no true covariation between a cue and an outcome, if the marginal probability of either of them is high, people tend to perceive some degree of statistical contingency between both events. The present paper explores the ability of the Comparator Hypothesis to explain the general pattern of results observed in this literature. Our simulations show that this model can account for the biasing effects of the marginal probabilities of cues and outcomes. Furthermore, the overall fit of the Comparator Hypothesis to a sample of experimental conditions from previous studies is comparable to that of the popular Rescorla-Wagner model. These results should encourage researchers to further explore and put to the test the predictions of the Comparator Hypothesis in the domain of biased contingency detection.
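For readers unfamiliar with the Rescorla-Wagner model this abstract compares against, a deterministic expected-value sketch (the learning rate and trial numbers below are arbitrary choices, not the paper's simulation settings) shows how it reproduces the outcome-density bias at zero contingency: a constantly present context competes with the cue, yet after limited training the cue retains positive associative strength that scales with the outcome's marginal probability.

```python
def rescorla_wagner_expected(p_outcome, n_blocks=12, alpha_beta=0.24, lam=1.0):
    """Expected-value Rescorla-Wagner updates for a zero-contingency cue.
    Each block pairs one cue+context trial with one context-only trial, and
    the outcome probability is p_outcome on both, so ΔP = 0 by construction."""
    v_cue = v_ctx = 0.0
    for _ in range(n_blocks):
        # Cue-present trial: cue and context share the prediction error.
        error = lam * p_outcome - (v_cue + v_ctx)
        v_cue += alpha_beta * error
        v_ctx += alpha_beta * error
        # Cue-absent trial: only the context is updated.
        error = lam * p_outcome - v_ctx
        v_ctx += alpha_beta * error
    return v_cue

# Pre-asymptotically the cue's strength is positive and grows with outcome
# density, even though the cue predicts nothing: a modeled causal illusion.
print(rescorla_wagner_expected(0.8) > rescorla_wagner_expected(0.2) > 0)  # True
```

Because every update is linear in p_outcome, the residual cue strength scales exactly with outcome density in this deterministic version.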
23
Abstract
The purpose of this research is to investigate the impact of a foreign language on the causality bias (i.e., the illusion that two events are causally related when they are not). We predict that using a foreign language could reduce the illusions of causality. A total of 36 native English speakers participated in Experiment 1, and 80 native Spanish speakers in Experiment 2. They performed a standard contingency learning task, which can be used to detect causal illusions. Participants who performed the task in their native tongue replicated the illusion of causality effect, whereas those performing the task in their foreign language were more accurate in detecting that the two events were causally unrelated. Our results suggest that presenting the information in a foreign language could be used as a strategy to debias individuals against causal illusions, thereby facilitating more accurate judgements and decisions in non-contingent situations. They also contribute to the debate on the nature and underlying mechanisms of the foreign language effect, given that the illusion of causality is rooted in basic associative processes.
24
Statistical numeracy as a moderator of (pseudo)contingency effects on decision behavior. Acta Psychol (Amst) 2017; 174:68-79. [PMID: 28189707 DOI: 10.1016/j.actpsy.2017.01.002]
Abstract
Pseudocontingencies denote contingency estimates inferred from base rates rather than from cell frequencies. We examined the role of statistical numeracy in the effects of such fallible but adaptive inferences on choice behavior. In Experiment 1, we provided information on single observations as well as on base rates and tracked participants' eye movements. In Experiment 2, we manipulated the availability of information on cell frequencies and base rates between conditions. Our results demonstrate that a focus on base rates rather than cell frequencies promotes pseudocontingency effects. Learners who are more proficient in (conditional) probability calculation, though, prefer to rely on cell frequencies to judge contingencies, as was evident from their gaze behavior. If cell frequencies are available in summarized format, such learners may infer the true contingency between options and outcomes. Otherwise, however, even highly numerate learners are susceptible to pseudocontingency effects.
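A pseudocontingency in the sense used in this abstract can be made concrete with invented counts (illustrative only, not the paper's materials): when both base rates are skewed the same way, pairing the two frequent levels suggests a positive contingency, while the cell frequencies actually support a negative one.

```python
def delta_p(a, b, c, d):
    """True contingency computed from the 2x2 cell frequencies."""
    return a / (a + b) - c / (c + d)

# Invented sample: option A chosen on 80 of 100 occasions, positive outcomes
# on 80 of 100 occasions. The skewed base rates invite the pseudocontingency
# inference "A goes with success".
a, b, c, d = 62, 18, 18, 2   # A&positive, A&negative, B&positive, B&negative

base_rate_option_a = (a + b) / (a + b + c + d)  # 0.8
base_rate_positive = (a + c) / (a + b + c + d)  # 0.8
print(base_rate_option_a, base_rate_positive)

# The cell frequencies tell the opposite story: choosing A slightly
# lowers the chance of a positive outcome.
print(round(delta_p(a, b, c, d), 3))  # -0.125
```

Only a learner who attends to the cell frequencies, rather than the base rates, can recover the negative contingency here.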
25
White PA. Causal judgments about empirical information in an interrupted time series design. Q J Exp Psychol (Hove) 2017; 70:18-35. [DOI: 10.1080/17470218.2015.1115886]
Abstract
Empirical information available for causal judgment in everyday life tends to take the form of quasi-experimental designs, lacking control groups, more than the form of contingency information that is usually presented in experiments. Stimuli were presented in which values of an outcome variable for a single individual were recorded over six time periods, and an intervention was introduced between the fifth and sixth time periods. Participants judged whether and how much the intervention affected the outcome. With numerical stimulus information, judgments were higher for a pre-intervention profile in which all values were the same than for pre-intervention profiles with any other kind of trend. With graphical stimulus information, judgments were more sensitive to trends, tending to be higher when an increase after the intervention was preceded by a decreasing series than when it was preceded by an increasing series ending on the same value at the fifth time period. It is suggested that a feature-analytic model, in which the salience of different features of information varies between presentation formats, may provide the best prospect of explaining the results.
Affiliation(s)
- Peter A. White, School of Psychology, Cardiff University, Cardiff, Wales, UK
26
Abstract
Causal learning is the ability to progressively incorporate raw information about dependencies between events, or between one's behavior and its outcomes, into beliefs of the causal structure of the world. In spite of the fact that some cognitive biases in gambling disorder can be described as alterations of causal learning involving gambling-relevant cues, behaviors, and outcomes, general causal learning mechanisms in gamblers have not been systematically investigated. In the present study, we compared gambling disorder patients against controls in an instrumental causal learning task. Evidence of illusion of control, namely, overestimation of the relationship between one's behavior and an uncorrelated outcome, showed up only in gamblers with strong current symptoms. Interestingly, this effect was part of a more complex pattern, in which gambling disorder patients manifested a poorer ability to discriminate between null and positive contingencies. Additionally, anomalies were related to gambling severity and current gambling disorder symptoms. Gambling-related biases, as measured by a standard psychometric tool, correlated with performance in the causal learning task, but not in the expected direction. Indeed, performance of gamblers with stronger biases tended to resemble that of controls, which could imply that anomalies of causal learning processes play a role in gambling disorder, but do not seem to underlie gambling-specific biases, at least in a simple, direct way.
27
Causal competition based on generic priors. Cogn Psychol 2016; 86:62-86. [DOI: 10.1016/j.cogpsych.2016.02.001]
28
Interactive effects of the probability of the cue and the probability of the outcome on the overestimation of null contingency. Learn Behav 2016; 41:333-40. [PMID: 23529636 DOI: 10.3758/s13420-013-0108-8]
Abstract
Overestimations of null contingencies between a cue, C, and an outcome, O, are widely reported effects that can arise for multiple reasons. For instance, a high probability of the cue, P(C), and a high probability of the outcome, P(O), are conditions that promote such overestimations. In two experiments, participants were asked to judge the contingency between a cue and an outcome. Both P(C) and P(O) were given extreme values (high and low) in a factorial design, while maintaining the contingency between the two events at zero. While we were able to observe main effects of the probability of each event, our experiments showed that the cue- and outcome-density biases interacted such that a high probability of the two stimuli enhanced the overestimation beyond the effects observed when only one of the two events was frequent. This evidence can be used to better understand certain societal issues, such as belief in pseudoscience, that can be the result of overestimations of null contingencies in high-P(C) or high-P(O) situations.
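The factorial design described here is easy to reproduce numerically. A small sketch (counts constructed to match the design, not the paper's actual stimuli) shows that when both P(C) and P(O) are high, cue-outcome coincidences dominate the sample even though the programmed contingency is exactly zero:

```python
from itertools import product

def independent_counts(p_cue, p_outcome, n=100):
    """Exact 2x2 cell counts for a statistically independent cue and outcome."""
    a = round(n * p_cue * p_outcome)        # cue and outcome coincide
    b = round(n * p_cue * (1 - p_outcome))  # cue alone
    c = round(n * (1 - p_cue) * p_outcome)  # outcome alone
    d = n - a - b - c                       # neither
    return a, b, c, d

def delta_p(a, b, c, d):
    return a / (a + b) - c / (c + d)

for p_cue, p_out in product([0.8, 0.2], repeat=2):
    a, b, c, d = independent_counts(p_cue, p_out)
    print(f"P(C)={p_cue}, P(O)={p_out}: coincidences={a}, ΔP={delta_p(a, b, c, d)}")
# The high/high condition yields 64 coincidences versus 4 in the low/low
# condition, while ΔP is 0 in every condition.
```

Judgments that track the coincidence cell rather than ΔP will therefore overestimate most strongly in the high-P(C), high-P(O) condition, which is the interaction the abstract reports.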
29
Mayrhofer R, Waldmann MR. Sufficiency and Necessity Assumptions in Causal Structure Induction. Cogn Sci 2015; 40:2137-2150. [DOI: 10.1111/cogs.12318]
30
Adapting to an Uncertain World: Cognitive Capacity and Causal Reasoning with Ambiguous Observations. PLoS One 2015; 10:e0140608. [PMID: 26468653 PMCID: PMC4607167 DOI: 10.1371/journal.pone.0140608]
Abstract
Ambiguous causal evidence in which the covariance of the cause and effect is partially known is pervasive in real life situations. Little is known about how people reason about causal associations with ambiguous information and the underlying cognitive mechanisms. This paper presents three experiments exploring the cognitive mechanisms of causal reasoning with ambiguous observations. Results revealed that the influence of ambiguous observations (manifested as missing information) on causal reasoning depended on the availability of cognitive resources, suggesting that processing ambiguous information may involve deliberative cognitive processes. Experiment 1 demonstrated that subjects did not ignore the ambiguous observations in causal reasoning. They also had a general tendency to treat the ambiguous observations as negative evidence against the causal association. Experiment 2 and Experiment 3 included a causal learning task requiring a high cognitive demand in which paired stimuli were presented to subjects sequentially. Both experiments revealed that processing ambiguous or missing observations can depend on the availability of cognitive resources. Experiment 2 suggested that the contribution of working memory capacity to the comprehensiveness of evidence retention was reduced when there were ambiguous or missing observations. Experiment 3 demonstrated that an increase in cognitive demand due to a change in the task format reduced subjects' tendency to treat ambiguous-missing observations as negative cues.
31
Blanco F, Barberia I, Matute H. Individuals Who Believe in the Paranormal Expose Themselves to Biased Information and Develop More Causal Illusions than Nonbelievers in the Laboratory. PLoS One 2015; 10:e0131378. [PMID: 26177025 PMCID: PMC4503786 DOI: 10.1371/journal.pone.0131378]
Abstract
In the reasoning literature, paranormal beliefs have been proposed to be linked to two related phenomena: a biased perception of causality and a biased information-sampling strategy (believers tend to test fewer hypotheses and prefer confirmatory information). In parallel, recent contingency learning studies showed that, when two unrelated events coincide frequently, individuals interpret this ambiguous pattern as evidence of a causal relationship. Moreover, the latter studies indicate that sampling more cause-present cases than cause-absent cases strengthens the illusion. If paranormal believers actually exhibit a biased exposure to the available information, they should also show this bias in the contingency learning task: they would in fact expose themselves to more cause-present cases than cause-absent trials. Thus, by combining the two traditions, we predicted that believers in the paranormal would be more vulnerable to developing causal illusions in the laboratory than nonbelievers because there is a bias in the information they experience. In this study, we found that paranormal beliefs (measured using a questionnaire) correlated with causal illusions (assessed by using contingency judgments). As expected, this correlation was mediated entirely by the believers' tendency to expose themselves to more cause-present cases. The association between paranormal beliefs, biased exposure to information, and causal illusions was only observed for ambiguous materials (i.e., the noncontingent condition). In contrast, the participants' ability to detect causal relationships which did exist (i.e., the contingent condition) was unaffected by their susceptibility to believe in paranormal phenomena.
Affiliation(s)
- Fernando Blanco, Labpsico, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
- Itxaso Barberia, The Event Lab, Facultat de Psicologia, Universitat de Barcelona, Barcelona, Spain; Departament de Psicologia Bàsica, Facultat de Psicologia, Universitat de Barcelona, Barcelona, Spain
- Helena Matute, Labpsico, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
32
Shou Y, Smithson M. Effects of question formats on causal judgments and model evaluation. Front Psychol 2015; 6:467. [PMID: 25954225 PMCID: PMC4404718 DOI: 10.3389/fpsyg.2015.00467]
Abstract
Evaluation of causal reasoning models depends on how well the subjects' causal beliefs are assessed. Elicitation of causal beliefs is determined by the experimental questions put to subjects. We examined the impact of question formats commonly used in causal reasoning research on participants' responses. The results of our experiment (Study 1) demonstrate that both the mean and homogeneity of the responses can be substantially influenced by the type of question (structure induction versus strength estimation versus prediction). Study 2A demonstrates that subjects' responses to a question requiring them to predict the effect of a candidate cause can be significantly lower and more heterogeneous than their responses to a question asking them to diagnose a cause when given an effect. Study 2B suggests that diagnostic reasoning can strongly benefit from cues relating to temporal precedence of the cause in the question. Finally, we evaluated 16 variations of recent computational models and found the model fitting was substantially influenced by the type of questions. Our results show that future research in causal reasoning should place a high priority on disentangling the effects of question formats from the effects of experimental manipulations, because that will enable comparisons between models of causal reasoning uncontaminated by method artifact.
Affiliation(s)
- Yiyun Shou, Research School of Psychology, The Australian National University, Canberra, ACT, Australia
- Michael Smithson, Research School of Psychology, The Australian National University, Canberra, ACT, Australia
33
Yarritu I, Matute H. Previous knowledge can induce an illusion of causality through actively biasing behavior. Front Psychol 2015; 6:389. [PMID: 25904883 PMCID: PMC4389369 DOI: 10.3389/fpsyg.2015.00389]
Abstract
It is generally assumed that the way people assess the relationship between a cause and an outcome is closely related to the actual evidence existing about the co-occurrence of these events. However, people's estimations are often biased, and this usually translates into illusions of causality. Some have suggested that such illusions could be the result of previous knowledge-based expectations. In the present research we explored the role that previous knowledge has in the development of illusions of causality. We propose that previous knowledge influences the assessment of causality by influencing the decisions about responding or not (i.e., presence or absence of the potential cause), which biases the information people are exposed to, and this in turn produces illusions congruent with such biased information. In a non-contingent situation in which participants decided whether the potential cause was present or absent (Experiment 1), the influence of expectations on participants' judgments was mediated by the probability of occurrence of the potential cause (determined by participants' responses). However, in an identical situation, except that the participants were not allowed to decide the occurrence of the potential cause (Experiment 2), only the probability of the cause was significant, not the expectations or the interaction. Together, these results support our hypothesis that knowledge-based expectations affect the development of causal illusions by the mediation of behavior, which biases the information received.
Affiliation(s)
- Ion Yarritu, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
- Helena Matute, Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
34
White PA. Causal judgements about temporal sequences of events in single individuals. Q J Exp Psychol (Hove) 2015; 68:2149-74. [PMID: 25728947 DOI: 10.1080/17470218.2015.1009475]
Abstract
Stimuli were presented in which values of an outcome variable for a single individual were recorded over 24 time periods, and an intervention was introduced at one of the time periods. Participants judged whether and how much the intervention affected the outcome. Judgements were affected by manipulations of the temporal relation between the intervention and a gradual increase in values on the outcome variable, by the size of the increase, by the time taken for the increase to occur, and by variance in the preincrease data. Most results were predicted by a simple model in which the mean outcome value for the preintervention time periods is subtracted from the mean outcome value for the postintervention time periods, though there was also an effect of temporal contiguity that is not predicted by the simple model. This form of information, which is a kind of quasiexperimental design, is more representative of the kind of information generally available for causal judgement than the more commonly investigated binary variables in which the cause is either present or absent, and the outcome either occurs or does not; as such, it is more revealing of how causal judgements are made under the conditions that prevail in the world.
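The "simple model" this abstract credits with most of the results is just a difference of means across the intervention point. A sketch with an invented 24-period series (not the paper's stimuli):

```python
def mean_difference(series, intervention_index):
    """White's simple judgment model: mean post-intervention outcome value
    minus mean pre-intervention value. intervention_index is the first
    time period after the intervention."""
    pre = series[:intervention_index]
    post = series[intervention_index:]
    return sum(post) / len(post) - sum(pre) / len(pre)

# 24 time periods with an intervention before period 13 (index 12)
# that shifts the level up by 3 units.
series = [10, 11, 10, 9, 10, 11, 10, 10, 9, 10, 11, 10,
          13, 14, 13, 12, 13, 14, 13, 13, 12, 13, 14, 13]
print(mean_difference(series, 12))  # recovers the +3 level shift
```

Note that this model is blind to temporal contiguity: it returns the same value wherever the post-intervention change occurs, which is why the contiguity effect reported above falls outside it.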
Affiliation(s)
- Peter A White, School of Psychology, Cardiff University, Cardiff, UK
35
Yeung S, Griffiths TL. Identifying expectations about the strength of causal relationships. Cogn Psychol 2015; 76:1-29. [DOI: 10.1016/j.cogpsych.2014.11.001]
36
Yarritu I, Matute H, Luque D. The dark side of cognitive illusions: when an illusory belief interferes with the acquisition of evidence-based knowledge. Br J Psychol 2015; 106:597-608. [PMID: 25641547 PMCID: PMC5024046 DOI: 10.1111/bjop.12119]
Abstract
Cognitive illusions are often associated with mental health and well‐being. However, they are not without risk. This research shows they can interfere with the acquisition of evidence‐based knowledge. During the first phase of the experiment, one group of participants was induced to develop a strong illusion that a placebo medicine was effective to treat a fictitious disease, whereas another group was induced to develop a weak illusion. Then, in Phase 2, both groups observed fictitious patients who always took the bogus treatment simultaneously with a second treatment which was effective. Our results showed that the group who developed the strong illusion about the effectiveness of the bogus treatment during Phase 1 had more difficulties in learning during Phase 2 that the added treatment was effective.
Affiliation(s)
- Ion Yarritu, Department of Experimental Psychology, Deusto University, Bilbao, Spain
- Helena Matute, Department of Experimental Psychology, Deusto University, Bilbao, Spain
- David Luque, Biomedical Research Institute (IBIMA), University of Malaga, Spain; School of Psychology, UNSW, Sydney, Australia
37
Beller S, Bender A. How contrast situations affect the assignment of causality in symmetric physical settings. Front Psychol 2015; 5:1497. [PMID: 25620937 PMCID: PMC4287057 DOI: 10.3389/fpsyg.2014.01497]
Abstract
In determining the prime cause of a physical event, people often weight one of two entities in a symmetric physical relation as more important for bringing about the causal effect than the other. In a broad survey (Bender and Beller, 2011), we documented such weighting effects for different kinds of physical events and found that their direction and strength depended on a variety of factors. Here, we focus on one of those: adding a contrast situation that, while being formally irrelevant, foregrounds one of the factors and thus frames the task in a specific way. In two experiments, we generalize and validate our previous findings by using different stimulus material (in Experiment 1), by applying a different response format to elicit causal assignments, an analog rating scale instead of a forced-choice decision (in Experiment 2), and by eliciting explanations for the physical events in question (in both experiments). The results generally confirm the contrast effects for both response formats; however, the effects were more pronounced with the forced-choice format than with the rating format. People tended to refer to the given contrast in their explanations, which validates our manipulation. Finally, people's causal assignments are reflected in the type of explanation given, in that contrast and property explanations were associated with biased causal assignments, whereas relational explanations were associated with unbiased assignments. In the discussion, we pick up the normative question of whether these contrast effects constitute a bias in causal reasoning.
Affiliation(s)
- Sieghard Beller
- Department of Psychosocial Science, Faculty of Psychology, University of Bergen, Bergen, Norway
- Andrea Bender
- Department of Psychosocial Science, Faculty of Psychology, University of Bergen, Bergen, Norway
|
38
|
Khemlani SS, Barbey AK, Johnson-Laird PN. Causal reasoning with mental models. Front Hum Neurosci 2014; 8:849. [PMID: 25389398 PMCID: PMC4211462 DOI: 10.3389/fnhum.2014.00849] [Citation(s) in RCA: 28] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/28/2014] [Accepted: 10/03/2014] [Indexed: 11/29/2022] Open
Abstract
This paper outlines the model-based theory of causal reasoning. It postulates that the core meanings of causal assertions are deterministic and refer to temporally-ordered sets of possibilities: A causes B to occur means that given A, B occurs, whereas A enables B to occur means that given A, it is possible for B to occur. The paper shows how mental models represent such assertions, and how these models underlie deductive, inductive, and abductive reasoning yielding explanations. It reviews evidence both to corroborate the theory and to account for phenomena sometimes taken to be incompatible with it. Finally, it reviews neuroscience evidence indicating that mental models for causal inference are implemented within lateral prefrontal cortex.
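The two core meanings this abstract contrasts can be made concrete with a small sketch. The possibility sets below follow the mental-model theory's published semantics for "causes" and "enables"; the Python encoding itself is an editorial illustration, not the authors' implementation.

```python
# Possibility sets of (A, B) truth-value pairs, following the model theory:
# "A causes B" rules out A occurring without B;
# "A enables B" rules out B occurring without A.
CAUSES  = {(True, True), (False, True), (False, False)}
ENABLES = {(True, True), (True, False), (False, False)}

def b_values_given_a(models):
    """Truth values B can take across the models in which A holds."""
    return {b for a, b in models if a}

# Deduction: "A causes B" plus the fact A yields B necessarily...
assert b_values_given_a(CAUSES) == {True}
# ...whereas "A enables B" plus A leaves B merely possible.
assert b_values_given_a(ENABLES) == {True, False}
```

This mirrors the abstract's gloss: given A, "causes" makes B occur, while "enables" only makes B possible.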
Affiliation(s)
- Sangeet S Khemlani
- Navy Center for Applied Research in Artificial Intelligence, Naval Research Laboratory, Washington, DC, USA
- Aron K Barbey
- Beckman Institute for Advanced Science and Technology, University of Illinois at Urbana-Champaign, Urbana, IL, USA
- Philip N Johnson-Laird
- Department of Psychology, Princeton University, Princeton, NJ, USA; Department of Psychology, New York University, New York, NY, USA
|
39
|
Vadillo MA, Ortega-Castro N, Barberia I, Baker AG. Two heads are better than one, but how much? Evidence that people's use of causal integration rules does not always conform to normative standards. Exp Psychol 2014; 61:356-67. [PMID: 24614872 PMCID: PMC4207133 DOI: 10.1027/1618-3169/a000255] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/23/2022]
Abstract
Many theories of causal learning and causal induction differ in their assumptions about how people combine the causal impact of several causes presented in compound. Some theories propose that when several causes are present, their joint causal impact is equal to the linear sum of the individual impact of each cause. However, some recent theories propose that the causal impact of several causes needs to be combined by means of a noisy-OR integration rule. In other words, the probability of the effect given several causes would be equal to the sum of the probability of the effect given each cause in isolation minus the overlap between those probabilities. In the present series of experiments, participants were given information about the causal impact of several causes and then they were asked what compounds of those causes they would prefer to use if they wanted to produce the effect. The results of these experiments suggest that participants actually use a variety of strategies, including not only the linear and the noisy-OR integration rules, but also averaging the impact of several causes.
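The three integration rules this abstract reports (linear sum, noisy-OR, averaging) are easy to contrast numerically; the probabilities below are illustrative choices, not values from the study.

```python
def linear_sum(p1, p2):
    # Linear rule: joint impact is the plain sum of individual impacts.
    return p1 + p2

def noisy_or(p1, p2):
    # Noisy-OR rule: the sum minus the overlap, so the result stays in [0, 1].
    return p1 + p2 - p1 * p2

def average(p1, p2):
    # Averaging, the third strategy the abstract reports participants using.
    return (p1 + p2) / 2

p1, p2 = 0.6, 0.5
print(round(linear_sum(p1, p2), 2))  # 1.1  -- exceeds 1, not a valid probability
print(round(noisy_or(p1, p2), 2))    # 0.8
print(round(average(p1, p2), 2))     # 0.55
```

The normative contrast is visible immediately: only the noisy-OR rule keeps the combined value a well-formed probability.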
|
40
|
Affiliation(s)
- Arne Dietrich
- Department of Psychology, American University of Beirut, Beirut, Lebanon
|
41
|
Goedert KM, Grimm LR, Markman AB, Spellman BA. Priming interdependence affects processing of context information in causal inference--but not how you might think. Acta Psychol (Amst) 2014; 146:41-50. [PMID: 24374491 DOI: 10.1016/j.actpsy.2013.11.006] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/22/2012] [Revised: 08/30/2013] [Accepted: 11/23/2013] [Indexed: 11/26/2022] Open
Abstract
Cultural mindset is related to performance on a variety of cognitive tasks. In particular, studies of both chronic and situationally-primed mindsets show that individuals with a relatively interdependent mindset (i.e., an emphasis on relationships and connections among individuals) are more sensitive to background contextual information than individuals with a more independent mindset. Two experiments tested whether priming cultural mindset would affect sensitivity to background causes in a contingency learning and causal inference task. Participants were primed (either independent or interdependent), and then saw complete contingency information on each of 12 trials for two cover stories in Experiment 1 (hiking causing skin rashes, severed brakes causing wrecked cars) and two additional cover stories in Experiment 2 (school deadlines causing stress, fertilizers causing plant growth). We expected that, relative to independent-primed participants, interdependent-primed participants would give more weight to the explicitly-presented data indicative of hidden alternative background causes, but they did not do so. In Experiment 1, interdependents gave less weight to the data indicative of hidden background causes for the car accident cover story and showed a decreased sensitivity to the contingencies for that story. In Experiment 2, interdependents placed less weight on the observable data for cover stories that supported more extra-experimental causes, while independents' sensitivity did not vary with these extra-experimental causes. Thus, interdependents were more sensitive to background causes not explicitly presented in the experiment, but this sensitivity hurt rather than improved their acquisition of the explicitly-presented contingency information.
|
42
|
Blanco F, Barberia I, Matute H. The lack of side effects of an ineffective treatment facilitates the development of a belief in its effectiveness. PLoS One 2014; 9:e84084. [PMID: 24416194 PMCID: PMC3885525 DOI: 10.1371/journal.pone.0084084] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 08/02/2013] [Accepted: 11/10/2013] [Indexed: 11/18/2022] Open
Abstract
Some alternative medicines enjoy widespread use, and in certain situations are preferred over conventional, validated treatments in spite of the fact that they fail to prove effective when tested scientifically. We propose that the causal illusion, a basic cognitive bias, underlies the belief in the effectiveness of bogus treatments. Therefore, the variables that modulate the former might affect the latter. For example, it is well known that the illusion is boosted when a potential cause occurs with high probability. In this study, we examined the effect of this variable in a fictitious medical scenario. First, we showed that people used a fictitious medicine (i.e., a potential cause of remission) more often when they thought it caused no side effects. Second, the more often they used the medicine, the more likely they were to develop an illusory belief in its effectiveness, despite the fact that it was actually useless. This behavior may parallel actual pseudomedicine usage: because a treatment is thought to be harmless, it is used with high frequency, and its effectiveness in treating diseases with a high rate of spontaneous relief is therefore overestimated. This study helps shed light on the motivations spurring the widespread preference for pseudomedicines over scientific medicines. This is a valuable first step toward the development of scientifically validated strategies to counteract the impact of pseudomedicine on society.
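The mechanism this abstract describes, frequent use of a harmless but useless treatment inflating cue-outcome coincidences while the true contingency stays at zero, can be illustrated with hypothetical 2x2 tables (the counts below are invented for illustration, not the study's data).

```python
def delta_p(a, b, c, d):
    """Contingency from a 2x2 table: a, b = treatment taken (remission yes/no);
    c, d = treatment not taken (remission yes/no)."""
    return a / (a + b) - c / (c + d)

# A useless medicine against a disease with 80% spontaneous remission.
low_use  = (16, 4, 64, 16)   # 20 of 100 patients take it
high_use = (64, 16, 16, 4)   # 80 of 100 patients take it

print(delta_p(*low_use))   # 0.0 -- the true contingency is zero either way
print(delta_p(*high_use))  # 0.0
# ...but high usage yields 64 use-remission coincidences versus 16, and
# coincidence (cell a) trials are the ones people tend to weight most heavily.
```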
Affiliation(s)
- Fernando Blanco
- Universidad de Deusto, Departamento de Fundamentos y Métodos de la Psicología, Bilbao, Spain
- Itxaso Barberia
- Universidad de Deusto, Departamento de Fundamentos y Métodos de la Psicología, Bilbao, Spain
- Helena Matute
- Universidad de Deusto, Departamento de Fundamentos y Métodos de la Psicología, Bilbao, Spain
|
43
|
White PA. Causal Judgement from Information about Outcome Magnitude. Q J Exp Psychol (Hove) 2013; 66:2268-88. [DOI: 10.1080/17470218.2013.777750] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/27/2022]
Abstract
Information was presented in which a candidate cause was either present or absent, and the outcome variable (number of spots on a patient's skin) could take any of four nonzero values. It was found that cause-absent information carried greater weight than cause-present information. This is contrary to the usual finding for contingency information about binary outcome variables. Judgement was influenced more by extreme values of the outcome variable, and larger outcome values tended to have more effect on judgements than smaller outcome values. The hypothesis that participants compute linear correlation is disconfirmed by these results. Instead, the results show that participants focus disproportionate attention on some kinds of events and neglect others.
|
44
|
White PA. Singular Clues to Causality and Their Use in Human Causal Judgment. Cogn Sci 2013; 38:38-75. [DOI: 10.1111/cogs.12075] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 06/21/2012] [Revised: 10/10/2012] [Accepted: 02/13/2013] [Indexed: 11/30/2022]
|
45
|
Barberia I, Blanco F, Cubillas CP, Matute H. Implementation and assessment of an intervention to debias adolescents against causal illusions. PLoS One 2013; 8:e71303. [PMID: 23967189 PMCID: PMC3743900 DOI: 10.1371/journal.pone.0071303] [Citation(s) in RCA: 25] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/22/2013] [Accepted: 06/25/2013] [Indexed: 11/28/2022] Open
Abstract
Researchers have warned that causal illusions are at the root of many superstitious beliefs and fuel many people’s faith in pseudoscience, thus generating significant suffering in modern society. Therefore, it is critical that we understand the mechanisms by which these illusions develop and persist. A vast amount of research in psychology has investigated these mechanisms, but little work has been done on the extent to which it is possible to debias individuals against causal illusions. We present an intervention in which a sample of adolescents was introduced to the concept of experimental control, focusing on the need to consider the base rate of the outcome variable in order to determine if a causal relationship exists. The effectiveness of the intervention was measured using a standard contingency learning task that involved fake medicines that typically produce causal illusions. Half of the participants performed the contingency learning task before participating in the educational intervention (the control group), and the other half performed the task after they had completed the intervention (the experimental group). The participants in the experimental group made more realistic causal judgments than did those in the control group, which served as a baseline. To the best of our knowledge, this is the first evidence-based educational intervention that could be easily implemented to reduce causal illusions and the many problems associated with them, such as superstitions and belief in pseudoscience.
Affiliation(s)
- Itxaso Barberia
- Departamento de Fundamentos y Métodos de la Psicología, Universidad de Deusto, Bilbao, Spain
|
46
|
Barberia I, Baetu I, Sansa J, Baker AG. When is a cause the "same"? Incoherent generalization across contexts. Q J Exp Psychol (Hove) 2013; 67:281-303. [PMID: 23777427 DOI: 10.1080/17470218.2013.804102] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/26/2022]
Abstract
A theory or model of cause such as Cheng's power (p) allows people to predict the effectiveness of a cause in a different causal context from the one in which they observed its actions. Liljeholm and Cheng demonstrated that people could detect differences in the effectiveness of the cause when causal power varied across contexts of different outcome base rates, but that they did not detect similar changes when only the cause-outcome contingency, ∆p, but not power, varied. However, their procedure allowed participants to simplify the causal scenarios and consider only a subsample of observations with a base rate of zero. This confounds p, ∆p, and the probability of an outcome (O) given a cause (C), P(O|C). Furthermore, the contingencies that they used confounded p and P(O|C) in the overall sample. Following the work of Liljeholm and Cheng, we examined whether causal induction in a wider range of situations follows the principles suggested by Cheng. Experiments 1a and 1b compared the procedure used by Liljeholm and Cheng with one that did not allow the sample of observations to be simplified. Experiments 2a and 2b compared the same two procedures using contingencies that controlled for P(O|C). The results indicated that, if the possibility of converting all contexts to a zero base rate situation was avoided, people were sensitive to changes in P(O|C), p, and ∆p when each of these was varied. This is inconsistent with Liljeholm and Cheng's conclusion that people detect only changes in p. These results question the idea that people naturally extract the metric or model of cause from their observation of stochastic events and then, reasonably exclusively, use this theory of a causal mechanism, or for that matter any simple normative theory, to generalize their experience to alternative contexts.
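The two quantities this abstract juxtaposes, the contingency ∆p and Cheng's causal power, can be sketched directly. The power formula below is the standard one from Cheng's power PC theory, p = ∆p / (1 − P(O|¬C)); the probabilities are illustrative, chosen to show why a zero base rate makes power and ∆p coincide (the confound the authors discuss).

```python
def delta_p(p_o_c, p_o_notc):
    # Contingency: P(O|C) - P(O|~C).
    return p_o_c - p_o_notc

def causal_power(p_o_c, p_o_notc):
    # Cheng's generative power: delta-p rescaled by the headroom the
    # background leaves for the cause to act.
    return (p_o_c - p_o_notc) / (1 - p_o_notc)

# Constant delta-p (0.3) across two outcome base rates:
print(causal_power(0.3, 0.0))            # 0.3  -- zero base rate: power equals delta-p
print(round(causal_power(0.9, 0.6), 2))  # 0.75 -- same delta-p, higher power
```

Simplifying every context to a base rate of zero, as the criticized procedure allowed, therefore makes p, ∆p, and P(O|C) move together and impossible to tease apart.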
|
47
|
Catena A, Perales JC, Megías A, Cándido A, Jara E, Maldonado A. The brain network of expectancy and uncertainty processing. PLoS One 2012; 7:e40252. [PMID: 22768344 PMCID: PMC3388057 DOI: 10.1371/journal.pone.0040252] [Citation(s) in RCA: 39] [Impact Index Per Article: 3.3] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/05/2011] [Accepted: 06/03/2012] [Indexed: 11/19/2022] Open
Abstract
BACKGROUND The Stimulus Preceding Negativity (SPN) is a non-motor slow cortical potential elicited by temporally predictable stimuli, customarily interpreted as a physiological index of expectancy. Its presumed origin is the brain activity responsible for generating the anticipatory mental representation of an expected upcoming event. The SPN manifests itself as a slow cortical potential with negative slope, growing in amplitude as the stimulus approaches. The uncertainty hypothesis we present here postulates that the SPN is linked to control-related areas in the prefrontal cortex that become more active before the occurrence of an upcoming outcome perceived as uncertain. METHODS/FINDINGS We tested the uncertainty hypothesis by using a repeated measures design in a Human Contingency Learning task with two levels of uncertainty. In the high uncertainty condition, the outcome is unpredictable. In the mid uncertainty condition, participants can learn to predict the outcome in 75% of the trials. Our experiment shows that the Stimulus Preceding Negativity is larger for probabilistically unpredictable (uncertain) outcomes than for probabilistically predictable ones. sLORETA estimations of the brain activity preceding the outcome suggest that prefrontal and parietal areas can be involved in its generation. Activation at prefrontal sites (anterior cingulate and dorsolateral prefrontal cortex) seems to be related to the degree of uncertainty. Activation in posterior parietal areas, however, does not correlate with uncertainty. CONCLUSIONS/SIGNIFICANCE We suggest that the Stimulus Preceding Negativity reflects the attempt to predict the outcome when posterior brain areas fail to generate a stable expectancy. Uncertainty is thus conceptualized not just as the absence of learned expectancy, but as a state with its own psychological and physiological reality.
Affiliation(s)
- Andrés Catena
- Departamento de Psicología Experimental, Universidad de Granada, Granada, Spain
|
48
|
Kuhn D. The development of causal reasoning. WILEY INTERDISCIPLINARY REVIEWS. COGNITIVE SCIENCE 2012; 3:327-335. [PMID: 26301465 DOI: 10.1002/wcs.1160] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.6] [Reference Citation Analysis] [Abstract] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
Abstract
How do inference rules for causal learning themselves change developmentally? A model of the development of causal reasoning must address this question, as well as specify the inference rules. Here, the evidence for developmental changes in processes of causal reasoning is reviewed, with the distinction made between diagnostic causal inference and causal prediction. Also addressed is the paradox of a causal reasoning literature that highlights the competencies of young children and the proneness to error among adults.
Affiliation(s)
- Deanna Kuhn
- Department of Human Development, Teachers College, Columbia University, New York, NY, USA
|
49
|
What the Bayesian framework has contributed to understanding cognition: Causal learning as a case study. Behav Brain Sci 2011. [DOI: 10.1017/s0140525x1100032x] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
The field of causal learning and reasoning (largely overlooked in the target article) provides an illuminating case study of how the modern Bayesian framework has deepened theoretical understanding, resolved long-standing controversies, and guided development of new and more principled algorithmic models. This progress was guided in large part by the systematic formulation and empirical comparison of multiple alternative Bayesian models.
|
50
|
Blanco F, Matute H, Vadillo MA. Making the Uncontrollable Seem Controllable: the Role of Action in the Illusion of Control. Q J Exp Psychol (Hove) 2011; 64:1290-304. [DOI: 10.1080/17470218.2011.552727] [Citation(s) in RCA: 40] [Impact Index Per Article: 3.1] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/18/2022]
Abstract
It is well known that certain variables can bias judgements about the perceived contingency between an action and an outcome, making them depart from the normative predictions. For instance, previous studies have shown that the activity level or probability of responding, P(R), is a crucial variable that can affect these judgements in objectively noncontingent situations. A possible account for the P(R) effect is based on the differential exposure to actual contingencies during the training phase, which is in turn presumably produced by individual differences in participants' P(R). The current two experiments replicate the P(R) effect in a free-response paradigm, and show that participants' judgements are better predicted by P(R) than by the actual contingency to which they expose themselves. Moreover, both experiments converge with previous empirical data, showing a persistent bias that does not vanish as training proceeds. These findings contrast with the preasymptotic and transitory effect predicted by several theoretical models.
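A minimal simulation of the noncontingent free-response setting described above (parameter values are assumptions for illustration, not the study's) shows how a high P(R) piles up response-outcome coincidences even though the programmed contingency is zero:

```python
import random

def experienced_contingency(p_respond, p_outcome=0.7, n_trials=500):
    """Simulate a noncontingent free-response task: the outcome occurs
    with the same probability whether or not a response is made."""
    a = b = c = d = 0  # a: R&O, b: R&noO, c: noR&O, d: noR&noO
    for _ in range(n_trials):
        r = random.random() < p_respond
        o = random.random() < p_outcome  # independent of responding
        if r and o:
            a += 1
        elif r:
            b += 1
        elif o:
            c += 1
        else:
            d += 1
    dp = a / (a + b) - c / (c + d)
    return dp, a

random.seed(0)
for p_r in (0.2, 0.8):
    dp, coincidences = experienced_contingency(p_r)
    print(f"P(R)={p_r}: experienced delta-p ~ {dp:+.2f}, R-O coincidences = {coincidences}")
```

The experienced ∆p hovers around zero for both activity levels, yet the frequent responder accumulates roughly four times as many response-outcome coincidences, the raw material of the illusion of control.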
Affiliation(s)
- Fernando Blanco
- Department of Psychology, University of Leuven, Leuven, Belgium
- Departamento de Fundamentos y Métodos de la Psicología, University of Deusto, Bilbao, Spain
- Helena Matute
- Departamento de Fundamentos y Métodos de la Psicología, University of Deusto, Bilbao, Spain
- Miguel A. Vadillo
- Departamento de Fundamentos y Métodos de la Psicología, University of Deusto, Bilbao, Spain
|