Information Uptake: Understanding the Impact of New Numerical Information on Quantitative Judgments and Real-World Knowledge (Open Access)
- Other title
judgment under uncertainty
seeding the knowledge base
- Degree grantor
University of Alberta
- Author or creator
Schweickart, Oliver K.
- Supervisor and department
Brown, Norman R.
- Examining committee member and department
Brown, Norman R. (Department of Psychology, University of Alberta)
Mou, Weimin (Department of Psychology, University of Alberta)
Häubl, Gerald (School of Business, University of Alberta)
Dixon, Peter (Department of Psychology, University of Alberta)
Friedman, Alinda (Department of Psychology, University of Alberta)
Blank, Hartmut (Department of Psychology, University of Portsmouth)
Department of Psychology
- Degree level
Doctor of Philosophy
People are constantly exposed to numerical information in their physical and social environments (e.g., food calories, product prices, etc.). Therefore, an important question is when, how, and why this information impacts people’s beliefs about the world and their judgments and decisions. Unfortunately, the research literature on this topic provides very different, and at times contradictory, conceptualizations of how quantitative information interacts with real-world knowledge and how it impacts human judgment. For example, the well-known anchoring and hindsight bias effects observed in quantitative judgment tasks are often portrayed as prime examples of how external information can trigger unconscious and automatic mental processes that inevitably influence people’s judgments and beliefs. In contrast, studies on numerical advice-taking highlight people’s conservatism when it comes to judgment revision, showing that rejection of new numerical information is quite common, and that people often fail to take advantage of a (generally superior) averaging strategy. Finally, the seeding literature paints a more positive picture, demonstrating that people have a reasonably good ability to draw inductive generalizations from samples of real-world quantitative information. In this thesis, I propose a unified framework for understanding the above-mentioned phenomena—seeding, advice-taking, anchoring, and hindsight bias—in the context of numerical judgment. This framework asserts (a) that the management of numerical information is generally based on controlled processes, (b) that people use different response modes (e.g., rejection, adoption, etc.) when they interact with numerical information, (c) that this latter assertion necessitates an analysis at the response-mode level to understand the relevant phenomena, and (d) that seeds, advice, and anchors can be conceptualized as numerical information varying along a source credibility continuum.
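The response modes mentioned above (rejection, adoption, averaging) are commonly operationalized in the advice-taking literature via the weight-of-advice (WOA) measure. As a rough illustration only — the abstract does not specify the thesis's exact coding scheme, and the 0.05 tolerance below is an assumed value for the sketch:

```python
def weight_of_advice(initial, advice, final):
    """WOA: fraction of the distance from the initial estimate to the
    advice that the final (revised) estimate covers.
    0 -> rejection (kept own estimate), 1 -> adoption (took the advice),
    values near 0.5 -> averaging/compromise."""
    if advice == initial:
        raise ValueError("WOA is undefined when advice equals the initial estimate")
    return (final - initial) / (advice - initial)

def response_mode(woa, tol=0.05):
    """Classify a single revision into a response mode (illustrative
    thresholds; tol is an assumed cutoff, not taken from the thesis)."""
    if abs(woa) <= tol:
        return "rejection"
    if abs(woa - 1.0) <= tol:
        return "adoption"
    return "compromise"

# Example: initial population estimate 40 (million), advice 80, revised to 60.
woa = weight_of_advice(40, 80, 60)
print(woa, response_mode(woa))  # 0.5 compromise
```

On this view, an aggregate credibility effect "driven by the adoption rate" means that higher source credibility shifts more individual trials into the adoption category rather than uniformly raising every WOA value.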
The central assumptions of this framework were tested in four experiments. In Experiments 1–3, a hybrid advice-taking/seeding paradigm was used in which participants first generated population estimates for a set of countries, were then exposed to numerical information for a subset of these countries, and finally provided a second set of estimates for all countries. These experiments revealed (a) that information utilization varied as a function of the source credibility of the information, and that the aggregate source credibility effect was driven by the adoption rate; (b) that, irrespective of the source credibility level, the provided numerical information elicited transfer; however, the presence of transfer was contingent on prior information utilization; and (c) that informational context—defined as the numerical information made accessible to the decision maker during the target judgment—affected the rejection rate; specifically, the accessibility of one’s prior estimate increased the likelihood of rejection. The primary objective of Experiment 4 was to test competing predictions between models of hindsight bias that link the effect to automatic processes underlying knowledge revision and the current framework of controlled information processing. This experiment employed a hybrid advice-taking/hindsight bias paradigm in which participants first answered a heterogeneous set of estimation questions and were then given the opportunity to revise their judgments in response to numerical advice. Finally, participants had to recall their initial estimates. Counter to the predictions of automatic accounts, the results of this study indicated that the presence of hindsight bias depended on the prior utilization of advice: if advice was rejected, no hindsight bias emerged in the subsequent recall task.
In sum, the results of these experiments are consistent with the proposed framework that highlights the role of controlled processes in quantitative judgments under uncertainty. I end by discussing implications of these findings for understanding numerical judgment in real-world knowledge domains.
- This thesis is made available by the University of Alberta Libraries with permission of the copyright owner solely for the purpose of private, scholarly or scientific research. This thesis, or any portion thereof, may not otherwise be copied or reproduced without the written consent of the copyright owner, except to the extent permitted by Canadian copyright law.
File format: pdf (PDF/A)
Mime type: application/pdf
File size: 3273126 bytes
Last modified: 2016-11-08 11:16:14-07:00
Original checksum: 6ea9989424adaba44a4596e9cbefe5f9