1. Buhler AG, Brannon B, Cataldo TT, Faniel IM, Connaway LS, Valenza JK, Elrod R, Cyr C. How real is real enough? Participant feedback on a behavioral simulation used for information-seeking behavior research. Journal of Librarianship and Information Science 2022. DOI: 10.1177/09610006211067799
Abstract
A challenge of studying information-seeking behavior in open web systems is the unpredictability of those systems. One solution to counteract this issue is employing a simulation to ensure experimental control. However, concerns arise over the realism of such an environment. This paper assesses the realism of a behavioral simulation used to study the evaluation behavior of 175 students from fourth grade through graduate school. We assess realism through the examination of targeted participant feedback about what would have made the simulated environment and tasks more realistic to these participants. Based on this feedback, we reflect on decisions made in designing the simulation and offer recommendations for future studies interested in incorporating behavioral simulation in their research design. We find that a thoughtfully designed simulation can elicit naturalistic behavior when the controlled environment is designed to be realistic in meaningful ways. Because the simulation does not have to perfectly match reality to elicit these behaviors, designing a simulation that is real enough is an effective method to study information-seeking behavior.
2. Authentic versus synthetic: An investigation of the influences of study settings and task configurations on search behaviors. J Assoc Inf Sci Technol 2021. DOI: 10.1002/asi.24554
3. O’Brien HL, Toms EG. Examining the generalizability of the User Engagement Scale (UES) in exploratory search. Inf Process Manag 2013. DOI: 10.1016/j.ipm.2012.08.005
4.
Abstract
Typically, studies of information retrieval and interactive information retrieval concentrate on the identification of relevant items. In this study, rather than stopping at finding relevant items, we considered how people use a search system to complete a broader work task. To conduct the study, we created 12 tasks that required multiple queries and document views to find enough information to complete the task. A total of 381 people completed three tasks each in a laboratory setting using the wikiSearch system embedded in WiIRE. Results showed that two-thirds of the time spent on a task came after a set of relevant documents sufficient for task completion had been found, and that this time was mainly spent reviewing documents that had already been retrieved. The findings suggest that an open-source information retrieval system, such as Lucene, was adequate for this task. The ultimate challenge, however, will be building useful systems that aid the user in extracting, interpreting and analysing information to achieve work task completion.
5. Kelly D, Fu X, Shah C. Effects of position and number of relevant documents retrieved on users' evaluations of system performance. ACM Trans Inf Syst 2010. DOI: 10.1145/1740592.1740597
Abstract
Information retrieval research has demonstrated that system performance does not always correlate positively with user performance, and that users often assign positive evaluation scores to search systems even when they are unable to complete tasks successfully. This research investigated the relationship between objective measures of system performance and users' perceptions of that performance. In this study, subjects evaluated the performance of four search systems whose search results were manipulated systematically to produce different orderings and numbers of relevant documents. Three laboratory studies were conducted with a total of eighty-one subjects. The first two studies investigated the effect of the order of five relevant and five nonrelevant documents in a search results list containing ten results on subjects' evaluations. The third study investigated the effect of varying the number of relevant documents in a search results list containing ten results on subjects' evaluations. Results demonstrate linear relationships between subjects' evaluations and the position of relevant documents in a search results list and the total number of relevant documents retrieved. Of the two, number of relevant documents retrieved was a stronger predictor of subjects' evaluation ratings and resulted in subjects using a greater range of evaluation scores.
Affiliation(s)
- Diane Kelly, University of North Carolina at Chapel Hill, Chapel Hill, NC
- Xin Fu, University of North Carolina at Chapel Hill, Chapel Hill, NC
- Chirag Shah, University of North Carolina at Chapel Hill, Chapel Hill, NC

6. Blandford A, Adams A, Attfield S, Buchanan G, Gow J, Makri S, Rimmer J, Warwick C. The PRET A Rapporter framework: Evaluating digital libraries from the perspective of information work. Inf Process Manag 2008. DOI: 10.1016/j.ipm.2007.01.021
7. Kelly D, Harper DJ, Landau B. Questionnaire mode effects in interactive information retrieval experiments. Inf Process Manag 2008. DOI: 10.1016/j.ipm.2007.02.007
8. Jansen BJ. Search log analysis: What it is, what's been done, how to do it. Library & Information Science Research 2006. DOI: 10.1016/j.lisr.2006.06.005