Misinformation Risks Growing Without Proper Debriefs

To study the effects of misinformation on attitudes, some social science experiments expose participants to false, misleading, or dangerous information. Most Institutional Review Boards require that such studies conclude with a debriefing session in which participants are told that the information presented to them was false. Katherine Clayton and colleagues sought to determine whether these debriefs can "undo" the effects of exposure to misinformation. The authors first replicated existing misinformation research on beliefs about COVID-19 vaccines and election security, varying whether the questions gauging belief in the false statements were asked before or after the debrief. They found that erroneous beliefs about vaccines and elections persisted even after participants were debriefed. In a second study, the authors presented participants with nonpolitical falsehoods, such as the claim that toilets flush in opposite directions in the Northern and Southern Hemispheres. In these less charged contexts, debriefs were more effective. In a final series of studies, the authors tested an enhanced debrief, in which participants saw an extensive fact-check of the false information and were asked to actively acknowledge that they had been exposed to falsehoods during the study. The enhanced debrief improved belief accuracy by more than two points on a seven-point scale, whereas the original debrief failed to move the needle. According to the authors, current social science practice may harm participants by failing to reverse the effects of exposure to misinformation, and scholars should consider adopting enhanced debriefing practices to protect participants.