De- and recoding algorithmic systems: The case of fact checkers and fact checked users

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

De- and recoding algorithmic systems: The case of fact checkers and fact checked users. / Schjøtt, Anna; Bengtsson, Mette.

In: Convergence: The International Journal of Research into New Media Technologies, 2024.

Research output: Contribution to journal › Journal article › Research › peer-review

Harvard

Schjøtt, A & Bengtsson, M 2024, 'De- and recoding algorithmic systems: The case of fact checkers and fact checked users', Convergence: The International Journal of Research into New Media Technologies.

APA

Schjøtt, A., & Bengtsson, M. (Accepted/In press). De- and recoding algorithmic systems: The case of fact checkers and fact checked users. Convergence: The International Journal of Research into New Media Technologies.

Vancouver

Schjøtt A, Bengtsson M. De- and recoding algorithmic systems: The case of fact checkers and fact checked users. Convergence: The International Journal of Research into New Media Technologies. 2024.

Author

Schjøtt, Anna ; Bengtsson, Mette. / De- and recoding algorithmic systems: The case of fact checkers and fact checked users. In: Convergence: The International Journal of Research into New Media Technologies. 2024.

Bibtex

@article{9ab56af5f0674ce8aa47625061925e26,
title = "De- and recoding algorithmic systems: The case of fact checkers and fact checked users",
abstract = "With the recent development of debunking on social media as a dominant agenda, fact checkers have increasingly used machine learning (ML) to identify, verify and correct factual claims, as ML promises the scaling of fact checking practices. However, it also places a new actor in between the fact checkers and the fact checked users. In this paper, we conducted a contrasted analysis of how fact checkers and fact checked users understand, evaluate and act towards the algorithmic systems and the data flows in Meta{\textquoteright}s Third-Party Fact-Checking Program. For both professional users and end users, the algorithmic system is experienced as a black box in which they have limited insight, and their sense-making practices happen based on the data and metrics that are made visible to them. In the paper, we draw on and expanded theory on decoding algorithms by not only exploring how the two user groups engage in decoding the algorithmic system, but also actively engage in forms of recoding by attempting to adapt or modify the algorithmic system to better fit within their cultural and social context, which is characterised by both varying epistemic cultures and societal positions. While the fact checkers from their hegemonic (sometimes negotiated) position understand the program as a (sometimes stupid) tool and primarily engage in passive acts of recoding, fact checked users, from their oppositional position, understand the program as an unpredictable censoring machine and engage primarily in more active acts of recoding. Based on the analysis, we end the paper with a discussion in which we argued for understanding data reflexivity as highly relational and processual.",
keywords = "Faculty of Humanities, algorithmic systems, act of decoding and recoding, sense-making practices, data reflexivity, journalism, fact checking, Meta{\textquoteright}s third-party fact checking program, debunking",
author = "Anna Schj{\o}tt and Mette Bengtsson",
year = "2024",
language = "English",
journal = "Convergence: The International Journal of Research into New Media Technologies",
issn = "1354-8565",
publisher = "Sage Journals",

}

RIS

TY - JOUR

T1 - De- and recoding algorithmic systems

T2 - The case of fact checkers and fact checked users

AU - Schjøtt, Anna

AU - Bengtsson, Mette

PY - 2024

Y1 - 2024

N2 - With the recent development of debunking on social media as a dominant agenda, fact checkers have increasingly used machine learning (ML) to identify, verify and correct factual claims, as ML promises the scaling of fact checking practices. However, it also places a new actor in between the fact checkers and the fact checked users. In this paper, we conduct a contrastive analysis of how fact checkers and fact checked users understand, evaluate and act towards the algorithmic systems and the data flows in Meta’s Third-Party Fact-Checking Program. For both professional users and end users, the algorithmic system is experienced as a black box into which they have limited insight, and their sense-making practices happen based on the data and metrics that are made visible to them. In the paper, we draw on and expand theory on decoding algorithms by exploring not only how the two user groups engage in decoding the algorithmic system, but also how they actively engage in forms of recoding by attempting to adapt or modify the algorithmic system to better fit within their cultural and social context, which is characterised by both varying epistemic cultures and societal positions. While the fact checkers, from their hegemonic (sometimes negotiated) position, understand the program as a (sometimes stupid) tool and primarily engage in passive acts of recoding, fact checked users, from their oppositional position, understand the program as an unpredictable censoring machine and engage primarily in more active acts of recoding. Based on the analysis, we end the paper with a discussion in which we argue for understanding data reflexivity as highly relational and processual.

AB - With the recent development of debunking on social media as a dominant agenda, fact checkers have increasingly used machine learning (ML) to identify, verify and correct factual claims, as ML promises the scaling of fact checking practices. However, it also places a new actor in between the fact checkers and the fact checked users. In this paper, we conduct a contrastive analysis of how fact checkers and fact checked users understand, evaluate and act towards the algorithmic systems and the data flows in Meta’s Third-Party Fact-Checking Program. For both professional users and end users, the algorithmic system is experienced as a black box into which they have limited insight, and their sense-making practices happen based on the data and metrics that are made visible to them. In the paper, we draw on and expand theory on decoding algorithms by exploring not only how the two user groups engage in decoding the algorithmic system, but also how they actively engage in forms of recoding by attempting to adapt or modify the algorithmic system to better fit within their cultural and social context, which is characterised by both varying epistemic cultures and societal positions. While the fact checkers, from their hegemonic (sometimes negotiated) position, understand the program as a (sometimes stupid) tool and primarily engage in passive acts of recoding, fact checked users, from their oppositional position, understand the program as an unpredictable censoring machine and engage primarily in more active acts of recoding. Based on the analysis, we end the paper with a discussion in which we argue for understanding data reflexivity as highly relational and processual.

KW - Faculty of Humanities

KW - algorithmic systems

KW - act of decoding and recoding

KW - sense-making practices

KW - data reflexivity

KW - journalism

KW - fact checking

KW - Meta’s third-party fact checking program

KW - debunking

M3 - Journal article

JO - Convergence: The International Journal of Research into New Media Technologies

JF - Convergence: The International Journal of Research into New Media Technologies

SN - 1354-8565

ER -
