
Assembling fragments: a two-layer knowledge management tool to explore algorithms and their social functions

At IST23, Rayya Roumanos and Olivier Le Deuff presented the reasoning behind Graph AlgoJ, the knowledge management graph currently being developed within the framework of the AlgoJ project.

Thanks to Sheila Webber for live-blogging the presentation.

Abstract

Today, algorithms are under intense scrutiny. While developers focus on their efficiency and reliability, legal experts study their compliance with the law, social scientists explore their widespread influence on people’s lives, and philosophers examine their ethics. What about journalists? To date, very few newsrooms have taken up the challenge of investigating algorithms (Diakopoulos, 2018), despite the growing importance of these technological actants (Latour, 2005) in everyday life. One of the obstacles preventing them from addressing this critical issue is the lack of a comprehensive understanding of how algorithms operate and impact lives.

This talk presents a tool currently being developed within the framework of a regional research project entitled AlgoJ, whose objective is to provide journalists with the necessary resources to address this intricate matter. The tool in question is a knowledge management graph structured to offer multiple levels of reading and navigation regarding algorithms. Its purpose is to enhance algorithm literacy among journalists, empowering them to set up effective investigations into the disruptive influence of these technical pieces of engineering. It is built around a lexicon of linked notes, or cards, that define and elaborate on various terms using hypertext techniques. As a first step, all definitions within the prototype are produced by researchers based on scientific references as well as other documents (e.g. news publications). The next step will be to enable journalists to make use of the tool and enrich it with their own experiential knowledge. One of the distinctive features of this tool, called Graph AlgoJ, is its capacity to morph into a “hyperdocument” (Le Deuff, 2021; Otlet, 1934) that encompasses both established information and knowledge that is currently being developed.

The focus of the talk will be to provide a comprehensive look at the reasoning behind the tool. It will first present the sociotechnical perspective used to establish its analytical framework, then discuss its functions. The premise is that algorithms are not merely technical objects, but “heterogeneous and diffuse sociotechnical systems” (Seaver, 2017) that embody technological and normative dimensions. They are the product of a social context and are, as such, highly permeable to power dynamics stemming from cultural, political, and economic grounds. Furthermore, as omnipresent infrastructures, they have become “invisible” and have gained the ability to structure and ritualise people’s lives without their knowledge.

In order to investigate their “social power” (Beer, 2017), journalists first need to acquire an acute understanding of their nature and function within a specific social context. Graph AlgoJ intends to offer a convincing response to this challenge through a structure that merges two interconnected paths. The first adopts a conventional knowledge management approach, providing a set of definitions to bridge knowledge gaps in both general understanding and specialized expertise. The second focuses on connecting multiple elements of the graph in order to highlight relationships between actors, actants, and actions.
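To make this two-layer logic concrete, here is a minimal sketch under hypothetical assumptions: the cards, relation types and identifiers below are invented for illustration and do not reproduce the actual Graph AlgoJ data model. It simply shows how a layer of definition cards and a layer of typed relations between them can be kept distinct yet interlinked.

```python
from dataclasses import dataclass, field

@dataclass
class Card:
    """A lexicon entry: one definition card in the first layer."""
    id: str
    title: str
    definition: str
    sources: list = field(default_factory=list)

@dataclass
class Relation:
    """A typed link in the second layer, connecting two cards."""
    source: str   # id of the originating card
    target: str   # id of the targeted card
    kind: str     # e.g. "deploys", "regulates", "affects"

# Layer 1: definitions produced by researchers from scientific references.
cards = {
    "recommendation-algorithm": Card(
        "recommendation-algorithm", "Recommendation algorithm",
        "A procedure that ranks and selects content for a given user.",
        sources=["Seaver, 2017"]),
    "news-platform": Card(
        "news-platform", "News platform",
        "A digital intermediary distributing journalistic content."),
}

# Layer 2: relations highlighting links between actors, actants and actions.
relations = [
    Relation("news-platform", "recommendation-algorithm", "deploys"),
    Relation("recommendation-algorithm", "news-platform", "structures the visibility of"),
]

def neighbours(card_id: str) -> list[tuple[str, str]]:
    """Return (relation type, neighbouring card) pairs for one card."""
    return [(r.kind, r.target) for r in relations if r.source == card_id]

if __name__ == "__main__":
    for kind, target in neighbours("news-platform"):
        print(f"news-platform --{kind}--> {target}")
```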

We argue that this two-layer approach can alleviate the pervasive imagery of the “black box” (Pasquale, 2015) as well as other highly effective “fictions” (Gillespie, 2014) presenting algorithms as objective, reliable, and unavoidable. Graph AlgoJ is designed to serve as a hyperdocument that combines academic information with empirical knowledge, with the goal of transcending mere informative content and becoming an “instrument” (Otlet, 1934) consisting of “a complex assemblage of fragments, with meaning derived from the various pathways established through active reading” (Balpe, 1990: 6). By providing this function, it will help journalists overcome the technical, cognitive and cultural obstacles preventing them from applying critical and empirical attention to algorithms.

References 

  • Balpe, J.-P. (1990). Hyperdocuments, hypertextes, hypermédias. Paris: Eyrolles.
  • Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13.
  • Diakopoulos, N. (2018). The algorithms beat. In L. Bounegru & J. Gray (Eds.), The Data Journalism Handbook.
  • Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge, MA: MIT Press.
  • Latour, B. (2005). Reassembling the Social: An Introduction to Actor-Network-Theory. Oxford: Oxford University Press.
  • Le Deuff, O. (2021). Hyperdocumentation. London: Wiley-ISTE.
  • Otlet, P. (1934). Traité de documentation. Le livre sur le livre. Bruxelles: Palais Mondial.
  • Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge, MA: Harvard University Press.
  • Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12.

Lecture by Alexandre Coutant on news audiences

On 11 June 2023, Professor Alexandre Coutant, from the Department of Social and Public Communication at the Université du Québec à Montréal (UQAM), gave a lecture entitled “Ce que les publics de l’information permettent de penser en termes d’éducation aux médias et à l’information” (what news audiences allow us to think about in terms of media and information literacy), as part of the AlgoJ project.

Summary of the talk

Media and information literacy (MIL) has historically linked the need to develop informational literacies with the ability to fully exercise one’s citizenship. The digitization of informational practices has multiplied both the dimensions in which such an education is needed and the contexts in which this citizenship can be exercised. Yet critical research on MIL regularly deplores pedagogical initiatives that are overly focused on instrumental, individualizing dimensions and poorly connected to questions of civic participation, a tendency reinforced by the media panic surrounding fake news and conspiracy theories. Within this debate, one observation remains highly problematic: the scarcity of in-depth surveys allowing us to understand news audiences. This talk details what can be gained from a better understanding of these audiences, drawing on a survey conducted among Quebec news audiences.

Challenges of algorithmic literacy: making the invisible visible

Talk presented at the seminar on the design of recommendation algorithms for cultural goods, organized by Samuel Gantier and Fanny Bougenies as part of the GdT Design des algorithmes de recommandation de biens culturels (working group on the design of recommendation algorithms for cultural goods).

The aim of this talk was to present the definitional, scientific, conceptual and practical challenges of algorithmic literacy. This literacy is a form of response to the opacity of algorithmic processing, which influences, and sometimes biases, decision chains that concern individuals. We set out to retrace the context of this literacy, its origins and its proximity to other literacies, as well as its relations with the social science disciplines that study algorithms and their effects. We showed the value of mechanological perspectives, returning in particular to the work of Gilbert Simondon and Bernhard Rieder, which opens avenues towards better documentation of algorithmic processes for accountability purposes and to facilitate reverse engineering and journalistic investigation. Finally, we presented a tool developed within the AlgoJ project that aims to make algorithmic processes visible and intelligible: the AlgoJ graph, based on the COSMA software (https://cosma.graphlab.fr/).
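For readers unfamiliar with Cosma, the sketch below illustrates the kind of record the software works with: plain Markdown files with YAML front matter and wiki-style [[links]], which Cosma assembles into a navigable graph. The record content, identifiers and link shown here are invented for illustration and are not taken from the AlgoJ graph.

```python
from pathlib import Path

# Hypothetical record in the Markdown + YAML front-matter style used by Cosma.
# Title, id (timestamp-style identifier), type and the [[link]] are made up.
RECORD = """---
title: Recommendation algorithm
id: 20230611120000
type: concept
tags:
  - algorithm
---

A procedure that ranks and filters content for a given user.
Deployed by [[20230611120100]] (news platforms) to structure
the visibility of journalistic content.
"""

def write_record(directory: Path, filename: str, content: str) -> Path:
    """Save one record alongside the other cards of the graph."""
    directory.mkdir(parents=True, exist_ok=True)
    path = directory / filename
    path.write_text(content, encoding="utf-8")
    return path

if __name__ == "__main__":
    print(write_record(Path("records"), "recommendation-algorithm.md", RECORD))
```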

Definitional and scientific challenges of algorithmic literacy: between mechanology and documentary reverse engineering

An article by Olivier Le Deuff and Rayya Roumanos on the challenges of algorithmic literacy has been published in the journal Tic & Société, vol. 15, no. 1-2.

Abstract

The aim of this article is to pave the way for a new form of literacy, called algorithm literacy, that enables users to identify and understand how algorithms work and to what extent they impact their daily lives. The goal of algorithm literacy is to help users acquire new skills to counter the recurrent problems of opacity and bias in the algorithmic processing of data and information. We first examine the context and origins of this literacy and its proximity to other literacies. We then analyse the possible bridges that can be built with the many fields of social science that study algorithms and their effects. We demonstrate the importance of associating this literacy with the mechanological perspective developed by Gilbert Simondon and, more recently, by Bernhard Rieder. Finally, we look at the correlations this literacy might establish with the documentation of algorithmic processes, situated between reverse engineering and the explanation of the processing carried out by digital ecosystems.

The article is available online: https://journals.openedition.org/ticetsociete/7105

Regaining control over algorithms: a hypervisual tool to get from awareness to knowledge

Rayya Roumanos and Olivier Le Deuff will present a talk on ways to get from algorithm awareness to algorithm knowledge at the upcoming ICA 2022 Preconference (26 May 2022).

The talk will tackle the issue of algorithm awareness amongst journalists at a time when news has become fundamentally contingent upon the two colliding processes of platformization (Nieborg & Poell, 2018) and datafication (Kennedy, Poell & van Dijck, 2015; Arsenault, 2017). Focusing on the continuous interference of algorithms in media operations and news circulation (Gillespie, 2013; Diakopoulos, 2014), we explore journalists’ ability to make sense of their changing ecosystem and their aptitude to regain control over the production and flow of information. We also call attention to ongoing research aimed at designing a hypervisual tool with the potential to reach this goal.

Algorithms are highly efficient technical artifacts used by platforms as well as newsrooms to gather, process and distribute information. Although omnipresent, they are “invisible infrastructure” (Gran, Booth & Bucher, 2021) with significant social power, especially when they are employed to curate content (Rieder & Sire, 2014), shape choices (Yeung, 2017), induce feelings (Bucher, 2012) or impose agendas (Gillespie, 2017). In this communication, we look at them from the standpoint of social science, as objects embedded in social processes and modelled on cultural interpretations. Following David Beer’s hypothesis, we argue that their role is determined by the decision-making parts of their code (Beer, 2013) and that their social function depends on social power dynamics (Beer, 2017).

In order for journalists to discern the performative power of an algorithm with respect to its social impact and/or its effectiveness in shaping news content (Joux & Bassoni, 2018) and news flows (Wallace, 2018), they need to overcome internal (technical proficiency) and external (code opacity) obstacles to gain a clear insight into the workings and social ecology of that algorithm. In other words, they need to go beyond basic awareness of its existence and acquire constructive knowledge regarding its social functions.

One of the main challenges facing them in this quest revolves around the issue of “algorithm visibility”, considering that the ubiquity of these lines of code renders them invisible, the same way media or metadata became invisible when they developed into infrastructures that people live in rather than with (Deuze, 2011; Pomerantz, 2015).

Taking this specific problem into consideration, we looked at ways to make algorithms visible again by focusing on their context and function. In this communication, we will present a knowledge management system called Graph AlgoJ, designed as part of ongoing research on the role and impact of algorithms in the journalism field. Its purpose is to provide users with interlinked data that can be explored, manipulated and amplified through a hypervisual interface.

Screenshot of the Graph AlgoJ interface in COSMA

Following an encyclopaedic approach, the knowledge graph gives access to information regarding algorithms (their origin, mode of operation, field of application and social function) as well as the actors involved in the news environment (news media and digital intermediaries), devices, scientific research and journalistic investigations. It also establishes direct and indirect links between entries, each presented in the form of a card, in order to help users conduct their own inquiries into the impact of algorithms. The ultimate goal of this tool is to become a “hyperdocument”: a document that is also an instrument of action and knowledge (Otlet, 1934; Le Deuff, 2021).
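As an illustration of what exploring direct and indirect links can look like, the following sketch uses a hypothetical excerpt of such a card graph (the entries and links are invented, not drawn from Graph AlgoJ) and surfaces a chain of cards between a newsroom and a ranking mechanism with a simple breadth-first search.

```python
from collections import deque

# Hypothetical excerpt of the card graph: each key is a card, each value lists
# the cards it links to directly. Entries and links are invented examples.
links = {
    "Sud Ouest": ["Facebook News Feed"],
    "Facebook News Feed": ["EdgeRank", "engagement metrics"],
    "EdgeRank": ["content ranking"],
    "engagement metrics": [],
    "content ranking": [],
}

def indirect_path(start: str, goal: str) -> list[str] | None:
    """Breadth-first search for a chain of cards linking start to goal."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

if __name__ == "__main__":
    # Surfaces an indirect chain from a newsroom to a ranking mechanism:
    # ['Sud Ouest', 'Facebook News Feed', 'EdgeRank', 'content ranking']
    print(indirect_path("Sud Ouest", "content ranking"))
```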

Rayya Roumanos & Olivier Le Deuff


References     

  • Arsenault, Amelia (2017). The datafication of media: Big data and the media industries. International Journal of Media & Cultural Politics, 13(1-2), 7–24. DOI: 10.1386/macp.13.1-2.7_1
  • Beer, David (2013). Popular Culture and New Media: The Politics of Circulation. Basingstoke: Palgrave Macmillan.
  • Beer, David (2017). The social power of algorithms. Information, Communication & Society, 20(1), 1–13. DOI: 10.1080/1369118X.2016.1216147
  • Bucher, Taina (2012). Want to be on top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. DOI: 10.1177/1461444812440159
  • Deuze, Mark (2011). Media life. Media, Culture & Society, 33(1), 137–148. DOI: 10.1177/0163443710386518
  • Diakopoulos, Nicholas (2014). Algorithmic Accountability Reporting: On the Investigation of Black Boxes. Tow Center for Digital Journalism, Columbia University.
  • Gillespie, Tarleton (2013). The relevance of algorithms. In T. Gillespie, P. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). MIT Press.
  • Gillespie, Tarleton (2017). Algorithmically recognizable: Santorum’s Google problem, and Google’s Santorum problem. Information, Communication & Society, 20(1), 63–80. DOI: 10.1080/1369118X.2016.1199721
  • Gran, Anne-Britt, Booth, Peter & Bucher, Taina (2021). To be or not to be algorithm aware: a question of a new digital divide? Information, Communication & Society, 24(12), 1779–1796. DOI: 10.1080/1369118X.2020.1736124
  • Joux, A. & Bassoni, M. (2018). Le journalisme saisi par les Big Data ? Résistances épistémologiques, ruptures économiques et adaptations professionnelles. Les Enjeux de l’information et de la communication, 19/2(2), 125–134. DOI: 10.3917/enic.025.0125
  • Kennedy, Helen, Poell, Thomas & van Dijck, José (2015). Data and agency. Big Data & Society, 1–7. DOI: 10.1177/2053951715621569
  • Le Deuff, Olivier (2021). Hyperdocumentation. Hoboken: ISTE Ltd. DOI: 10.1002/9781119855590
  • Nieborg, David & Poell, Thomas (2018). The platformization of cultural production: Theorizing the contingent cultural commodity. New Media & Society, 20. DOI: 10.1177/1461444818769694
  • Otlet, Paul (1934). Traité de documentation. Le livre sur le livre. Bruxelles: Palais Mondial.
  • Pomerantz, Jeffrey (2015). Metadata. Cambridge, MA: MIT Press (Essential Knowledge series).
  • Rieder, Bernhard & Sire, Guillaume (2014). Conflicts of interest and incentives to bias: A microeconomic critique of Google’s tangled position on the Web. New Media & Society, 16(2), 195–211. DOI: 10.1177/1461444813481195
  • Wallace, Julian (2018). Modelling contemporary gatekeeping: The rise of individuals, algorithms and platforms in digital news dissemination. Digital Journalism, 6(3), 274–293. DOI: 10.1080/21670811.2017.1343648
  • Yeung, Karen (2017). ‘Hypernudge’: Big Data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136. DOI: 10.1080/1369118X.2016.1186713

Launching AlgoJ: a research project on algorithms in the news ecosystem

We are pleased to announce the launch of AlgoJ, a research project funded by the Nouvelle-Aquitaine Region (2021–2023).

AlgoJ is conducted in partnership with Sud Ouest and Curieux!. It aims to study the influence of algorithms in the field of journalism. It is led by Rayya Roumanos and Olivier Le Deuff and relies on a multidisciplinary team composed of Arnaud Schwartz, Florian Tixier, Ugo Verdi and Mohamed Mosbah.

With AlgoJ, we intend to explore the structuring effect algorithms have on the digital news ecosystem. First, we will consider their impact on news gathering, writing and publishing. We will also measure the level of awareness journalists have of algorithms and look at ways to foster effective algorithm literacy. Finally, we plan to build an innovative methodology that helps journalists understand and investigate algorithms.

Over a period of two years, the project will explore these issues at a regional level, with the help of several partners:

  • Sud Ouest newspaper, which will provide an immersive setting in which to explore new ways of investigating algorithms.
  • Curieux!, the digital platform of Cap Sciences Bordeaux, where we will conduct participatory observations and in-depth interviews with journalists and social media editors who have integrated algorithms into their daily routines.
  • Bordeaux Press Club and the IUT/IJBA Alumni Association, whose involvement in the project will enable us to target a large network of journalists employed in print, radio and television media.
  • CLEMI and Reporters Without Borders, who are very active in raising awareness of the power of algorithms in the information ecosystem.

The results of this research project will be presented on this website as well as during national and international scientific events.