Climate misinformation
Key terms
Algorithmic systems – refer to assemblages of multiple algorithms (i.e., finite sequences of well-defined, computer-implementable instructions, typically used to solve a class of problems or to perform a computation), databases, and interfaces that interact, as well as the social practices that people develop to interact with them (Wagner et al., 2021; Ada Lovelace Institute, 2021).
Artificial intelligence – we use the terms “artificial intelligence” and “AI” to refer to technologies that employ machine learning, including deep learning methods. We write “AI and associated technologies” in cases where AI is an integrated part of a technology, such as a video camera that integrates AI-based analysis to classify facial expressions, or a social media platform that deploys sentiment analysis to track and maximize user engagement.
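To make the sentiment analysis mentioned above concrete, here is a minimal lexicon-based sketch in Python. The word lists and function name are our own illustrative assumptions; real platforms use trained machine-learning models, not hand-written lexicons.

```python
# Minimal lexicon-based sentiment scorer (illustrative only; actual
# platforms use far more sophisticated machine-learning models).
POSITIVE = {"great", "love", "hope", "win"}
NEGATIVE = {"hoax", "lie", "fear", "scam"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: balance of positive vs. negative words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = [(w in POSITIVE) - (w in NEGATIVE) for w in words]
    matched = [h for h in hits if h != 0]
    return sum(matched) / len(matched) if matched else 0.0

print(sentiment_score("Climate action is a hoax and a lie"))  # → -1.0
```

A platform tracking engagement could aggregate such scores per post to surface emotionally charged content, which is one mechanism by which sentiment-aware ranking and misinformation can interact.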
Digitalization – is here used to describe the adoption and use of digital technologies, including their societal, organizational, and individual impacts (Legner et al., 2017). Digital technologies for, e.g., food production (‘precision farming’), electricity (‘smart grids’), and housing (‘smart homes’) do not necessarily use AI for their operations.
Emotions – we use the definition provided by D’Mello (2017) and view emotions as ‘conceptual entities that arise from brain-body-environment interactions’, including anger, happiness, sadness, fear, joy, and other affective states. Since we also focus on the collective aspects of emotions and their connection to nature, non-human species, and future generations, we also include empathy as an affective phenomenon of interest for this review. Definitions of empathy vary, but they share the view that empathy involves relating to the emotional state of another, with understanding and a personal emotional response, while still experiencing the self and the other as separate entities. Empathy can thus be defined as follows: ‘Empathy is to understand, feel, and share what someone else feels, with self-other differentiation’ (Håkansson et al., 2020).
Generative AI – refers to AI based on advanced neural networks with the ability to produce highly realistic synthetic text, images, video, and audio (including fictional stories, poems, and programming code) with little to no human intervention. The best-known examples are the Generative Pre-trained Transformer models (GPT-3, GPT-4) that underpin OpenAI’s ChatGPT (OpenAI, 2023). Note, however, that GPT is only one of many existing generative AI models (e.g., Gibney, 2022).
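Generative language models of this kind work autoregressively: each next token is sampled from a distribution conditioned on the tokens produced so far. The toy sketch below mimics that sampling loop with a tiny hand-written next-token table standing in for a trained neural network; the table and names are our own illustrative assumptions, not any real model.

```python
import random

# Tiny hand-written next-token table standing in for the learned
# conditional distribution P(next token | previous token) of a
# trained generative model. Purely illustrative.
NEXT_TOKENS = {
    "<start>": ["the"],
    "the":     ["climate", "future"],
    "climate": ["is", "will"],
    "future":  ["is"],
    "is":      ["changing", "uncertain"],
    "will":    ["change"],
}

def generate(seed="<start>", max_len=8):
    """Sample tokens one at a time until no continuation exists."""
    token, output = seed, []
    while len(output) < max_len and token in NEXT_TOKENS:
        token = random.choice(NEXT_TOKENS[token])
        output.append(token)
    return " ".join(output)

random.seed(0)
print(generate())  # e.g., "the climate is changing"
```

Real generative models replace the lookup table with a neural network over a vocabulary of tens of thousands of tokens, which is what lets them produce fluent synthetic text at scale.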
Recommender systems – refer to techniques that offer suggestions for information (e.g., other users, text, articles, videos, or social media posts) likely to be of interest to a particular user. Recommender systems are widely used in social media platforms, but also in other digital services such as music streaming, video services, and online shopping (Ricci et al., 2015). Such systems can deploy AI-based analysis to guide their recommendations (Engström and Strimling, 2020).
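As a concrete sketch of the “likely to be of interest” logic, the following toy user-based collaborative filtering recommends items that the most similar other user engaged with. The data, function names, and similarity measure (cosine) are our own illustrative assumptions, not the method of any cited system.

```python
import math

# Toy user–item interaction matrix: 1 = user engaged with the item.
# Rows are users; columns are items (e.g., posts or videos).
ratings = {
    "alice": {"post1": 1, "post2": 1, "post3": 0},
    "bob":   {"post1": 1, "post2": 1, "post3": 1},
    "carol": {"post1": 0, "post2": 0, "post3": 1},
}

def cosine(u, v):
    """Cosine similarity between two users' engagement vectors."""
    items = u.keys() & v.keys()
    dot = sum(u[i] * v[i] for i in items)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def recommend(user, ratings):
    """Suggest items the most similar other user engaged with, but this user has not."""
    others = [(cosine(ratings[user], ratings[o]), o) for o in ratings if o != user]
    _, nearest = max(others)
    return [i for i, r in ratings[nearest].items()
            if r and not ratings[user][i]]

print(recommend("alice", ratings))  # → ['post3']
```

The same neighbor-based logic, scaled up and combined with learned models, is what can steer similar users toward similar content, including misinformation.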
Social bots – are automated social media accounts with the ability to build social communication networks and create online content. Such bots can be used for beneficial or malicious purposes, ranging from news aggregation, user assistance, and entertainment to spam and sophisticated influence campaigns (Stieglitz et al., 2017).
References
Ada Lovelace Institute (2021). Technical methods for the regulatory inspection of algorithmic systems in social media platforms. Ada Lovelace Institute, UK.
Engström, E., & Strimling, P. (2020). Deep learning diffusion by infusion into preexisting technologies – implications for users and society at large. Technology in Society, 63, 101396. https://doi.org/10.1016/j.techsoc.2020.101396
Gibney, E. (2022). Open-source language AI challenges big tech’s models. Nature, 606(7916), 850–851.
Legner, C., Eymann, T., Hess, T., Matt, C., Böhmann, T., Drews, P., et al. (2017). Digitalization: Opportunity and challenge for the business and information systems engineering community. Business & Information Systems Engineering, 59(4), 301–308.
OpenAI (2023). https://openai.com/product/gpt-4
Ricci, F., Rokach, L., & Shapira, B. (2015). Recommender systems: Introduction and challenges. In Recommender Systems Handbook (pp. 1–34). Springer Science+Business Media.
Stieglitz, S., Brachten, F., Ross, B., & Jung, A.-K. (2017). Do Social Bots Dream of Electric Sheep? A Categorisation of Social Media Bot Accounts. arXiv Preprint. http://arxiv.org/abs/1710.04044
Wagner, C., Strohmaier, M., Olteanu, A., Kıcıman, E., Contractor, N., & Eliassi-Rad, T. (2021). Measuring algorithmically infused societies. Nature, 595(7866), 197–204. https://doi.org/10.1038/s41586-021-03666-1
Explore all chapters
Introduction: AI could create a perfect storm of climate misinformation
Chapter 1: What is climate mis- and disinformation, and why should we care?
Chapter 2: The neuroscience of false beliefs
Chapter 3: How algorithms diffuse and amplify misinformation
Chapter 4: Emotions and group dynamics around misinformation on social media
Chapter 5: When health and climate misinformation overlap
Chapter 6: A game changer for misinformation: The rise of generative AI
Chapter 7: Keeping up with a fast-moving digital environment