Section outline

  • Welcome to this first edition of the DOMINOES course, held at Rey Juan Carlos University in Madrid, Spain.

    To begin with, we will provide some useful practical information for the course, which runs this week from September 11 to 15.

    Next, we will carry out an initial test to learn how much you already know about the content we will cover during the week. At the end of the course, we will administer a similar test to measure your learning.

    The tests are anonymous because the goal is to find out whether the group's overall knowledge has improved, so you can participate with confidence.

    • Initial Test - 1 [Previous knowledge] Page
      Not available unless: you belong to the group Students
  • In this first section of the course, we examine current trends in the information environment, the development of mainstream and social media, and how they affect the way citizens access information and the types of information they are exposed to. We also examine how narratives can be used in disinformation and propaganda campaigns, paying particular attention to the increasing prevalence of conspiracy theories. Another important topic is the novel approach of using intelligence in strategic communication to set the right framework for understanding current social developments.


    • S1.1. Conflict and its manifestation in the information environment


      1. Conflict and its manifestation in the information environment

      2. Hybrid warfare/threats

      3. Cognitive and information warfare



    • S1.2. Legitimate and illegitimate use of information and persuasion in the information environment

      1. Disinformation
      2. Foreign information manipulation and interference (FIMI)
      3. Cyber information operations



    • S1.3. Case studies of hostile narratives and conspiracy theories by authoritarian state and non-state actors

      1. Case studies of hostile narratives and conspiracy theories by authoritarian state and non-state actors

    • S1.4. Evaluation Module 1

  • In this second section, we will focus on aggravating factors for the spread of disinformation: individual and group factors, the role of influencers and pseudo-analysts, societal factors, and technological factors. Understanding how these factors interact helps explain how disinformation spreads and provides a framework for analysing the best approaches to counter or limit its impact and effectiveness.

    NOTE: Room 105, Departmental Building


    • S2.1. Fact or Opinion? The Exploitation of Errors of Judgement and Group Vulnerability in the Digital Era

      1. Facts, opinions, and objectivity

      2. Mechanisms used in the construction of fake news

      3. Exploitation of vulnerabilities and errors of judgment



    • S2.2. Political and social factors: polarization and decline in trust in expertise and authority

      1. Democracy and trust

      2. Types of trust

      3. Subverting trust: contesting facts, distrusting authority, social polarisation

      4. Rebuilding trust



    • S2.3. Technological developments and their impact on opinion formation

      1. The role of social media

      2. Artificial intelligence and synthetic content

      3. Strategies to combat hostile influence



    • S2.4. Evaluation Module 2

  • The third section looks at the best-known and most widely used methods of curbing the spread of disinformation. First, it examines the discursive, argumentative and narrative mechanisms that make disinformation attractive to audiences, and second, it explores the advantages and possible disadvantages of critical thinking, media literacy, debunking, fact-checking and prebunking as the most commonly used and recommended means of combating disinformation.


    • S3.1. Transnational and regulatory responses to disinformation / FIMI and resilience building

      1. Transnational responses to disinformation/FIMI

      2. Regulatory responses to disinformation/FIMI

      3. Resilience building



    • S3.2. Critical thinking, media and digital literacies

      1. Critical thinking

      2. Media and digital literacies



    • S3.3. Fact-checking, argument-checking, debunking and pre-bunking

      1. Fact-checking

      2. Argument-checking

      3. Debunking and pre-bunking



    • S3.4. Evaluation Module 3

  • This section presents an overview of existing and emerging technological solutions that could be employed to counter disinformation. It introduces technological solutions for spotting, flagging and removing disinformation, as well as serious-game solutions designed to prepare and train citizens to recognise it and thus limit its negative impact. The final part explores the limitations of technology in identifying disinformation attempts.


    • S4.1. Strategic, anticipatory, and current analysis of disinformation and information-led hostile influencing

      1. Strategic analysis of disinformation and information-led hostile influencing

      2. Anticipatory analysis of disinformation and information-led hostile influencing

      3. Current analysis of disinformation and information-led hostile influencing


    • S4.2. Tech-driven solutions and emerging technologies to counter disinformation

      1. Tech-driven solutions and emerging technologies to counter disinformation

    • S4.3. Planning, design, and implementation of counter-narratives and positive content

      1. Planning of counter-narratives and positive content

      2. Design of counter-narratives and positive content

      3. Implementation of counter-narratives and positive content



    • S4.4. Evaluation Module 4

  • This section employs active learning methods so that students apply the competences acquired in previous sections. Participants will take part in several role-playing simulation exercises with different objectives, including producing a debunking piece, creating digital content using generative AI, and facing different decision-making scenarios.

    Drawing on the individual and in-group experiences enabled by the simulations and games, participants will critically reflect on concepts, practices, technologies and countermeasures to disinformation.


    • S5.1. Debunk simulation

      1. Debunk simulation: role play

    • S5.2. & S5.3. Pathway to victory: deepfakes against deepfakes

      1. Deepfakes against deepfakes: producing synthetic content exercise
    • Final Test - 2 [Acquired knowledge] Page