Welcome to engagement room 3!

There are many innovative responses to disinformation being rolled out across the world, from advanced artificial intelligence to community leaders working to dispel rumours. There is a need for digital solutions, policy and regulation, media and journalism development, citizen engagement and much more. Given the complexity of the issue, it seems that a holistic approach to disinformation is needed, one that involves many different groups and competencies.

This is the space to seek solutions. You are invited to explore these across three response areas:

  1. Reducing the creation and production of disinformation
  2. Stemming the dissemination of disinformation
  3. Building resilience of the target audiences or consumers of disinformation

As a reminder, disinformation is “false, manipulated or misleading content, created and spread unintentionally or intentionally, and which can cause potential harm to peace, human rights and sustainable development”.

Please answer any of the below questions (including the question numbers in your response).  Feel free to introduce yourself if you wish. We look forward to hearing from you.  

  1. What role should legislation and regulation play? How can regulatory responses be developed in a way that respects fundamental rights such as right to privacy, freedom of expression and right of access to information?
  2. What role should policy play at the national, regional and international level? How can these be harmonised?
  3. How can internet companies be effectively governed or regulated to ensure they act in the public interest?
  4. What are the digital and technological options for addressing this? Do you have examples of effective responses?
  5. What is the role of journalism? How can journalists, broadcasters and news editors be better equipped and supported to address this issue?
  6. Which stakeholders need to be engaged and which strategic partnerships should be considered?
  7. How can we build greater resilience to disinformation, especially among vulnerable or marginalised groups, such as through greater media and information literacy?


We commit to protect the identities of those who require it. To comment anonymously, please select "Comment anonymously" before you submit your contribution. Alternatively, send your contribution by email to niamh.hanafin@undp.org requesting that you remain anonymous.

Comments (23)

Daria Asmolova Moderator

Week One Summary

Thank you to all contributors this week. Here is the summary:

1) Beyond tech and regulatory solutions: an ethical pact between political parties in Uruguay helps prevent the spread of political misinformation.

2) What could be the strategies for fighting misinformation on WhatsApp?

3) Personal data privacy and protections vs. platforms' responsibility for the content they host: is there a role for regulation, or is self-regulation enough?

4) Uruguay is also exploring these questions: 

  • New gatekeepers, private censorship and freedom of expression.
  • Concentration, diversity and pluralism on the Internet.
  • Private regulation, the role of the State and democratic governance.

5) A study with learnings on mitigating strategies for misinformation in Africa should be available at the end of October.

6) More solutions:

  • Digital literacy for identifying misinformation;
  • Engaging influencers.

Caroline Hammarberg Moderator

Good morning and a warm welcome to this three-week UNDP-UNESCO consultation on how we can forge a path to a better information system spanning Effective Governance, Media, Internet and Peacebuilding Responses to Disinformation.

My name is Caroline Hammarberg, Coordinator for #CoronavirusFacts Projects at UNESCO’s Section for Freedom of Expression and Safety of Journalists and together with Daria Asmolova, Digital Transformation Analyst from UNDP, I will be your moderator during this week before handing over to colleagues.

We very much look forward to hearing your thoughts and solutions to the above list of questions and hope that we can extract some valuable recommendations together.

When commenting, please let us know which particular question(s) you are responding to. The floor is yours!

Warm regards,

Caroline & Daria

Ayushma Basnyat

Question 5:

In Nepal, UNDP and UNESCO have collaborated on a range of issues; in particular, the two organizations collaborated to mark World Press Freedom Day in 2019, which was themed “Media for Democracy and Peace: Journalism and Elections in the Age of the Internet.” Nepal’s Ministry of Communication and Information Technology, the Election Commission Nepal and the Federation of Nepali Journalists, together with the European Union, UNESCO Nepal and UNDP’s Electoral Support Project, organized a one-day national conference to celebrate the occasion.

The programme included several deliberations at the inaugural session as well as six thematic panel discussions, where many pertinent ideas on the nexus between disinformation and governance - elections in particular - were raised.

Another event that the two organizations collaborated on was in orienting young female journalists on political reporting in order to bridge the gender gap in electoral journalism. The event emphasized the importance of having a balanced gendered perspective in journalism. It also proved to be an effective platform to discuss solutions to common problems, for women to unite and work collectively to make progress in their professions and ensure a women-friendly professional environment.

In Nepal, approximately 70% of people have access to the Internet; 95 thousand have access to Facebook and 35 thousand to Twitter.

From both the events, the following recommendations ensued:

• The media should honour their professionalism and be unbiased in their reporting. They should abide by the Code of Conduct for the media.

• Journalists should be very careful about disinformation in social media and should not publish or broadcast news without proper verification.

• Social media is a tool that can both spread as well as fight disinformation.

• The Federation of Nepali Journalists should finalize social media guidelines for journalists and share it with other media houses as a reference.

• The Election Commission, Nepal should issue a social media code of conduct for the general public and political parties during the elections. Based on this recommendation, UNDP’s Electoral Support Project also supported the Commission to draft a social media policy to productively engage in social media platforms and dispel disinformation.

• All the media laws that will be brought up by the federal, provincial or local governments must honour the Constitution of Nepal.

• The laws and policies should be based on the international standard of press freedom in democratic countries.

• Media laws should be focused on making media self-regulatory rather than controlling them. Press freedom must be protected.

• Adapt the code of conduct and legislation accordingly, without curtailing fundamental freedoms.

• The Nepal Police should remain vigilant and act to ensure that trolling and hate speech are discouraged, especially against women, gender and sexual minorities, marginalized communities and activists.

• The state must protect and promote social media as a public accountability tool, and act upon any complaints received through social media.

• Increase ECN capacity to detect and respond to disinformation and other threats, both within the Election Commission in Nepal and through partnerships.

• Regularly review the situation in order to be prepared for emerging challenges.

• A platform for journalists to share experiences with other stakeholders can help build trust and also help disseminate factual information.

• Constitutional Commissions, such as the Election Commission in Nepal, should prepare a roster of media and journalists and update it regularly. This way, a list of credible media houses is recognized.

• Cyber-crime should be made punishable by law.

The specific recommendations for the Election Commission, Nepal on the issues to consider before using technology in elections are as follows:

• Attention should be paid to the impact of the use of new technology.

• When using new technology, the transparency of the system should be ensured.

• Adequate attention should be paid to minimizing the risks associated with new technology.

More Information:

The full report, complete with recommendations and visual assets, of the event can be accessed here: https://www.np.undp.org/content/nepal/en/home/presscenter/articles/2019/world-press-freedom-day-2019.html

Other events where UNDP and UNESCO have collaborated include:

Stefan Liller

Hi Caroline and Daria,

I would like to share a bit about our journey to address disinformation in Uruguay – firstly in connection with the parliamentary and presidential elections in 2019 and now in 2020 in the context of COVID and the regional and local elections.

Our work began in April 2019 when campaigning started in earnest for the parliamentary and presidential elections, scheduled for October 2019. We noted how an increased number of what at the time was referred to as fake news emerged around several of the candidates and parties.

We reached out to a number of stakeholders worried about this development – and joined forces with the Association of Uruguayan Press, UNESCO, the Astur Foundation and Fredrich Ebert Stiftung to promote the signature of an ethical pact between all the political parties to not engage in and discourage the spread of disinformation in the context of the elections.

The pact was signed in Parliament on 26 April 2019, with the participation of the whole political establishment in Uruguay – including the sitting Vice President, a former President, all the political parties and the main candidates. On the morning of the same day, we organized a closed dialogue at the UNDP office in Montevideo jointly with UNESCO, where we brought together representatives of all the political parties, the electoral court, civil society, media outlets and the internet platforms (Google, Facebook and Twitter) to share information and approaches on how these different groups and organizations work to address disinformation, and to explore how to work together further.

Following the signing of the pact and the dialogue there was a lot of attention in the media around the issue and several different initiatives appeared in the following months, including a fact checking service called verificado.uy. Jointly with UNESCO and the internet platforms we then organized capacity building workshops for media and central, regional and local governments on how to address disinformation. This work culminated with the elections in October 2019.

In March 2020, in the context of the unfolding COVID pandemic, we produced spots for social media with UNESCO on how to address disinformation in times of Coronavirus. The spots were subsequently translated into several languages by different UNDP and UNESCO offices, and highlighted in the context of the celebration of World Press Freedom Day on 3 May 2020.

In April, under the leadership of UNESCO, we joined them, WHO and the University of Texas to produce a Massive Open Online Course on the topic “Journalism in a pandemic: Covering COVID-19 now and in the future”, with a special focus on how to report during a pandemic and on the issue of disinformation. It was facilitated from 4 to 31 May, during which time more than 9,000 people/journalists from 162 countries participated in the course, which is now available online.

On 29 August 2020, in an official act with the congress of governors and the participation of the Association of Uruguayan Press, UNDP and UNESCO, the political parties in Uruguay reconfirmed their commitment to the ethical pact – this time in anticipation of the regional and local elections held on 27 September.

We are currently developing an agenda of work and dialogues in Uruguay on internet content moderation, freedom of expression and democratic governance, where we want to look at: 1. new gatekeepers, private censorship and freedom of expression; 2. concentration, diversity and pluralism on the internet; and 3. private regulation, the role of the State and democratic governance. We hope this can be a contribution to the discussion of the roles and responsibilities of the internet platforms and others with regard to content management on the Internet.



Daria Asmolova Moderator

Thank you for sharing your action plan, Stefan! The ethical pact signing is a brilliant idea which reminds us that non-tech solutions can have a big impact even in the digital space.  
And I would love to learn more about verificado.uy in terms of the usage, engagement and overall reception of the platform.
The agenda for the internet content moderation, freedom of expression and democratic governance is so on point - these are the questions asked in pretty much every part of the world, so please share your thinking as it progresses and feel free to reach out to [~92362] and other policy experts for more input.

Niamh Hanafin Moderator

[~85812] your approach contains many of the components that we "intuitively" feel should be used to address information pollution.  I was curious to know if you have a sense of impact of the efforts, either individually or collectively?  Were there any measurable changes as per your initial assumptions?

Larriza Thurler

Thanks, Stefan! How is WhatsApp used in Uruguay? Here in Brazil we have a lot of fake news spreading through WhatsApp. Media outlets fact-check it, but government supporters are very critical of the media and don't believe anything they say. It's a challenge to fight that.

Ruth Stewart

Hi everyone, 

So I'm leading a project in partnership with Africa Check which is exploring the evidence for mitigating strategies for misinformation (we're focussing on COVID-19 misinformation shared on social media, but there will be lessons for the wider community). We're just in the process of analysing the research evidence as part of a rapid review, and also analysing a set of interviews conducted with fact checkers across the continent to understand the strategies they use. Happy to share findings once available at the end of the month! (Sorry, it's too early to say much about what we've learnt so far).


Caroline Hammarberg Moderator

Thank you for sharing, Ruth, this is very interesting indeed. What type of factors are you looking at as part of the review? Do you think the findings will be available by 23 October, which is the last day of these consultations, and is there otherwise a website where you plan to publish them? We look forward to hearing more, and please don't hesitate to share indicative findings to inform the discussions in the meantime if there is enough to go on.

Katie Burnham

We at Farm Radio are also very interested in this! We've been relying on your lists of common myths and misconceptions for some of our work in the past few months!


1. There is a need to teach the methods and channels for identifying and reporting acts of disinformation in digital contexts.

Analyses of fake accounts, the origin of content and the impact generated should be easy to classify, so that civil society understands the severity of each case of disinformation.

4. Disinformation must be fought by competing with the speed and scale of dissemination on social media. Young people and digital influencers are important allies who can help spread information at large scale in a short time, as can strategically placed social media campaigns to compete with decentralised channels such as WhatsApp. In March we ran a pilot with Latin American influencers on COVID-19 disinformation: https://es.unesco.org/news/jovenes-influenciadores-desafian-desinformac…

Additionally, it is important to explore social media monitoring tools to more easily identify potentially structured strategies with political aims. These tools are normally used in commercial advertising campaigns.

5. Fact-checking organisations are a priority for responding to disinformation.

7. Short audiovisual literacy content that can easily be introduced on social media. Involve educational institutions and influencers, and make the negative effects of local disinformation cases visible.

Juan Pablo Miranda

1.- A first step is to settle the type of information pollution being discussed. For example, fake news is not the same type of information pollution as biased or exaggerated information. It is also important to establish different levels of responsibility. For example, a person who creates false information with the purpose of misinforming bears a different level of responsibility compared with a person who shares false information unknowingly. In addition, it is important to develop awareness policies in this regard, which, at least in Chile, have not existed in a massive and systematic way.

3.- It is important to develop a regulatory framework that, on the one hand, respects the personal information of users of social networks and, on the other, makes companies responsible for the content created and shared on their platforms. It is therefore important to check that platforms have mechanisms to identify information pollution. Likewise, it is important that companies take an active role in raising awareness of the possible adverse effects of social networks, and think about ways to mitigate the effects of the information bubbles formed by the algorithms that social network platforms use.

6.- It is important to involve the companies that own digital platforms, as well as public institutions, the media, civil society organizations and political parties.

Daria Asmolova Moderator

Great points! When we discuss information pollution, we always try to distinguish between misinformation, disinformation and malinformation, using the typology from the Council of Europe report:

Disinformation. Information that is false and deliberately created to harm a person, social group, organisation or country.

Misinformation. Information that is false, but not created with the intention of causing harm.

Mal-information. Information that is based on real facts, but manipulated to inflict harm on a person, organisation or country.

But maybe this should be reviewed as well?


And you raised an interesting point about personal privacy and platform accountability. We are seeing platforms self-moderating hate speech, for example, so is that the way to go, or should we still look at a regulatory framework?

Ruth Canagarajah

"How can we build greater resilience to disinformation, especially among vulnerable or marginalised groups, such as through greater media and information literacy?"

This question invites quite a few different, interesting avenues that could be explored. For instance, there's research showing that crowdsourcing insights online re: misinformation/disinformation has the potential to work, despite concerns about whether laypersons (i.e. non-experts) know how to accurately flag disinformation, misinformation, or heavily biased and politically oriented news. This study was run in 2019 in the hyperpartisan context of the States and would need to be replicated in other contexts to see if the findings hold.

Another interesting idea for building greater resilience to disinformation is "inoculation". Just as biological inoculation introduces an antigen/pathogen into a system to produce immunity, perhaps the same idea can be applied to misinformation. It would involve showing populations of interest the "hallmarks" of misinformation (e.g. being mindful of content source, emotive language, among others) to build psychological resistance to mis/disinformation. This approach has already been built by Cambridge University in the form of an online game, and is certainly worth exploring given the capacity to "gamify" resilience-building rather than relying on information provision alone.

Regardless of the approach, and especially given that the two studies mentioned derive from the US/UK, the need to test solutions on a small scale (and our assumptions about whether they will work) is absolutely vital before scaling them up.

Katie Burnham

In regards to questions 5-7, I wanted to share a bit of our experience at Farm Radio International. We are a Canadian NGO with a network of more than 1,000 radio stations across Africa. We provide these stations with information and training materials, and collaborate with many stations on radio campaigns particularly to support farmers and rural audiences.

Radio broadcasters and journalists are vital for getting good information to people, but they also need access to good information. COVID-19 was new, and fake news was almost as common as good information. Farm Radio shared information in a variety of ways, from print resources to an IVR system to a Facebook chatbot. But perhaps one of the more effective tools was our WhatsApp discussions with expert guests. We have a dozen WhatsApp groups bringing together more than 1,000 broadcasters as a community of practice. We invited public health experts and other experts (gender, agriculture, nutrition, etc.) to join our WhatsApp groups so that broadcasters could ask questions about the pandemic. This was an opportunity for broadcasters to fact-check their information with experts. Sometimes, journalists don’t have access to good sources of information or the right experts to fact-check what they are seeing shared on social media.

We also tried to provide our broadcasting partners with tools to identify fake news, and developed a Broadcaster how-to guide on this topic, which is available in English, French, Swahili, and Amharic. This speaks to Ruth Canagarajah’s point about “inoculation.” Journalists need to understand how to identify misinformation and how to fact-check it.  

To help broadcasters share good information with their audiences, we published two series of radio spots sharing key COVID-19 info and addressing common myths, such as the idea that alcohol, garlic, lemon, or antibiotics will prevent COVID-19. These are also available in English, French, Swahili, and Amharic, and as short messages, are easier to translate into local languages. This way, we are hoping to reach audiences who don’t speak English or French and can find it more difficult to access information in their language. Marginalised groups can have a challenge getting good information if it’s not available in their language.

Sonia Livingstone

Dear all - sorry for joining this discussion a bit late. It looks like some great solutions are already under consideration. I thought I'd add a little from a UK perspective, having recently chaired the LSE's Truth, Trust and Technology Commission, with a focus on dealing with mis/disinformation. Our report is at https://www.lse.ac.uk/media-and-communications/truth-trust-and-technolo… - beyond the analysis in the report (more appropriate for the problem statement group on this forum), our recommendations were as follows:

Establish an Independent Platform Agency

The UK and devolved governments should introduce a new levy on UK online platforms’ revenue, a proportion of which should be ring-fenced to fund a new Independent Platform Agency (IPA). The IPA should be structurally independent of Government but report to Parliament. Its purpose, initially, will not be direct regulation, but rather an ‘observatory and policy advice’ function that will establish a permanent institutional presence to encourage the various initiatives attempting to address problems of information reliability. The IPA should be established by legislation and have the following duties:

• Report on trends in news and information sharing according to a methodological framework subject to public consultation. This should include real data on the most shared and read stories, broken down by demographic group.

• Report on the effectiveness of self-regulation of the largest news-carrying social and search platforms. This should include reports on trust marks, credibility signalling, filtering and takedown.

• Mobilise and coordinate all relevant actors to ensure an inclusive and sustained programme in media literacy for both children and adults, and conduct evaluations of initiatives. The IPA should work with Ofcom to ensure sufficient evidence on the public’s critical news and information literacy.

• Report annually to Parliament on the performance of platforms’ self-regulation and the long-term needs for possible regulatory action.

• Provide reports on request to other agencies such as the Electoral Commission, Ofcom and the Information Commissioner’s Office, to support the performance of their duties, according to agreed criteria.

• Work closely with Ofcom and the Competition and Markets Authority to monitor the level of market dominance and the impact of platforms on media plurality and quality.

In order to fulfil these duties, the IPA will need the following powers:

• Powers to request data from all the major platforms (determined by a UK advertising revenue threshold) on the top most shared news and information stories, referrals, news-sharing trends and case studies of particular stories. The types of data should be determined on the basis of public consultation on monitoring methodologies and according to a shared template that applies across different companies above the threshold. These data will be held by the IPA within a tight confidentiality regime to protect privacy and commercial sensitivities.

• Powers to impose fines on platforms if they fail to provide data, and to request additional data when a court order is granted.

• The IPA’s independence from government should be established in law and protected financially and through security of tenure of its governing Board. The IPA should have close links with civil society and be transparent about how it interprets and performs its remit.

In addition to this new institution, we make further recommendations:

In the short term:

• News media should continue their important work to develop quality and innovative revenue and distribution models. They should also continue to work with civil society and the platforms on signalling the credibility of content.

• Platforms should develop annual plans and transparent, open mission statements on how they plan to tackle misinformation. They should work with civil society and news providers to develop trust marking.

• Government should mobilise an urgent, integrated new programme in media literacy. This could also be funded by the digital platform levy and should include digital media literacy training for politicians.

• Parliament should bring forward legislation to introduce a statutory code on political advertising, as recommended by the Information Commissioner.

In the medium term (3 years):

• Standard setting for social media platforms. Until now, standards have been set by platforms themselves. If this fails to improve the UK information environment, the IPA should set these in collaboration with civil society, Parliament and the public.

• The news industry should develop a News Innovation Centre to support journalism innovation and quality news, funded by the levy on digital platform revenue.

In the longer term (5 years):

• The IPA should provide a permanent forum for monitoring and review of platform behaviours, reporting to Parliament on an annual basis.

• The IPA should be asked to conduct annual reviews of ‘the state of disinformation’ that should include policy recommendations to a parliamentary committee. These should encompass positive interventions such as the funding of journalism.

Emanuele Sapienza Moderator

Thank you Sonia Livingstone for this comprehensive set of recommendations! It would be great to hear from consultation participants about the relevance and applicability of these measures to different contexts. Personally, I am quite intrigued by the institutional model envisaged for the IPA and was wondering: are you (or other consultation members) aware of similar mechanisms in other countries?

Louise Shaxson

Hello Sonia, thanks for the reference - very interesting reading indeed.  I have come over from Room 2 where we're discussing the drivers of disinformation, and it's clear that one of the drivers is something around the splintering of a sense of community and a sense of identity.  In a UK context, how would you see an organisation like the IPA working in tandem with (eg) citizens' juries or similar groups that foster debate and consultation across many different groups?  Is that something your group looked at?

Claire Pershan

EU DisinfoLab's recommendations are addressed to the EU level, but can certainly be applied more widely. We hope that the EU legislative packages and roadmaps on the table now (the DSA, EDAP, DEAP, and MAAP, to put in a few Brussels acronyms!) will all work in harmony and find strong support, particularly regarding decentralized funding for the variety of actors fighting disinformation. As all here will know, there is not one shape or size to the solution -- civil society and other stakeholders are extremely diverse and approach this problem from all angles, and all of these actors need sustained, flexible support. 

In particular, the upcoming European Democracy Action Plan, or EDAP, will be critical. Here are a number of points that we think this position should take into account:

  • The EU (but, again, this principle applies more broadly!) urgently needs an ambitious framework to fund a decentralised network of journalists, academics, fact-checkers and open-source investigators.
  • We also need better protection for disinformation researchers, guaranteeing their physical and psychological well-being online and offline with funds that account for these risks. Intimidation tactics in our sector are omnipresent (hack-and-leak operations, online harassment, etc.).
  • Multi-annual grants and specific financing for cybersecurity, to cover costly triple penetration tests and resilient IT systems for NGOs working on disinformation, would be one way to manage this threat.
  • We need to set standards on data access and enforce consistent definitions for platforms to respond to cases of information manipulation.
  • Last, we need to form best practices for political campaigning and a clear distinction between disinformation and strategic communications. Disinformation cannot become a regular political campaigning strategy. Political candidates should commit to respecting best practices for online campaigning, and funding should be conditioned on fair and transparent online campaigning.

Larriza Thurler

Hi, to explore further the case of media synergy in Brazil: after difficulties in accessing governmental data, a group of media outlets created a consortium to present the same information across multiple sources. The members are Estadão (newspaper), G1 (news portal), O Globo (newspaper), Extra (newspaper), Folha (newspaper) and UOL (news portal).

Jamie Hitchen

Hi all,

Some reflections below:

  1. I think, as others have said, empowering citizens to tackle misinformation and disinformation on platforms like WhatsApp is perhaps the most viable solution when it comes to thinking about how to create a better online environment. In Nigeria, one idea proposed is working with key influencers on the platforms (who also have a strong offline presence) and group admins, who can have a significant knock-on effect both on the conduct of others and in terms of stemming the flow of dis/misinformation. I think the danger of more formal regulation is that it can be used to clamp down on political freedom of speech, as interpretations of what counts as disinformation become politicised. Regulation used in this way would also exacerbate divisions, and this has been a concern raised in Nigeria. But I do think it's important to differentiate between disinformation and hate speech; the latter is often covered by existing legislation, which should be applied appropriately if an individual is using social media to call for violence against others or worse.
  2. I think regional or continent-wide bodies such as ECOWAS or the AU can play a role in ensuring that their protocols enshrine the right to freedom of speech online, including a commitment by states not to shut down the internet (which has been a feature across a number of African countries in recent years). They can also establish and enforce a data protection act that covers online users, in line with the 2010 Supplementary Act on Personal Data Protection within ECOWAS, for example. For all the possible risks posed by disinformation, these platforms also allow the spread of useful and empowering information, and provide space for discussions between groups or across borders that are hard to facilitate in other ways. For me, closing down the space isn't really the solution; in fact, keeping it open should be a key commitment.
  3. Tough question. Not sure I have a good answer, other than to say they should! In Africa, internet companies need to be more present, both physically and in terms of how they listen to and engage with user concerns and build platform solutions to address them. For example, will services like the labelling of the veracity of some US election tweets be available in Uganda next year? States need to do more in this regard, to ensure that companies like Facebook have the ability to effectively moderate content that users share on the platform (in the languages users choose to use). But this is very difficult given the size of these social media companies, and many are also involved in building internet cables across the continent.
  4. Nothing to note
  5. Journalists can work closely in networks with locally trained fact-checkers, so that they can respond as quickly as possible to false news stories and provide counters. These networks can also help ensure a greater reach for fact-checks through the same channels. In an election period, journalists can interview leading candidates from political parties about key issues in their manifestos on Facebook Live and Twitter. In addition to questions from the interviewers, space can be given for citizens to submit their own questions via WhatsApp and Facebook, to be asked in a segment of the interview, with the aim of focusing election campaigns on issues and policy promises. In general, I think journalists globally can try to do more to set the agenda on key and pressing issues, rather than always responding to what is circulating on social media (the US being a good example). But this is often difficult to align with the need to sell newspapers or raise revenue, as many viewers want more clickbait content.
  6. I think a whole range of stakeholders need to be engaged. Around an election this should include representatives of political parties (to discuss codes of conduct), the election commission (which can try to establish a credible voice on social media platforms), civil society, the media (who can try to ensure those codes are applied in practice), and so on. Social media platforms also have a responsibility to be present, and I understand they are.
  7. I think the key is more digital and civic literacy. One idea from recent research done in The Gambia is "listening clubs to discuss what is debated on radio talk shows and social media about politics at the village level across The Gambia. These would aim to stimulate further debate and discussion among citizens about transparent and accountable governance, as well as misinformation, and would be overseen by local moderators, trained on these themes". This can be combined with tips on how to spot false news, and with creating credible platforms that share more accurate news (recordings of radio shows as audio clips, or newspaper articles presented in a way that can be digested and shared on WhatsApp). The Continent (https://twitter.com/thecontinent_) is Africa's first WhatsApp newspaper, so I think these kinds of initiatives can help get better quality information circulating (to balance out the falsehoods). Propaganda has always been a feature of life, so for me the question is how we highlight the positive uses and reduce the negative ones; for that to happen, the key is more empowered and informed users who make those trying to sow division less relevant. I think what Finland is doing at primary and secondary school level is interesting as a long-term solution - https://edition.cnn.com/interactive/2019/05/europe/finland-fake-news-in… . Targeted digital literacy (in the case of Nigeria, targeting key religious and traditional leaders, for example) can have more immediate short-term impacts, but the longer-term goal has to be this kind of education-driven approach (as national as possible).
Stijn Aelbers Moderator

Dear all,

Week 2 has brought us some really practical examples and thought-provoking ideas, like:

Farm Radio highlights the importance of access to reliable information for local journalists, as they play a vital role in getting good information to people. One of the most successful interventions to achieve this was organising WhatsApp discussions allowing journalists to ask questions to experts. They also support journalists with training and tools to identify fake news, and create short messages in key languages that local journalists can easily translate into more languages.

Some interesting thoughts were shared around disinformation and the need for any response to work at the same speed and scale as it spreads - young people and influencers can help with this.

Interesting ideas around crowdsourcing insights online, by “non-experts” to flag disinformation, misinformation and heavily biased news items.

There also seems to be a need for better categories and definitions around disinformation and information "pollution": there's a difference between exaggeration, bias and fake news, and also a difference between someone who creates disinformation and someone who shares it, with different responsibilities for each. New terminology was also introduced, like the idea of "inoculation" as a way to identify "hallmarks", or indicators, of disinformation, which can also be explored through gamification.

There's also a lot of work and ideas around making companies responsible for the content on their platform:

In the UK, the LSE has issued a report with several recommendations, most notably to establish an Independent Platform Agency (IPA), structurally independent of government but reporting to parliament. The IPA would have an "observatory and policy advice" role on information reliability. It could report on trends in information and on the effectiveness of self-regulation, coordinate media literacy efforts, and have the power to request information from all major platforms, as well as the power to impose fines if platforms don't provide the requested data.

In the immediate term there is still an important role for the media, an urgent need for media literacy, and a need to invest in news innovation.

The EU DisinfoLab has also issued recommendations that could be applied more widely. One of them highlights the importance of funding a variety of actors, as there will never be a "one size fits all" solution.
