Managing Change in Media Space: Social Media, Information Disorder, and Voting Dynamics 2
18 Jun 2024 16:45h - 17:45h
Table of contents
Disclaimer: This is not an official record of the session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed.
Knowledge Graph of Debate
Session report
Full session report
Experts Convene to Address Social Media Misinformation and AI’s Impact on Democracy
During an in-depth workshop focused on the evolving media landscape, a panel of experts gathered to tackle the critical issues of social media misinformation, the influence of artificial intelligence (AI) on voting, and strategies to counteract information disorder. Journalist Vytautė Merkytė from Delphi moderated the event, introducing a panel that included Dr. Tilak Jha, Gabriel Karsan, Dr. Viktor Denisenko, Ieva Ivanauskaitė, and Aistė Meidutė.
Dr. Tilak Jha from Bennett University, India, began the discussion by reflecting on the Indian elections, which were significant for the extensive deployment of AI. He discussed the complex interplay between myth and truth, noting how AI and misinformation campaigns can have both positive and negative impacts on elections. While AI facilitated personalized political campaigns and voter engagement, it also led to the creation of deep fakes and offensive content, ultimately resulting in a casualty of truth.
Gabriel Karsan from the African Parliamentary Network on Internet Governance shifted the conversation to the African perspective, particularly Tanzania. He described social media as a tool for community engagement and feedback for the government, emphasizing the importance of understanding internet governance. Karsan highlighted the need for informed policy that considers the unique social dynamics and digital literacy levels within African societies.
Dr. Viktor Denisenko from Vilnius University addressed the geopolitical challenges in the Baltic region, including information warfare from Russia and Belarus. He underscored the importance of media literacy in building societal resilience against propaganda and misinformation. Denisenko acknowledged the difficulty in implementing media literacy education on a large scale, pointing to the lack of teachers and the need for educational reform.
Ieva Ivanauskaitė from Delphi presented the organization’s anti-disinformation initiatives, which include fact-checking tools, educational content, and collaborative networks. She emphasized the importance of engaging youth through platforms like TikTok and Instagram and the need for stakeholders to take action to change the status quo while also encouraging users to improve their critical thinking and use verified sources.
Aistė Meidutė, representing the Lithuanian Counter Disinfo Project DigiRes, discussed the project’s efforts to strengthen digital resilience by empowering community leaders with fact-checking knowledge. She highlighted the importance of equipping individuals with the skills to critically assess information and the challenges faced by fact-checking organizations in competing with more entertaining content on social media.
The session concluded with a consensus on the need for a multistakeholder approach to effectively address misinformation challenges. The panelists agreed that solutions must be tailored to specific regional contexts and that empowering community leaders and the general public with the skills to identify and counter misinformation is essential. The importance of collaboration between fact-checkers and social media platforms was emphasized as a critical step towards minimizing the spread of disinformation.
Key observations from the discussion included the recognition of the varying motivations behind individuals’ search for or spreading of information, with some aiming for profit at the expense of truth. The session also highlighted the importance of engaging with panelists and stakeholders to foster collaboration and promote democracy. There was a call for action to work together for greater impact, reflecting a shared commitment to reducing disinformation and enhancing democratic values.
Session transcript
Vytautė Merkytė:
So hello, everyone here. Welcome to the second half of the workshop. For the upcoming hour we’re going to be speaking about managing change in media space: social media, information disorder and voting dynamics. My name is Vytautė Merkytė, I’m a journalist at Delphi, and I’ll be your moderator. And we have a lovely panel of specialists here who are eager to share their thoughts, their information, everything that they have. Some of them are online and some of them are, as you can see, present here. So I’ll start by introducing the people who are online. And since this is an international event, obviously, there’s going to be this charming element of not knowing how to correctly pronounce someone’s name. So I’m very, very sorry if I’m going to butcher your names. Since my name is Vytautė, I’m very used to having this happen to me. So our first panelist is Dr. Tilak Jha. He is an associate professor at Bennett University, India. And please correct me if I pronounce your name in a wrong way.
Dr. Tilak Jha:
No, it’s perfectly fine.
Vytautė Merkytė:
That’s a surprise for me. Thank you. So our next participant who is online is Mr. Gabriel Karsan. He is Secretariat Support at the African Parliamentary Network on Internet Governance. Do we have him online? No. Oh, OK. And did I butcher your name?
Gabriel Karsan:
All right. So how much time do I have?
Vytautė Merkytė:
Oh, no, no, no. We’re still introducing the people. So if I understand, Mr. Gabriel is not online. OK. Thank you. Thank you. Thank you. So next to me, we have Dr. Viktor Denisenko. He is Associate Professor at Vilnius University. On my right, we have Ms. Ieva Ivanauskaitė. She is Innovation and Partnership Team Lead at Delphi. And over here, we have Aistė Meidutė. She is representing the Lithuanian Counter Disinfo Project, DigiRes. So the way it’s going to work here, we’re going to have each panelist present their talking points, and they’re going to have around six minutes to do so. Afterwards, we’re going to have a discussion, and we’ll be awaiting questions from the audience and also from the people who are online. So I think we can start with Professor Tilak, who is online. So are you ready to present your talking points? Yes, yes. Please go ahead.
Dr. Tilak Jha:
So first of all, thank you so much. I think it’s an immense privilege to be part of this fantastic discussion. Let’s start with the debate today about elections. India just had the world’s biggest and possibly most comprehensive election exercise, in which almost 1 billion people voted. Before we get into the detail of it, let me just say a few things. What we are essentially talking about is two things, two words, which have become very, very interesting in current times: myth and truth. These AI tools and misinformation campaigns don’t always have a negative impact. They also, at times and quite often, have a very positive impact, including in the election process. But what we do see on a larger scale is that truth has become a casualty. So that’s the first thing. And I think we are living in an era of TikToks and Instagrams, fragmentation of politics, the far right coming up in a very, very significant way in many parts of the world, including in Europe as well. Quite a bit of irreverence, I would say. I think this has become a trend in the younger generation, in the social media generation in particular. There is a tendency to be entertaining and rebellious. And that is a clear sign towards some sort of illiberal rule, if I may say it. Democracy is technically turning more direct, in terms of people’s ability to comment and share and respond to things being said. But at the same time, I think we often see biases coming from both sides. And this is the real tragedy of the misinformation thing, which is happening even at a time of something as crucial as elections. And talking about elections, just a couple of days ago we got to know that there have been at least two people who got elected recently in the European Union elections: one is a politician from Cyprus and another is a Spaniard. They both have hardly any political experience and not much higher education. Their primary qualification lies in railing against the political elite of the country, taking positions which mainstream politicians would find very, very difficult. And this is not just happening in one part of the world; this has happened in America, this has happened in many other countries, and I think in India as well. What we see is that it is not about left or right at times; it’s just about politicians and stakeholders. When there is something with stakes as high as elections and getting to power, they are willing to take shortcuts, and that’s where all the election-related misinformation comes in. When we talk about the Indian election of 2024, it went on for roughly six weeks, and this was the first election in which AI was deployed at this scale, because there was not much AI before this. What we initially saw was a bit of innocent videos, in which some politicians would use the technology to create some videos, to personalize their campaigns, but over the period it led to a situation where it was producing deep fakes, some of them really, really controversial, and creating issues.
There were some Bollywood stars who were caught up in this, who were basically found in deep fakes of content that was not theirs, and I think thereafter the election commission also started taking note. But this was not just celebrities from Bollywood or this world or that world; it was happening from the ruling and the opposition parties on both sides. And I would like to just point out that there is this fact-checking website, which I have gone into the detail of, and they have found that… Yeah, I think I have been hearing voices.
Vytautė Merkytė:
Yeah, that’s usually not a good sign, but I can assure you, we all heard them. So I’m very sorry for this, the situation.
Dr. Tilak Jha:
I think I would just like to cut this story short, within the six minutes that I’ve been allotted. In general, what we see are positive and negative things. So you have political parties, the BJP, the ruling party of Prime Minister Modi, and the opposition, and of course the regional parties; they’re all using information, misinformation and AI to a great extent, at times to speak in regional languages. For example, the BJP has been at the forefront of having the Prime Minister speak many regional languages. This was not something very easily possible earlier, but with AI, this has become fairly easy. There was one dance video with the Prime Minister dancing. I think the Prime Minister himself shared that video and commented on a light note that, well, I also found myself dancing for the first time ever. So those things have also happened, but there have also been some videos which have been very offensive. I think that legal actions have been taken, when the government raised the issue that, well, there could be legal action. I think some have also found it very, very easy to just escape the legal thing because they can’t be caught. One person at least shared a video of an opposition politician, Mamata Banerjee, in which she was again shown dancing. And one of the videos was really, really offensive, in which she was shown using a sort of remote to burn down a hospital. That was offensive for sure. In that case, the user who got in touch with the news agency said, I cannot be tracked and I’m not going to take down this video. So these are also the things. So positives and negatives have both happened. But what we do see is that AI has done a lot of micro-targeting and personalization. And quite a bit of misinformation has been used on both sides for that matter. In this misinformation thing, at least in this Indian election, what we found is that the opposition and the government were equal victims. At least one website called Logically Facts did roughly 224 fact checks. In their report, they said that almost 93 of them were against the ruling alliance, against the ruling party. And roughly 46%, which was almost 103 of the 224, were against the opposition party. So roughly, they were both targeted very, very significantly. That’s what we see. And of course, there were quite a bit of AI deep fakes and other things. The election commission also found itself at the receiving end initially, though there was not much AI content in this misinformation; it was hardly around 4% to 6%, not much. But even when it was normal misinformation, people would claim that it was AI, when it was just normal editing. Those sorts of things have also happened. At times, politicians have claimed that a video was fake, but fact-check organizations have found that the video was real; so that sort of misinformation has also happened. So quite a bit of, you know, trust has become a casualty, and I think that has been the biggest casualty, not just because of AI, but in general. The declining amount of trust in news agencies has created a situation where there is a general distrust, a general lack of authority, at least of an authority in which people have faith. And now this failure, for that matter, of the liberal political setup has definitely pushed the AI and misinformation thing in particular, and the use of AI for this, really, really further.
One senior election commission official went on to say that, well, we simply do not have the means to keep track of it. All we can do is complain to the social media platforms, and if they say that this is in line with the community norms, or for that matter, if they take time, there is not much we can do. By the way, AI has also helped in voter education. It has also helped in generating engagement. But if we do a comparison in terms of whether the benefits have been more or whether its limitations have actually been more, it’s the other way around. So these are some of the contours in which we can see the Indian elections: a lack of trust, a lot of fake news and misinformation. And there is a whole tendency to skew public perception, influence voter behavior, and even manipulate election outcomes. Thankfully, it has not led to a situation where we can say with any certainty that it has actually led to manipulation. But the jury remains out in terms of saying that, well, AI and misinformation campaigns didn’t really affect the election. At least there have been some incidents reported after the election that do point to this being a factor in some constituencies, at least in 5% to 10% of constituencies for sure, especially in populous states and states where literacy is lower. For example, in Uttar Pradesh, the largest state of India, at least in some cities there have been reports which do point out that misinformation did play a significant role. So I think this is the contour in which we need to see the Indian elections. And I look forward to questions to take it further, yeah.
Vytautė Merkytė:
Thank you so much. And I just want to remind everyone that this year is a very, very special year. This year, around 4 billion people are going to the election polls. India, one of the biggest democracies in the world, just had their election, and there are also going to be elections here in Lithuania and in the States. So around 4 billion people either were or are going to be affected by the disinformation that can be found during elections. Thank you so much. And I believe I can see Gabriel Karsan on the Zoom call. Are you ready to speak?
Gabriel Karsan:
Yes, I’m available.
Vytautė Merkytė:
Okay, so please present your talking points.
Gabriel Karsan:
Thank you very much. My name is Gabriel Karsan. I am the Secretariat Support for the African Parliamentary Network on Internet Governance. Briefly, what we do is we want to empower parliamentarians in Africa to understand internet governance as an ecosystem, as a means to influence policy and create further understanding. I would like to start with a simple reflection on what social media and social networking is, because for countries like Tanzania, and for regions like Africa, where we have been privileged to have leapfrogging in localizing social media, there’s a difference in how we relate to the system. Social media itself, social networking, is about bringing communities of people together. So if we talk about the internet characteristically being open and decentralized, there is an abstraction in which social media is a use case drawn on how communities align. And for a country like Tanzania, where we have a socialist background, a combination of almost 120 tribes and another nation coming together to coalesce, we do have some parameters of integration, which can also be viewed in the high abstraction of social media. When it comes to voting, frankly, in Tanzania we have gone through democratic processes, backed by the Greek mechanisms, where voting hasn’t changed much in terms of its forms; whether online or in a simple ballot box, the conceptuality of voting has always been the same. And when we have mixed it with the concept of social media, I think it has still been highly principled in how our community is viewed as deep representation at a centralized and local level. But beginning in the 2015 election, where we did have a higher penetration of digital tools and people understanding social media, we saw the political parties engaging with social media as a means to share their campaign objectives, and not much oppression happened in terms of the opposition there; it was an equal space, believing that we were a people still gaining the digital skills and the understanding of what social media is. I think we’re having technical difficulties again. Hello.
Vytautė Merkytė:
I’m sorry. Can you wait for a second? Can you hear us?
Gabriel Karsan:
Yes, I can. I do hope I’m audible.
Vytautė Merkytė:
So you can continue.
Gabriel Karsan:
Yes. Thank you very much. So as I was saying, in terms of our understanding of the technical space, social media to us has just been used as a tool for representation in our communities. But ever since the 2015 digital transformation, where the government provided a lot of incentives for young people to improve access and to actually improve what it meant to be on social media as a tool to embrace democratic values, we saw the impacts that came without a balanced understanding among people. Hence, this was a source of misinformation, and the misinformation has not quite been politically driven or politically themed. No, it has been an era of people’s particular skill sets not actually matching the power of social media. It has been just a representation of the rhetoric that happens on the ground. Hence, it has been sort of a natural coalescing around seeing social media for what it could be socially. But in terms of the voting procedure, it has been less a tool of oppression than a tool of deeper engagement, especially in the 2020 election, where we could see, with improved digital transformation and improved accessibility and affordability, that so many people could now express themselves in terms of representation, but could also use the engaging social media tool as a form of sharing constant feedback. And this is what representation in democracy has been. So for our community, that angle of voting and representation has aligned well with social media as a tool. But when we see what is happening now, as we go to elections next year, I am seeing a different rise in terms of the political use of social media to influence people, especially in our countries, where most of the infrastructure is highly controlled by the state. And this is by design, because we want to push for further inclusion in the social structure. But still, who regulates the regulator becomes a question for all of us to understand. But as young people, they are engaging. As young people, we are speaking. But the problem is that there is a divide in terms of the population dynamics. Most of the elder generation understands social media rather as a single, monotonous channel, whereas we young people see it as a cultural shift. And that’s the balancing that we need to do. Because in the end, we are still a representation of democracy at the very ground level, which is the decentralized nature of the internet that we see expressed highly by social media. And with these principles and parameters falling in line, I think it still comes down to one thing: informed policy. Informed policy in terms of a very dynamic and engaged population that actually understands digital skills as tools to help them represent themselves. But as I said, voting in the end has never changed in its form or its nature. We still have that secrecy. We still have that dignity of holding one’s vote in the ballot box. The confidence is online, but the confidence in digital systems and in interoperable and open systems, that is where the question arises, because of the regulatory nature. But in terms of the ground, the formality of what people understand, I think that is the dynamic which has been balanced, and Tanzania has been quite exemplary in that matter. I would also like to add that the conflict of interest we now have is that most social media has been created elsewhere and carries the bias of its creator.
It is highly Western- or Eastern-centric, hence the need for localization and homegrown solutions. And we cannot be blind to the dynamics that are changing now. There is a big geopolitical and geotechnology issue happening with China as well as with America, and as Africans, we are caught squarely in the middle. Unless we do a lot of work in understanding and owning the infrastructure that we need, the ownership that we need, then even the use case of voting via social media might still be influenced and might not come back to the basic principles. Hence, localization, a homegrown understanding of the context, of how social media is used and of the culture of a people, is something that has created a great dynamic shift in how we have aligned in balancing the dynamics of voting online and using the online channel as a representation of participatory democracy. I think those are my thoughts for now. Thank you.
Vytautė Merkytė:
Thank you so much. And I’ll turn to the panelists who are here live with us. Dr. Denisenko, could you please share your thoughts with us?
Dr. Viktor Denisenko:
Okay. Of course, we can see that every region has its own challenges when we’re talking about disinformation, some kinds of propaganda, and new technologies, including AI. And I will try to talk more from the perspective of our region. In our region, I could say that the main challenge is the geopolitical situation. We are living next to Russia, a state which a few years ago began open aggression against another country. We are living next to Belarus, a close ally of Russia. And in Lithuania, and in general in this region, in the Baltic states, in Poland, it’s not the first year that we have been talking about information warfare and the challenge of information warfare. I, as a young journalist, covered this topic for the first time 18 years ago, and I was not the first to talk or write about it. So, for us, it’s not a new challenge; it’s part of our reality, and this reality is also changing. Because before Crimea, or even before the year 2022, we talked more about propaganda warfare, a war of narratives, information warfare in terms of psychological warfare. Today, we are also talking about some kind of hybrid influence, when together with information warfare we have elements of physical or kinetic activities, including in our region, in the Baltic states and Poland. So, it’s a big challenge, a challenge for our security. And in this context, there is the question of media literacy, and I understand media literacy in a broad sense. It means we are talking not only about the ability to recognize fakes or disinformation and propaganda, but in general about how to use information, how to figure out which sources of information are trustworthy and which are not, how these information warfare activities can affect society, and so on and so on. In this situation, I could say that media literacy is a crucial thing. And in Lithuania, we have a kind of paradoxical situation, because I could say that in Lithuania we are lucky: the authorities recognize that the challenge of information warfare exists and that we are, in fact, in this tough situation. Also, we have the political will to do something about it. A lot of non-governmental organizations work in this field. The media support media literacy, and in our media we have a few fact-checking initiatives, so it’s a very good thing. And the main point, which we have been discussing in Lithuania, I think, for the last 10 years or even more, is that we need to put media literacy in the schools. And of course, here we have a discussion about how it should look. Should it be a separate course for pupils in the schools, or could it be integrated into some existing lessons or courses? And the paradox is that, in general, in Lithuania we have this political consensus. If you ask any politician, I think, do we need a media literacy course in the schools, he or she will say, yes, of course, it’s quite obvious. But still, I cannot see the implementation of this political will, because every time we try to talk about practical implementation, there are more problems. First of all, we need teachers. It should be part of education reform. In general, in Lithuania, there is a lack of teachers. And if we start preparing teachers today for this course, the first teachers will only be ready after four years, at the bachelor level of a university degree, and it will still not be enough.
So my point is that even in countries where an understanding of the challenges and the importance of media literacy exists, there are still problems with implementation. So I think we will stop here.
Vytautė Merkytė:
Thank you so much. So we touched on elections, we touched on media literacy and what can be done, and what’s actually not being done. And now it’s time to turn to these two lovely ladies, who will share some practical information on what they know and on what it actually looks like to fight disinformation. So, Ieva Ivanauskaitė, if you could go first.
Ieva Ivanauskaitė:
Yeah, just let me share my screen. Apparently, it seems that I have some sort of system restrictions. So I know that the organizers have my presentation; if you would be so kind to show it on the screen, it would be really helpful. But while they’re doing that, I wanted to say that what I am about to say is going to be a smooth transition from the first part of the workshop, workshop 2A, and from what Viktor has just said, because I’m going to be talking about solutions. And specifically, I’m going to be talking about the solutions of the organization that I represent, which is Delphi. It is the largest online news organization in the Baltic states and the most read online news media source in Lithuania. And we have been working quite hard in terms of anti-disinformation measures ever since 2018, or actually 2017, when the first project related to countering disinformation was born. Since we’re not seeing the slides, I’m going to try to visualize them for you. So on my first slide, you would see our specific initiatives against disinformation. We have three main areas: fact-checking tools, educational content, and collaborations. When it comes to fact-checking tools, we have a fact-checking department, the lead of which is sitting right next to me, and it’s called Melo Detektorius, or Lie Detector in English. We belong to many collaborative networks that unify fact-checking organizations globally and on a European level, and thus we’re able to make an impact not only in Lithuania but also beyond. When it comes to educational content, we have specific content and specific tools with which we’re targeting youth audiences. So for example, when it comes to Melo Detektorius, we have a campaign of media literacy videos on TikTok and Instagram, where we very briefly, and in a very simple manner, explain specific disinformation trends on topics that are of everyday relevance to users. For example, there was one video where we explained what a little sticker of a frog on a banana means. You would be surprised that it was a very trending disinformation narrative about such a simple thing, but we think that it’s relevant. We did that video and it acquired more than 100,000 views on TikTok.
Vytautė Merkytė:
Which is a lot for Lithuania. A lot.
Ieva Ivanauskaitė:
We only have 2.7 million people living in Lithuania, so I think that’s a huge achievement. When it comes to collaborations, we are part of a few networks, one of which, as far as I know, is going to be presented as well, where we are part of collaborative work with academia, NGOs, and media organizations, which I represent, and where we brainstorm together to find solutions for our specific markets. In our case, there are two organizations that we belong to. One is a Lithuanian organization called DigiRes, and another one is part of the EDMO hub that was presented in workshop 2A, the Baltic hub called Besit. We sit together and we find solutions on how to counter disinformation together. That’s what I was about to present on that first slide, but I’ll probably fast forward and conclude after this, because I feel like I’m talking too much. The last thing, when it comes to solutions, is technological innovations. We have rolled out a fact-checking bot on the Messenger platform under the Melo Detektorius account, but we will also be launching another one in collaboration with other fact-checkers in Europe to ensure AI learning from different languages when it comes to disinformation in real time. So basically, what I wanted to say is that when it comes to the best practices we can deploy, it is a two-way street. Stakeholders, meaning government institutions, media organizations, and NGOs, have to take measures themselves, which include fact-checking, collaborative networks to find solutions together, because we’re doing that and we know that it works, and creating engaging content that is relevant to their audience, whether that is the audience of a media organization or of another stakeholder. So it’s relatively clear what we have to do. We have to make the content easy to consume and readable, but, for example, if we’re talking about government institutions, they have to think about who their target audience is as well and how best to approach them. And on the other side of that same street are users and readers. What can they do? They can improve their critical thinking, and they have to use verified sources, which is a huge problem. And as Viktor mentioned, there has to be political will, and there have to be strong measures to change the status quo, which is not so good right now, unfortunately. And yes, they have to be willing to participate in those educational programs. So if there is no interest from the parties on both sides of the street, we will probably not see the impact, but at least I can say from a media perspective that we’re trying and we’re doing everything we can. Thank you.
Vytautė Merkytė:
Thank you so much. And I’ll just add, I think what was said is very important. I come from Delphi as well. And sometimes, as a media organization, we invite school children to see our office and to understand how journalists work. Quite recently, I had a group of 16-year-olds who visited us, and I asked them, how do you get your news? What do you guys read? And their reaction was, we don’t read news media like portals, newspapers, and so on. Okay, so how do you get your information? And their reaction was, TikTok. And then my reaction was, but do you realize that there’s a lot of disinformation on there? And they were like, yeah. And then I asked them to give me examples of disinformation that they saw on TikTok, and they gave a lot of examples. So it’s very important that people want to understand and find disinformation, that people want to actually increase their media literacy skills, because, for example, those 16-year-olds were okay with receiving disinformation. I just wanted to add this because it’s a very interesting point. And our last panelist here is Ms. Aistė Meidutė.
Aistė Meidutė:
Hello, everybody. Once again, it seems that today I’m wearing many different hats. Some of you already heard me during the first part of this workshop, where I was talking as part of Delphi’s fact-checking initiative, as an editor and fact-checker. And now I’m going to briefly talk about another thing that we did together with Vilnius University and different NGOs. It’s a project called DigiRes. This project was started with a very ambitious goal: to strengthen the digital resilience of society, to talk about disinformation, and to empower people to fact-check certain things themselves. And it was, and still is, a common initiative between academia, universities, media organizations, and independent journalists. And what was our approach to fighting disinformation, to talking about it? First of all, we were thinking about how to build trust in traditional media, because of course it’s a huge problem that people do not trust media organizations anymore. They tend to look for information on social networks. Just last week, I was fact-checking a claim where a woman declared that young Lithuanian schoolboys, right after school, are going to be sent to Ukraine to fight in the war. And when one person asked where this information was coming from, she said, I saw it on Telegram. So Telegram is now the leader that passes information to people, and not the official media. What we tried to do to change this, at least a little bit, in Lithuania, was talk a lot with regional media, with different media organizations, with NGOs, with different stakeholders, about how media works, about what fact-checking is, how fact-checking works, and why it’s important. Of course, the picture you see in huge cities is very different from what you see in smaller regions, media-wise as well. People living in smaller territories usually do not tend to think about this global perspective, about why we need to talk so much about this disinformation problem. And by connecting with regional journalists, with people from the regions, we get their perspective; we get the problems that they’re facing. The other thing that we did was equipping community leaders with knowledge of how to fact-check content themselves, how to look for sources. We equipped them with knowledge of how to consume content in a different manner. And one of the main techniques that we were talking about was lateral reading, where, instead of scrolling down the page, you come out of the article that you’re reading and search for different clues, like what people they mention, what events they mention, and fact-check information in that way, looking for more contextual information on the content that you’re reading. And I think one of the most important efforts that we’ve made was meeting with community leaders who pass their knowledge on to others. For instance, we tend to think of journalists as passing on knowledge, but it’s also doctors who pass on knowledge, different people, and people entrust them with their most valuable asset, their health; they’re looking for advice, for help, for answers to questions. There are also librarians who pass their knowledge to people who are not only looking for information but are also talking about the reality that they have to face. We’re talking about teachers who pass their knowledge to children. And all these different people need to be equipped with this knowledge of how to fact-check information.
Well, some of us could say, I don’t know, I’m a doctor, I’m a driver, I don’t need those fact-checking tools in my life, I have other problems. But we need to understand that this huge problem of disinformation is not going to solve itself. And it’s not only fact-checkers and journalists who have to explain what the reality is; we all need to have a sense of what the reality is. And the only way of achieving that is to have better knowledge of the most common tools, for instance, how to do basic fact-checking. And my general notion is that being a fact-checker, at least a mediocre fact-checker, if not a good fact-checker, is easy, and it’s pretty reachable for many members of our society. And it’s going to be our reality, given such a huge information flow. The other approach that we took was creating a pilot learning and teaching model for university communication students. I was leading the workshops on how to recognize the main disinformation narratives and how to use simple digital tools to fact-check information yourself. And of course, those young people had many questions regarding how to, for instance, talk with their parents or grandparents who deeply believe in conspiracy theories or are deeply affected by low-quality content and disinformation. Of course, each huge goal comes with challenges that we have to deal with as well. And one of the biggest obstacles is that, of course, fact-checking works, and it’s proven by university studies, but not enough people see it, because we constantly have to compete with, I don’t know, cute kitties playing all over the internet. And it’s hard. It’s a really hard task to talk about serious things and to attract people’s attention. So that’s why fact-checkers, of course, need help from the biggest social media platforms. Otherwise, we’re not going to be able to pass on the message that we want to share. And of course, when you work in this huge multi-organization project, you have the feeling that different stakeholders have pretty different goals, and they don’t always match; or even if they match, there’s another problem: we risk duplicating our efforts. And that’s what we see with the huge rise of fact-checking organizations, and of networks consisting of different fact-checking organizations: most of the time, we tackle the same disinformation narratives without looking for a common direction that we could take to reach something bigger, to reach progress. And the other thing that I noticed, and it’s a kind of self-critique, just between us, is that a lot of fact-checking organizations are driven by the approach of fact-checking singular claims. And this is how most of the partnerships with the bigger platforms work: we fact-check separate claims instead of looking at the wider context, instead of talking about influence operations and the actions that bad actors take. And it’s very important to see the wider picture; of course, one fact-check is not going to solve this huge problem. But I don’t want to leave you with such a gloomy message. I deeply believe, as I said before, that each of us can be a fact-checker. We just need this curiosity to do things, to explore the media world. It’s very, very powerful. So thank you for listening to me.
Vytautė Merkytė:
Thank you so much. And I can say, I don’t know about you guys, but I’m constantly working as a fact checker for my mother, for my father, for everyone in my family. And I guess everyone can relate to this. So we still have some time for questions. And I hope that either here in this lovely audience or from the lovely people online, maybe someone would like to ask a question and I see an eager audience member. Please go ahead.
Audience:
I’ve been hearing a lot about media literacy programs. And everybody knows that in media literacy, education and activities are important. But my question is, how do we do it at scale? Because we are, most of us, we are small, not well-funded civil society organizations. And if we bring a group and another group, we maybe help a hundred people, a thousand people. But how do you do this? Do you have any ideas? How do you do it at scale? Reach a large audience, thematic change?
Vytautė Merkytė:
I guess maybe Ieva, you could help find the answer here.
Ieva Ivanauskaitė:
Yeah. So coming back to the point that I emphasized during my presentation, first of all, you have to have the demand for change in a country. If there is a demand, there has to be one initiator, and one organization is enough, I think, as long as that organization is motivated enough to bring all of the stakeholders into one room to discuss the possible next steps; not the final result, but to outline the strategy that can later be divided into smaller steps that would allow for a big change. And this is what we are trying to do. We are still taking baby steps. But when it comes to the networks I mentioned, this is what we did. We had this idea that we wanted to collaborate with academia in the very first place. And then, little by little, we realized that we also want to collaborate with government institutions, with the ministries that should be interested in that, with media literacy practitioners, with NGOs that have hands-on experience in informal education. And we brought them all together, and there is a demand for it, and little by little we are discussing how this can move forward. So there has to be, coming back to the… Relations, right? Yeah, yeah. Different people and organizations interested in changing the status quo, with different capacities and skills that can complement one another.
Vytautė Merkytė:
We have a lovely observation from Professor Tilak, who is online. He says that we should look at the algorithms of media companies and how they are using, or rather misusing, the nature of the human mind, trying to get more likes, more attention, and so on. And I will turn it into a question, and it’s going to be directed to Aistė. Is it possible to fight disinformation on social media when social media companies are gaining so much by making people angry, making people engaged? So is there a way for us to actually work together with them to achieve this one goal of eradicating, or at least minimizing, the amount of disinformation?
Aistė Meidutė:
I believe that if we include all the platforms, and we sit at this round table with them, and we have the same goal, then yes, it’s possible. But we haven’t managed to achieve that yet. Some platforms, for instance Meta, are at least doing something. I mean, we know that this disinformation problem was largely created by social media, so they’re kind of tidying up after themselves, in a sense. At least they’re trying to do something now. But it’s not enough involvement. And then we talk about the problematic platforms, like, for instance, YouTube, which was sent a letter by the world’s fact-checkers but at first didn’t really react to it, and then tried to say, oh, we’re doing something in-house. But it’s not enough to do something in-house. I don’t see it, and probably you don’t see it either, how they try to reduce disinformation on the platform. And we need to have this huge collaboration between fact-checkers and platforms, and to look for mutual solutions for how to tackle this problem. Because as experts in this field, we probably know what to do. And the initiative shouldn’t come only from the platforms that say, we can govern ourselves. We don’t trust them anymore, I guess.
Vytautė Merkytė:
What I noticed when it comes to YouTube is that when you go to a certain video, you can see at the bottom that they say, oh, this video has information about COVID. And that’s it, you know. I haven’t noticed any additional work from them. We still have time for probably one question. Is there anyone who would like to ask something? Anyone who is online? Ah, OK. Yeah.
Dr. Tilak Jha:
Not a question, rather an observation, I would say. I think the previous panelists deliberated on the aspect of the algorithm and the human mind, but I think we tend to ignore how much logic can achieve. Logic is a double whammy. It can make both sides appear equally logical, which may not be the case. I think this is the challenge that we are facing in elections, in misinformation, and in information related to health and everything else. Just see during COVID, the time when people were dying in queues for oxygen, hundreds and thousands of people all across the world, and there were people who were busy minting money because somehow they were misusing the information. So there is logic, and this argument that we can live in a very, very logical world, when the entire limit of logic ends up with consumers and markets. I think that is the fundamental question that we need to address. AI is not a problem, but AI is not very natural. We also tend to ignore the fact about human intelligence and AI: what AI does is essentially club together a lot of information and recycle that information, and we tend to call it intelligent. That’s not what intelligence is. Intelligence is using less information to be able to tell more, more deduction. If I knew everything that Facebook and Google and Twitter and all the social media platforms know about you, I would be able to tell much more about any person. But with all this information, they’re just able to provide some basic information. This needs to be understood and taken in context. I think we have tried to make the world far more logical, including with the application of AI and all these things, and that’s somehow backfiring. We need to focus also on how to understand information. Information is empowerment. We are providing empowerment, but we are not providing people the sense to use that empowerment. That is the far more critical question.
Vytautė Merkytė:
Thank you so much. And I think it’s a very interesting point: some people were trying to find information, while others were spreading disinformation and gaining money from that. So it turns into trying to figure out what being human is. Do we want to get more money and spread disinformation for our own gain, or are we actually trying to fight it? And I would like to turn to Francesco to wrap the session up.
Reporter:
Okay, yes. Thank you very much. I’m Francesco Mecchi from Youth League. I’m here as a rapporteur, so my aim is to wrap up what has been said during the session and to see if there is broader consensus in the room in order to draft the message, which you can actually later edit, modify, or add comments to if you believe that I missed something. It’s a process that’s going to take place in the next few days, so please do so when it is shared on the platform. Okay, I’ll start from the context. We said that this year was really particular because 4 billion people were going to elections, and this showed, on the one hand, much potential for disinformation to produce problems in democratic institutions, but also a general distrust in democratic institutions as they are. We see a trend towards entertainment, gamification, and polarization in democratic societies. AI is used to foster disinformation practices and propaganda, not just with generative AI, but also with algorithms and machine learning, and also with tools like translation tools, micro-targeting campaigns, and targeting practices. And social media, of course, played a major role, especially in Global South countries, where it is actually a tool for constant feedback to the government. What are the solutions to this? Please keep in mind that there has already been a session, so I will skip things that we already discussed earlier, for example multistakeholderism and other approaches that have already been presented, especially by Delphi. Some of the solutions proposed are, especially, to widen broad media literacy; that means not just education on how to use media, but especially feeding a critical spirit, education and fact-checking, and adding it to the educational curriculum in schools with specific attention to implementation. Then, try to create a virtuous cycle between stakeholders and users: on the one hand, stakeholders must take actions to change how they provide services, but on the other hand, users need to develop critical thinking on their own and use verified sources. Fourth, diversify solutions depending on the region. We saw that misinformation for Central and Eastern Europe must be framed in geopolitical terms; for Africa, it has to do with decentralization of power; for India, it’s more related to micro-profiling and other practices. So we need to diversify solutions; we cannot think of just one solution for every kind of issue. And finally, empower community leaders with the knowledge and tools to detect myths and disinformation and to understand information, because they are important actors at the local level. Overall, the general approach should avoid West-centric or East-centric trends, so avoid both European and American attitudes and Chinese attitudes, and be really inclusive and global. And it must focus specifically on social media because, especially for the growing youth in the Global South, social media is most of the Internet they consume. Is there any specific objection, anything you want to add, any modification, or do you agree with the main message?
Vytautė Merkytė:
Well, I think you did a wonderful job.
Reporter:
Great. Okay, thank you very much.
Dr. Tilak Jha:
We need to engage more often.
Vytautė Merkytė:
True, that’s true. So, that’s what I wanted to say. I wanted to encourage everyone here to reach out to these lovely panelists. If you have any suggestions, any ideas, if you want to collaborate somehow. I believe that we’re all here and we all have the same purpose. We want to make sure that there is less disinformation and more democracy in the world. So, let’s, you know, collaborate. So, thank you so much for being here today.
Speakers
AM
Aistė Meidutė
Speech speed
150 words per minute
Speech length
1557 words
Speech time
621 secs
Report
During the workshop, significant attention was given to the collaborative DigiRes project, a joint endeavour by Delphi’s fact-checking initiative, Vilnius University, and several NGOs. The project’s primary objective is to strengthen the public’s digital resilience against the widespread threat of disinformation and to empower individuals to conduct their own fact-checking.
The presenter detailed the strategy behind the initiative, which concentrates on restoring public confidence in traditional media, considered essential due to widespread scepticism that has led individuals to seek out information from social networks instead. The spread of false claims, such as the myth circulating on Telegram that Lithuanian schoolboys would be sent to fight in Ukraine, exemplifies this issue.
The project counters this by collaborating with local media and stakeholders to demystify media operations and stress the importance of fact-checking processes. It includes facilitating discussions around media literacy in regional communities that may lack a global perspective. Interaction with local journalists enables the project to identify and tackle specific community challenges.
Further, the initiative has provided community leaders, including doctors, librarians, and teachers, with fact-checking tools and training in critical media consumption, such as the lateral reading technique. This involves stepping away from questionable articles to seek corroborating information from reliable sources.
The project also fosters the development of a pedagogical model for university students in communication disciplines, equipping them with skills to detect disinformation and validate information using digital tools. They are taught how to engage with family members who may be influenced by conspiracy theories or substandard information.
Nonetheless, obstacles have been acknowledged. Despite fact-checking being academically supported as effective, retaining public interest in the face of more appealing online content remains a challenge. Additionally, the emergence of numerous fact-checking bodies without a coordinated strategy could result in their efforts being diluted.
The presenter offered a self-critical perspective on the tendency of fact-checking groups to focus on isolated false claims rather than comprehensive narratives or manipulation campaigns by malign entities. Concluding with cautious optimism, the speaker noted that while the potential exists for individuals to become effective fact-checkers, successful counteraction of disinformation requires a stronger partnership between professional fact-checkers and major social media platforms.
Some platforms, like Meta, have started to address this, but others, such as YouTube, have been less proactive, underscoring an urgent need for the platforms to assume more responsibility and collaborate with fact-checking organisations to discover impactful solutions.
A
Audience
Speech speed
160 words per minute
Speech length
100 words
Speech time
38 secs
Arguments
Scaling media literacy education is challenging for small and underfunded organizations.
Supporting facts:
- Small civil society organizations often have limited resources.
- Media literacy education is essential for combating misinformation.
Topics: Media Literacy, Civic Education, Digital Literacy
Report
The imperative to enhance media literacy is increasingly recognised, particularly given its vital role in promoting informed citizenship and combatting the widespread dissemination of misinformation. However, small civil society organizations, which play a pivotal role in educational initiatives, face considerable challenges due to their limited resources.
This presents a profound concern as they struggle to scale up their media literacy programmes to cater to the public’s growing demands. The core argument here is that inadequate funding and support for small organisations hinder their capacity to broaden their influence and fully address the need for media literacy.
While this literacy is essential for nurturing a critically thinking electorate, the journey towards achieving this objective is fraught with logistical and financial difficulties. Echoing the sentiment of challenge, implementing scalable educational programmes is critical for expanding the reach of media literacy.
These scalable solutions are necessary for the growth of these vital programmes, ensuring they meet the exigencies of an increasingly complex information landscape. However, innovation and investment are required to create and run scalable initiatives, which are currently insufficient. These concerns are in line with Sustainable Development Goal 4, Quality Education, which emphasises the global imperative for inclusive, equitable education and lifelong learning.
This goal inherently includes efforts to enhance media literacy as part of quality education. In summary, the necessity for comprehensive media literacy education is undeniable, yet it is undermined by the practical constraints of resource availability within small organisations. Addressing the challenges of funding and scalability that these entities face is essential to the widespread and effective implementation of media literacy initiatives.
A broad-based support system, alongside innovative and sustainable strategies, is required to empower these educational endeavours. Ensuring the integration of such measures is crucial to overcoming the current uphill struggle and achieving the desired impact in civic education.
DT
Dr. Tilak Jha
Speech speed
178 words per minute
Speech length
2216 words
Speech time
746 secs
Report
The discussion offered an in-depth examination of the recent Indian elections, using this event to explore broader implications of AI and misinformation in electoral processes. The narrative centered on the dichotomy of myth versus truth, highlighting how technology can have both positive and negative effects within the politically charged arena.
The analysis stressed the magnitude of India’s elections, involving nearly a billion individuals, and discussed how AI facilitated the creation of personalized campaign content and deepfake videos. Initially harmless, this quickly escalated; deepfakes caused significant uproar, affecting public perception and involving both Bollywood celebrities and political figures.
Misinformation was not confined to one political side; the neutral fact-checking website ‘Logically Facts’ conducted 224 fact-checks and found misinformation campaigns targeting both the ruling party and the opposition, revealing a universal challenge. AI also had positive applications, such as increasing voter education and boosting engagement, an example being the BJP’s use of AI for speech translations, yet the predominance of misinformation and the misuse of AI overshadowed these benefits.
Notably, this led to an erosion of trust in electoral processes and the manipulation of voter behaviour. The speaker also underscored the difficulty legal systems face in countering misinformation, as individuals on social media platforms often evade culpability.
Additionally, a philosophical enquiry into the role of logic and information in society was presented, especially noting AI’s incapacity for deductive reasoning as compared to humans. The COVID-19 pandemic served as a stark reminder of how information could be misused for profit during times of crisis.
Ultimately, the discussion concluded that although AI holds promise for enhancing democracy, its current deployment in elections often accentuates the negative, especially around misinformation. Crucial here is AI’s inability to distinguish truth from deception. The analysis called for investment in education to foster discernment in information processing, ensuring individuals can responsibly navigate information.
The summary wrapped up by reinforcing the need to recognise the influence of market and consumer dynamics in the context of logic and AI, emphasising the importance of comprehensiveness in approaching the complexities of technology within society.
DV
Dr. Viktor Denisenko
Speech speed
109 words per minute
Speech length
671 words
Speech time
368 secs
Report
In the comprehensive narrative provided, the speaker examines the pervasive issues of disinformation and propaganda, compounded by the emerging challenges presented by advanced technologies such as artificial intelligence. The discussion is set against the geopolitical tensions within the speaker’s region, particularly given its proximity to aggressive states such as Russia and its ally Belarus.
Drawing on 18 years of journalistic experience, the speaker reflects on how nations like Lithuania, as well as the wider Baltic region and Poland, have faced information warfare tactics for an extended period, predating events such as the annexation of Crimea and the full-scale invasion of Ukraine in 2022.
Information warfare, once primarily psychological, has evolved into a sophisticated hybrid form that now combines deceptive information campaigns with physical actions, further complicating the landscape for these nations. Media literacy is underscored as a pivotal element in combating information warfare.
Defined expansively, media literacy is not solely the capacity to detect misinformation and propaganda but also the skill to discern varied sources and grasp the societal ramifications of information warfare. In Lithuania, there exists a concerted effort by both government bodies and civil society to enhance media literacy, demonstrated by a range of interventions, including fact-checking by various non-governmental organisations and media initiatives.
However, the speaker points out a paradox within Lithuania’s approach: despite a political consensus on the necessity of media literacy education, there is a noticeable lack of implementation within the school curriculum. The debate centres on whether to establish a distinct curriculum for media literacy or to integrate it into existing subjects.
Although politicians agree on its importance, the integration of media literacy into the educational framework is met with significant impediments, particularly the scarce availability of qualified instructors. This issue echoes wider concerns about education reform in Lithuania. Additionally, the speaker indicates that even if the training of teachers were to commence immediately, it would still take at least four years for the first cohort of qualified educators to emerge, and even then their numbers would remain insufficient.
The speaker concludes by highlighting that although there is a keen acknowledgment of media literacy’s significance in countries like Lithuania, it serves as an example of the global challenge faced in translating this recognition into effective educational measures. The consistent struggle to implement media literacy programs illustrates a worldwide gap between acknowledging its necessity and successfully addressing it in the face of evolving informational threats.
GK
Gabriel Karsan
Speech speed
171 words per minute
Speech length
1354 words
Speech time
475 secs
Report
Gabriel Karsan, the Secretary of Support for the African Parliamentary Network on Internet Governance, aims to empower African legislators to understand and influence internet governance policies effectively. Concentrating on Africa, with specific insights from Tanzania, Karsan provides an intricate analysis of the interplay between social media, politics, and civic engagement.
Tanzania, with its socialist heritage and ethnic diversity, has notably embraced social media to build community bonds and enhance political assemblies. Karsan recognises how social media in Tanzania has been adapted to match local cultures, thus impacting democratic engagement and civic participation.
He points out that even with the retention of traditional voting systems, Tanzanian politics since 2015 have significantly incorporated social media for party promotion and voter interaction, creating a digital arena for diverse views without suppressing dissent. Karsan raises concerns regarding misinformation, attributing this mainly to a lack of digital literacy amongst the Tanzanian populace.
The 2020 elections exemplified social media’s capacity to encourage conversation and ongoing interaction, marking a pivotal moment for participatory democracy. However, state involvement in digital infrastructure, although intended to foster inclusivity, poses important questions regarding regulatory responsibility. Karsan underlines the generational gap in social media perceptions, with younger people viewing it as a revolutionary tool, contrasted with the older generation’s conservative perspective.
Furthermore, Karsan stresses the importance of informed policy-making that aligns with a digitally adept population capable of leveraging technology for self-expression. Despite the rise of digital platforms, the integrity of voting—its confidentiality and the sanctity of the ballot box—remains preserved in Tanzania.
Karsan also discusses inherent biases in social media platforms, often reflecting the ideologies of their predominantly Western or Eastern creators. He advocates for a localised approach to technology and content to cater more effectively to the unique cultural and political contexts within African nations.
Finally, he examines the shifting geopolitical dynamics, particularly the tensions between China and the United States, and their implications for African countries positioned in between. He advocates for African leadership in managing digital infrastructure and policies to ensure adherence to democratic principles, even if social media were to be employed for electoral processes.
In summary, Karsan offers a comprehensive overview of Africa’s ongoing navigation through the myriad challenges and potential of social media in governance. His advocacy for localisation, digital literacy enhancement, and participatory democracy is vital for the advancement of internet governance and democracy across the continent.
II
Ieva Ivanauskaitė
Speech speed
139 words per minute
Speech length
1242 words
Speech time
535 secs
Report
The summary details a presentation from a representative of DELFE, which has asserted itself as a leading online news outlet in the Baltic states, with particular prominence in Lithuania. Committed to battling disinformation since 2017, DELFE has engineered a multifaceted initiative centring around fact-checking tools, educational programmes, and collaborative ventures.
Central to DELFE’s disinformation countermeasures is their dedicated fact-checking team, ‘Malwa Detectors’—also known as ‘Lie Detectors.’ This division operates not just within Lithuania but has extended its scope internationally through alliances with fact-checking networks, enhancing its global and European influence.
In the realm of education, DELFE recognises the importance of resonant and digestible content, particularly for the youth. This is exemplified by their effective use of social media platforms like TikTok and Instagram, demonstrated by a viral campaign tackling a disinformation story surrounding a frog sticker on bananas, which saw significant engagement indicative of DELFE’s substantial outreach.
A pillar of DELFE’s strategic approach is bolstering cooperative relationships. The organisation forms partnerships with academic institutions, NGOs, and media organisations as part of its collective efforts to fight disinformation. Within networks like Digitas and the Baltic Hub Besit, brainstorming and solution-focused discussions cater to the nuanced needs of varying markets.
Technological innovation plays a leading role in DELFE’s quest against false narratives. Noteworthy is the introduction of a fact-checking bot via Messenger linked to the ‘Malwa Detectors’, and the anticipation of a multilingual AI-powered bot that promises real-time disinformation detection, showcasing forward-thinking utilisation of artificial intelligence.
The presenter emphasised that to combat disinformation effectively, both institutional initiatives and user engagement are critical. They suggest that institutions should foster political will and form collaborative networks to devise collective solutions. For their part, users must cultivate critical thinking and rely on verified information sources.
The presentation concludes that the catalyst for change is the establishment of a unified coalition of stakeholders. It articulates the necessity for synergy among diverse experts to counteract disinformation collaboratively. Small yet strategic incremental actions can consequently lead to significant societal changes.
In summary, the presentation conveyed a pragmatic and optimistic narrative, affirming that slow yet tangible progress is being made in the fight against disinformation, as evidenced by DELFE’s achievements. A collective determination to confront disinformation was marked as pivotal in realising the success of such endeavours.
R
Reporter
Speech speed
169 words per minute
Speech length
600 words
Speech time
212 secs
Report
Francesco Mecchi of the Youth League offered a detailed synthesis of discussions from a session addressing the context of the present year, which saw an unprecedented 4 billion people participate in global elections. This massive democratic engagement raised concerns about the rising threat of disinformation campaigns, which jeopardise political integrity and undermine trust in democratic institutions.
The summary touched on current trends in democratic societies that exacerbate the problem of disinformation, such as the prioritisation of entertainment values, the gamification of content, and increasing political polarisation, all of which negatively affect political dialogue. The role of Artificial Intelligence (AI) was discussed, noting its potential for transformative impact but acknowledging its use in amplifying disinformation through deep fakes and algorithm-driven content dissemination that can skew perceptions.
Social media’s crucial role in the spread of disinformation was emphasised, particularly in the Global South, where it serves as the primary information source and a potential feedback loop between governments and the public. Resolving the false information crisis requires a multi-faceted approach prioritising media literacy, one that enhances both the functional use of media and a critical mindset among consumers.
Education systems should incorporate fact-checking and critical source assessment in their curricula to empower youth with discernment skills. The session underscored that stakeholder engagement is essential, fostering a cycle of action and informed scepticism between service providers such as tech platforms and their users.
Region-specific solutions were advocated, reflecting the varied manifestations of misinformation, whether as a geopolitical tool, a factor in power decentralisation, or linked to micro-profiling, as seen in Central Eastern Europe, Africa, and India, respectively. Local community leaders were pinpointed as key figures in countering misinformation locally.
Their empowerment with knowledge and detection tools is pivotal, given their significant influence. The summary stressed the necessity for an inclusive approach to combat disinformation, one that circumvents models focused solely on Western or Eastern perspectives and promotes globally applicable solutions that respect diverse cultural and political contexts.
In closing the session, Mecchi encouraged ongoing dialogue and feedback, inviting participants to further refine the consensus-based message through edits or comments on the draft, ensuring that the summary encapsulated the full range of perspectives discussed.
VM
Vytautė Merkytė
Speech speed
164 words per minute
Speech length
1433 words
Speech time
524 secs
Arguments
People have different motivations in their search for or spreading of information, with some aiming for profit.
Supporting facts:
- During COVID, people capitalized on information for monetary gain.
- Dichotomy between seeking true information and creating/spreading misinformation for financial benefits
Topics: Disinformation, Monetization of information
Engagement with panelists is encouraged for collaboration
Supporting facts:
- Vytautė Merkytė urges everyone to reach out to panelists for potential collaboration.
- The intention is to foster an environment of cooperation amongst those at the event.
Topics: Collaboration, Engagement, Panel Discussion
The common goal is to reduce disinformation and promote democracy
Supporting facts:
- Vytautė Merkytė spoke on the shared purpose of the attendees.
- The focus is on combating disinformation and bolstering democratic values.
Report
Recent dialogues on the dissemination of information have highlighted a tension between the pursuit of financial gain and the ethical considerations involved in sharing information, a dilemma intensified during the COVID pandemic. Whilst some have exploited the monetisation of information for financial benefit, there exists a contrasting ethos that champions the battle against misinformation and embodies a reflection on human values and moral responsibility.
In this discourse, the significance of collaboration and proactive engagement is emphatically underscored, particularly in the milieu of panel discussions with abundant partnership opportunities. Vytautė Merkytė emerges as a strong proponent, spurring individuals to engage with panelists for prospective collaborations, aiming to leverage the aggregate capabilities and insights of diverse stakeholders.
The intent is to cultivate a collaborative ethos that is foundational to the initiatives undertaken within such forums. The fight against disinformation and the fortification of democracy feature as recurrent motifs. Vytautė Merkytė has underscored the collective objective of attendees at recent forums, intent on counteracting the adverse impacts of disinformation and fostering democratic principles.
Merkytė’s statements embody a unified strategy, advocating a concerted response to misinformation through the strength of cooperative endeavours. These exchanges resonate with the objectives of the Sustainable Development Goals (SDGs)—notably, SDG 16: Peace, Justice and Strong Institutions, which advocates for the development of inclusive societies with accessible justice and accountable institutions.
Additionally, SDG 17: Partnerships for the Goals accentuates the critical function of worldwide collaborations in realising sustainable development, including the fight against misinformation. The prevailing mood from these engagements is optimistic, anticipating substantive outcomes from collaborative action. This reflects an inclination towards synergistic and principled approaches, aspiring not only to confront the proliferation of falsehoods for personal gain but also to foster a truthful, democratic information environment.
The conversation indicates that by consolidating efforts and maximising our collective capacities, there is tangible potential to make headway against the tide of misinformation and reinforce the pillars of democratic governance.