IGF 2023 – Day 2 – DC-Sustainability Data, Access & Transparency: A Trifecta for Sustainable News – RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> SPEAKER: We're sorting out some technical issues but we will get it working, for those joining online.

>> MODERATOR: Hi, good morning, good afternoon. Waqas, are you online? Are we able to hear you? We're checking to see whether our participants online can hear us.

>> WAQAS NAEEM: Hi, Dan, can you hear us, too?

>> DANIEL O'MALEY: Yes. Great. We can see you as well. Great. Thank you, everyone.

>> WAQAS NAEEM: Good morning.

>> DANIEL O'MALEY: Apologies for the technical delays. It wouldn't be an IGF if we didn't have a technical glitch. I think it's just a reminder of how important the internet is for all kinds of socioeconomic activities, and I'll pass it to Courtney Radsch, who will introduce our session.

>> COURTNEY RADSCH: Great. Thank you to everyone in the room here at Kyoto and everyone online and around the world. We're excited to have you here today because there are few issues as important as trying to figure out how to balance the need for technology, innovation, and governance while ensuring the sustainability of journalism and news media. We're the Dynamic Coalition on the Sustainability of Journalism and News Media, which we're going to shorten to DC-Sustainability going forward, because the full name is really long to say.

Today we're going to have a three‑part session.

First we'll hear from the authors of our annual report, and we'll have copies, and we'll invite you to sign up for the mailing list so you can receive updates. Then we'll have an open forum for folks online and in the room who are interested in this topic to share their own work and research. We have invited several people to participate in that, and we invite you as well. This is, again, part of our efforts to make sure this is a very dynamic, inclusive, wide-ranging discussion on the range of topics that sit at the intersection of Internet Governance and media sustainability.

And then we will have a Q & A session and we'll talk a little bit about the dynamic coalition and what our plans are for the next year.

And again, those are all set by the community. Every year we determine the priorities based on input from the community; this year, for example, we determined that data, access, and transparency are essentially what we call the trifecta for media sustainability.

So with that, I want to welcome everyone who is here in Kyoto. Thank you for joining us. And turn it over to ‑‑ oh, I should probably mention I'm one of the three co‑coordinators of the Dynamic Coalition. My name is Courtney Radsch. I'm the director of the Center for Journalism and Liberty at the Open Markets Institute, a think tank based in Washington, D.C. Daniel O'Maley is another coordinator. And our third coordinator is Waqas Naeem, who is online. Waqas, over to you.

>> WAQAS NAEEM: Thank you, Courtney. A warm welcome to everyone who has joined us online. Again, we apologize for the technical difficulties. As Courtney mentioned, at the Dynamic Coalition we're really interested in hearing your insights and perspectives on the digital policies and regulatory frameworks that affect news sustainability. This year the coalition has also focused on issues related to data-sharing, transparency, and data practices, especially in connection with big tech companies. As many of you might have noticed during the year, there seems to be a growing reluctance among big tech companies about the distribution of news on their platforms, sometimes in reaction to government policies and sometimes in reaction to industry demands. This is why we feel that although many of us are working on these issues at the international level, the work of those of you focusing on these issues at the local or national level also has a great deal of value for us, especially in discussions such as the one we're about to have, a sort of global collaborative effort or debate on the topic.

So we really appreciate your interest. Please feel free to use the chat to share your questions, comments, or suggestions and we'll try our best to include them during the Q & A session.

Thank you once again. Looking forward to an excellent discussion. Back to you, Courtney and Dan.

>> DANIEL O'MALEY: Great, thank you, Waqas. My name is Daniel O'Maley from the Center for International Media Assistance. Now we're going to hop into the first section of today's agenda, and that is the launch of our Dynamic Coalition's annual report. I have copies up front if anyone wants to grab one after the session is over. At the Dynamic Coalition, one of our goals is to make sure we are a multistakeholder group. We have about 80 participants coming from Civil Society, media regulators, governments, big tech companies, and small tech companies as well, looking at these issues holistically. We think that oftentimes the issue of news is reduced to just a content component and not other issues like the sustainability of news ecosystems.

And as a Dynamic Coalition we're not just active for this one day, one session, at IGF. We actually have a series of activities that we do throughout the year as we try to collect information. We have a series of coordinated learning calls and we create an annual report, importantly, to gather a snapshot of some of the most important issues facing news media throughout the year.

Our report is a compilation of articles contributed by our members. It's a collaborative, crowdsourced effort to capture these topics.

This year, three big topics came out through the contributions from our members. One is the power imbalance between media and tech giants: we have articles exploring how the dominant tech companies are redefining media sustainability by controlling advertising revenues and data, looking at that relationship between big tech and media.

Another theme that comes through in some of our articles is that government regulation is a double-edged sword. These pieces explore how data protection laws can both support and stifle media. There's a lot of work being done in this area, and Courtney is at the forefront of many of these discussions about how regulations like the DSA are impacting media. There are great things that can come out of them, but some challenges as well.

Then we also look into technological innovations and ethical dilemmas. This year's topic that everyone has been speaking about is generative AI, and what it means for the news media space, especially as large language models are trained on news content and as news organizations start using generative AI, and what these technologies mean for the practice of journalism as well as news ecosystems in general.

We have just launched our report online today. For those in the chat or online, I believe Laura is sharing the link. And you can grab a copy up front.

We're now going to have a chance for some of our article authors to discuss their contributions. We have been having technical difficulties, but I believe that Mike Harris may be online via Zoom. Mike, are you there? Great. Okay. Mike Harris, a brief introduction: he's a cofounder of XNM. He devised the rulebook system, with patents pending in both the U.S. and EU. He's an expert in privacy-by-design systems. His article in this year's report is Establishing Independence and Parity in the Era of Internet Giants. Because we've been a little delayed, I'm going to ask our speakers to keep their interventions to three minutes. Mike?

>> MIKE HARRIS: Hi. I'm not sure how I'll do with three minutes but I'll get going. Good morning. Today I'll present a new technology that will help news media to address the pressing challenges it faces with very large online platforms. The problem is simple. Today's digital infrastructure no longer supports the independence that the web was originally designed for. And that's vital for a healthy news media ecosystem.

And social media completely changed our information ecosystem. We moved from a model of competing with related products to one of opaque algorithms competing with all information: real news, content that looks like news, and state-sponsored information are all treated equally. Algorithms at Facebook and Google control how information flows, not only shaping our individual world views but directly capturing news media's primary revenue stream. This is a direct result of network utilities taking ownership of web governance.

Now, I'm speaking not as an expert in journalism but as a technologist who developed a tool to target this specific problem.

Decentralized rulebooks are a new tool. They're not a platform or a blockchain. They're public documents that anyone can create. Joining a rulebook gives users credentials to access web services and benefits. The framework provides a new model of Internet Governance without centralized authority.

How does news media leverage this new model? Rulebooks empower the news media industry to differentiate its online content by publishing under higher standards of integrity. In other words, content that claims to be news will need to live up to the standards of journalism defined in the rulebook.

Publishing under such a rulebook provides a trust signal which platforms and users can use to label and filter content. It's an authenticated blue tick that journalists can apply to increase trust in their publications. If they fall below standards, they can lose it and are subject to penalty when they want to get it back.

The pace of everyday life calls for basic binary signals. This is in stark contrast to trust signals which are not only highly contextual but diverse.

Here, the blue tick is enough to know that someone is accountable to upholding specific standards and rules.

To better understand the context consumers can click to read the rules and see who is managing them. We can see the article is moderated by the FT which is regulated by the FTA and claims to be news. That claim is voluntary so these rules can require truthfulness yet can't limit free speech.

The certificates are user generated so they can apply to a domain or a specific article. Moreover, they can even contain metadata so rules can mandate user tagging, say for adult content.

A crucial feature for journalists is the system's ability to protect anonymity while still ensuring accountability.

Journalists can publish in full adherence to rulebook standards without fear of direct identification.

By deploying a rulebook, the industry would be providing content governance as an information layer for applications and so it becomes easier to discuss what is fair with respect to governance.

Rulebooks will move us from asking how do we achieve an outcome to what do we want to achieve?

We are grateful to publish here today because we believe the IGF is the appropriate forum to initiate this bottom‑up user‑driven policy deployment. The IGF observes an equal footing between stakeholders and rulebooks empower groups and organizations to realize this equal footing, so if you're determined to address these challenges we would like to hear from you. So please do get in touch. Thank you for listening.
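The rulebook mechanics Mike describes, a voluntary credential that yields a binary trust signal and can be revoked when standards are breached, could be sketched roughly as follows. This is purely illustrative: the class and function names are hypothetical and do not reflect any real XNM API or the patented system itself.

```python
from dataclasses import dataclass, field

@dataclass
class Rulebook:
    name: str
    steward: str   # who manages the rules, e.g. a publisher or regulator
    rules: list    # human-readable standards publishers agree to uphold

@dataclass
class Credential:
    rulebook: Rulebook
    holder: str                # pseudonymous identity: anonymity with accountability
    revoked: bool = False
    metadata: dict = field(default_factory=dict)  # e.g. {"adult_content": False}

def trust_signal(cred: Credential) -> bool:
    """The binary 'blue tick': shown only while the credential is unrevoked."""
    return not cred.revoked

# Usage: a pseudonymous journalist publishes under a rulebook's standards.
rb = Rulebook("News Integrity", steward="FT", rules=["claims must be sourced"])
cred = Credential(rb, holder="anon-journalist-17")
assert trust_signal(cred)   # content carries the tick
cred.revoked = True         # falls below standards: tick withdrawn
assert not trust_signal(cred)
```

The point of the sketch is that platforms and readers only need the binary signal, while the contextual detail (the rules, the steward) stays one click away in the rulebook document itself.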

>> DANIEL O'MALEY: Great. Thank you so much, Mike Harris. I think one of the things I really like about the IGF as well is that we have people with the Technical Community, people with these types of expertise, that Mike brings to our group, who challenge us to think about opportunities and innovation in new ways.

I now want to check and see if Drew Clark is online and is able to join us?

>> PRUE CLARKE: I am.

>> DANIEL O'MALEY: ...a professor who has reported from over 20 countries for leading publications such as the Washington Post, has won the Edward R. Murrow prize, and leads New Narratives, a nonprofit newsroom that collaborates with Global South newsrooms on investigative journalism and news business building. With her coauthor Maureen Shiay (phonetic), she wrote Digital Innovation in Liberia's Media Sector: Challenges and Opportunities in Low-Income Democracies. Now, Prue, I'll pass the word over to you.

>> PRUE CLARKE: Thank you, Dan, and thanks to the Dynamic Coalition. I am glad to be able to flag that people in low-income countries are to a large extent left out of the opportunities that are coming from digital transformation, and I really wanted to highlight some of the work we've been doing in Liberia with New Narratives.

I wanted to highlight that there are 19 low-income countries left in Africa, according to the World Bank's latest rankings. We're seeing countries like Kenya and South Africa, and to a lesser extent countries like Ghana, where there are opportunities to build better digital products and monetize digital audiences. Those opportunities are coming from platforms and governments, and they're really not coming to these low-income countries.

We have just done work with Liberia's leading newsrooms and brought in business and editorial leaders from Premium Times in Nigeria, which, as a lot of you know, has been breaking ground in Africa on digital innovation: great new digital tools to do journalism but also to build an independent business model.

We brought them to do a sort of Hacks/Hackers event, which had never happened before. It was a complete revelation to the Liberian news media that they could make money from social media. They didn't have the technical capacity to develop apps, websites, or donation and membership offerings, and they had never really even heard of these opportunities before. They're still completely dependent on government advertising and on newsmakers paying for distorted coverage. Even when they really, really want to do great journalism, and there are a lot of dedicated journalists in these places, they haven't had the opportunity to build these revenue streams like in other places.

And they've not had the opportunity for grants to fund their journalism that media in larger countries have had.

These are emerging democracies, and extremely fragile ones. We've seen eight military coups in the region in recent years.

So I don't need to tell you how important this is to building democratic resilience and averting the instability and food insecurity that are driving migration to Western countries now and will only increase going forward.

So it's really vital.

At the same time, news media in these countries have had no protection from the digital platforms. They're completely consumed, or subsumed, by them. Facebook is the internet for two-thirds of Liberians.

The journalism content comes to them from Facebook.

But just to give you an example of how left out of the conversation they are: last year the Daily Observer, the country's second largest newspaper, was hacked, and the hackers started sending out images of scantily clad women from its Facebook page. Facebook ignored all the pleas for help from the Daily Observer, and it was only when the UK grant stepped in that Facebook returned ownership of the page to the media house. By that time they had lost a massive number of Facebook followers, and their rival now has seven times the number of Facebook followers.

I just want to say, I understand why bigger economies have an appeal. They have a larger middle class to monetize and more technological capacity. But I think that misses some really important advantages of quality news media in low-income countries, two really. First, they have a captive diaspora audience who can't get their news from anywhere else. That audience is relatively wealthy, and it lives in democracies that place a high value on journalism and its role in democracy.

Secondly, these news rooms have very low labor costs. One of the leading news rooms in Liberia told me they operate on just $50,000 a year.

So you can imagine if they could convert just 1,000 audience members in the diaspora to pay $50 a year they would double their annual revenue immediately but they've had no opportunity to develop a membership model or any donation opportunities.
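Prue's back-of-the-envelope numbers can be checked directly. The figures below come from her remarks; this is just the arithmetic, not a real revenue model.

```python
annual_budget = 50_000   # reported yearly operating cost of a leading Liberian newsroom (USD)
members = 1_000          # diaspora audience members converted to paying supporters
fee = 50                 # yearly membership fee per member (USD)

membership_revenue = members * fee

# New membership income alone equals the entire current budget,
# so total revenue doubles.
assert membership_revenue == annual_budget
```

The arithmetic holds: 1,000 members at $50 a year is exactly the $50,000 these newsrooms operate on, which is why a small investment in membership infrastructure could be transformative.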

The media bosses estimated to us that just $10,000 would be enough to get them to the next level where they could really take advantage of the opportunities to monetize audiences. It's a small investment compared with the amounts of money that we've been spending to build these democracies in the first place. I think it's really essential that we start to think about this.

So, my recommendations, just quickly: donors should really start thinking with more nuance about supporting news media in these countries. They need to understand the opportunities that are available for digital transformation, to build digital revenues and independent revenue streams. They need to get away from just supporting community radio and treating that as the end of the story.

They should give grants to fund journalism in low-income countries. They need to work a little bit harder to find it, but if it's true that donor funding is needed to support journalism in America, it's triply true in these places.

>> DANIEL O'MALEY: Thank you, Prue. We have to move on, but one point: it's important when we have these conversations about news that we incorporate not just what's happening in the EU or North America but the entire Global South, including Least Developed Countries. And the issues of safety and flagging can also have impacts on media sustainability and users. So thank you very much.

I'm now going to pass to our next author, Juliet Nanfuka. She's an advocate working to improve internet governance structures and digital accessibility and inclusion in Africa. She works at the Collaboration on International ICT Policy for East and Southern Africa, also known as CIPESA.

She contributed an article titled Navigating the Uncertainty of Generative AI in Ugandan Newsrooms.

>> JULIET NANFUKA: Thank you very much. And good morning, everyone. Please excuse me; my flu is acting up this morning. I'm from Uganda, and I looked at the generative AI landscape in the country's newsrooms. I was really excited to do it, but I found the landscape was very limited, and on top of that few wanted to actually talk about it, due to the uncertainty and perhaps the limited confidence they had to talk about it, even as media owners to some extent.

Nonetheless, the few people who did speak about it are looking at it through the lens of their own perceptions of it and of where the media industry sits in the greater scheme of things when it comes to the global media sector.

I found that in comparison to other countries, we're struggling desperately. As Prue pointed out, some of the challenges in Liberia are similar to Uganda's. We're right next door to Kenya, but Kenya behaves very differently in terms of how its media structures itself as a business, how it engages with journalists, and how it manages retention. Those are issues we're struggling with in Uganda. Now the whole AI realm is introducing yet another layer to the media industry, and it's creating a sense of uncertainty while not sparking a sense of urgency in the sector, which I found rather alarming, especially as Mike pointed out that other countries are talking about rulebooks. They're at a very different level while we're still struggling to get ourselves organized even at a media guild level, at least in comparison to neighboring Kenya.

So we're navigating way too much, trying to play catch-up with more advanced countries who have long since moved on. And we're not thinking about what AI is going to do to the landscape; at the global level we're aware of it, but we're not really factoring in what it will do at the national level.

When I interrogated some media houses, which have at least some interest in investing in the AI space, it was very apparent that budget or funding for it is not a priority.

Such interest has then fallen onto individual journalists, who are already struggling with poor salaries and struggling to keep up with the trends in the media sector. Those who can afford it have taken the leap in educating themselves, but how they will apply it in the media house is another issue that many are yet to make sense of.

There is an appreciation of how AI can be used to do big media stories and big data analysis, but again, those are not necessarily the stories that media houses are chasing.

One thing I picked up was a lot of reliance on the Civil Society sector to upskill journalists in that regard and to push for certain types of content in the media.

However, it is still a slippery slope, as Civil Society should not really be pushing content; journalists should go out and look for their stories. But I guess they're looking at it in terms of skills development. One of the arguments that emerged is that it's not necessarily the role of Civil Society entities to train journalists: you're letting media houses off the hook for training their own journalists. That's one thing that stood out for me. That's it? Okay. Thank you.

>> DANIEL O'MALEY: Great, thank you, Juliet. I think this is something everyone is talking about: generative AI, the topic of a main session. It's not just content production; it's business models and how people are engaging with news.

Now I'm pleased to pass the mic to Courtney Radsch, who has already introduced herself. She has contributed an article titled Weaponizing U.S. Copyright and...

>> COURTNEY RADSCH: Thank you very much. I've long been interested in how news media is constrained around the world. What I found, having done hundreds of interviews and surveys of journalists around the world, including providing technical assistance to get accounts back when they were closed because of copyright violations and, increasingly, privacy violations, is that the techno-legal regimes of the United States and the European Union shape the visibility and viability of news media worldwide, and yet there's insufficient recognition of that by policymakers in those countries.

Furthermore, the dominant tech platforms, particularly the Meta and Google platforms, which provide the publishing, audience, and monetization infrastructure for most of the news media worldwide, create strain or create potential. They have to translate these laws into technical policies and content moderation rules that affect news media around the world.

U.S. and EU copyright law, in particular the Digital Millennium Copyright Act (DMCA), and privacy laws like the GDPR are inscribed into the global tech platforms through their AI systems, their content moderation procedures, their terms of service, and specifically things like notice-and-takedown or notice-and-staydown requirements, hashing, and filtering. We see tech platforms adjust algorithms to reduce the availability or the virality of problematic content, for example, or to prevent its upload in the first place, which is what the DMCA incentivizes. YouTube has interpreted the DMCA to create hashing databases that filter content at the point of upload. So we see in the United States, for example, that police officers deliberately play very popular copyrighted music in order to trigger the automated filters that will prevent journalists and citizen journalists from filming them when they are committing police abuses.

That tactic has actually now been instructed to police in some precincts. So it's really problematic that an existing legal regime is being weaponized and there's no penalty for doing so.
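The upload-filtering mechanism behind this abuse can be illustrated with a minimal sketch. Real systems such as YouTube's Content ID use perceptual audio/video fingerprints rather than exact cryptographic hashes, so they also catch transformed copies; the function names here are hypothetical, but the principle is the same: every upload is compared against a database of known copyrighted works and blocked on a match, regardless of context.

```python
import hashlib

# Hypothetical database of fingerprints of known copyrighted works.
copyrighted_hashes = {
    hashlib.sha256(b"popular-song-audio-bytes").hexdigest(),
}

def allow_upload(data: bytes) -> bool:
    """Block the upload if its fingerprint matches a registered work."""
    return hashlib.sha256(data).hexdigest() not in copyrighted_hashes

assert not allow_upload(b"popular-song-audio-bytes")  # matching content blocked
assert allow_upload(b"original-reporting-footage")    # original footage passes
```

The weaponization Courtney describes exploits exactly this context-blindness: footage of police misconduct with a popular song audible in the background fingerprints the same as an infringing music upload.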

Furthermore, because the tech platforms based in the U.S. have a responsibility to implement the DMCA worldwide, it results in a de facto global copyright regime. They only receive safe harbor if they prevent the copyrighted content from being uploaded, but they're also not responsible for ensuring...and there's also no meaningful action against content farms that plagiarize that content. We know, for example, that on Facebook as much as 60% of engagement comes from instant articles based on scraped content.

What we're seeing around the world is that criminals and corrupt officials and others are weaponizing these techno-legal regimes. This is because of a failure to rectify loopholes or enforce penalties for false copyright claims, and a failure of technology platforms to redress recurrent abuses and improve safeguards for the media.

We need to address these failures if we want to have viable news media around the world.

>> DANIEL O'MALEY: Thank you. Now we turn to Juliana Harsianti, who contributed an article about freedom of speech and the media. Juliana is an independent journalist and researcher with more than a decade of experience in the industry. She's a long-time voice advocating for a multilingual internet. She currently serves as the secretary of the Indonesia Internet Governance Forum. You have about three minutes to talk, Juliana.

>> JULIANA HARSIANTI: I'm sorry, I can't open my video, because my connection here would be compromised.

My article is about regulation in Indonesia and its connection with journalism sustainability. As digital technology has become immersed in daily life, it has also affected journalism practice in many ways.

The Press Council has recorded more than 1,000 news outlets in the country, with more than 900 operating in the digital space. With all these changes, the digital space is also offering new ways for journalists and activists to express their voices on certain issues, like writing blogs and publishing videos on YouTube.

As digital technology has become immersed in Indonesia, the government decided it needed regulation to make things easier and safer for citizens who work and live with digital technology.

In 2008, the government launched the Electronic Information and Transactions Law, whose main purpose is to protect people conducting transactions and e-commerce. Even though it was meant to protect digitalization, the law contains clauses that allow people to file lawsuits against other users whose social media posts they find offensive. Since its launch, this law has been used by powerful entities to silence and punish critical journalism and journalists, who often voice their concerns about corruption and violations by the government or by Private Sector actors with greater economic power.

The most common reason is retaliation after a journalist has published a news article holding officials accountable for corruption and violence.

The formation of regulation to protect data continued last year, when a data protection law was passed by the parliament. But it's still not clear how it will affect digital sustainability and freedom of expression, because the earlier problematic law is still enforced, and Civil Society, journalists, and some organizations are still struggling to appeal to the government to revise its problematic clauses and articles. So that's it for me.

>> DANIEL O'MALEY: Great. Thank you, Juliana. We'll be paying attention as elections come up in Indonesia to see how these cyber laws are enforced and how they may impact news media, journalists, and the information ecosystem there. Last but not least, I contributed an article, Bridging the Gap in the Digital Age.

Essentially, we identified that data is very important. People use the analogy that data is the new oil. So how can we mine this new resource for the media development goals we have? Oftentimes in our communities we say: if we just had the data; we need the platforms to share data. But one question we wanted to look at is exactly what type of data we are talking about. What I tried to do in this article is define the different buckets, because different people working in this space on different topics require different types of data, and different types of data could be useful to them.

So what I do in this article is look at those buckets.

The first bucket is audience engagement and monetization. This is data that tech platforms hold that could inform a better understanding of the ad tech stack and of how much money is made off of news content. This is the type of data that is of particular interest to news organizations when we're talking about News Media Bargaining Codes. I see that as one bucket.

The next bucket is data around content narratives, user behavior, and coordinated behavior. Oftentimes in our space we're talking about online violence against journalists, especially female journalists, and the type of data you would need to marshal to understand those kinds of challenges is slightly different.

And this is also an area where there have been significant challenges in getting the tech platforms to share that data. But understanding what that data is, is really crucial.

Then the third bucket, which I think has been talked about less in our space as I've had these conversations over the past year, is data around cybersecurity. Oftentimes the big tech platforms or other companies like Cloudflare have a sense of what type of cyberattacks are taking place against individual journalists or news organizations, sometimes even whether they're state-sponsored. But when security operates correctly, we don't even know that it's happening, right? Yet those attacks are happening, and this data could be really influential in trying to proactively protect organizations that are under attack but might not know it.

So that was another bucket of data that I think could be really helpful.

I think this is an important issue because it will be easier to engage with the people who possess this data if we have more clarity around what we're asking for and for what purposes.
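One way to picture the standardization suggested here is a simple request schema keyed to the three buckets above. All the names below are illustrative assumptions, not an existing standard or anyone's real API; the point is only that a shared vocabulary makes a data request concrete and auditable.

```python
from dataclasses import dataclass

# The three buckets described in the article (illustrative labels).
BUCKETS = {
    "audience_monetization",  # ad tech stack, revenue from news content
    "content_behavior",       # narratives, user and coordinated behavior
    "cybersecurity",          # attacks on journalists and newsrooms
}

@dataclass
class DataRequest:
    bucket: str
    purpose: str    # stated research or policy purpose
    fields: tuple   # the specific data points being requested

    def __post_init__(self):
        # A standardized request names exactly one recognized bucket.
        if self.bucket not in BUCKETS:
            raise ValueError(f"unknown bucket: {self.bucket}")

req = DataRequest(
    "cybersecurity",
    purpose="proactive protection of targeted newsrooms",
    fields=("attack_type", "target_domain", "suspected_origin"),
)
assert req.bucket in BUCKETS
```

A schema like this would let requesters state both the bucket and the purpose up front, which is exactly the clarity the article argues would make engagement with platforms easier.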

This is especially true this year, as we're actually seeing backward trends in data sharing in some ways: trust and safety teams have been cut back, there's less engagement by platforms, and there seems to be a kind of race to the bottom as we get more digital regulation and companies start treating this as a compliance issue with laws, especially those coming out of Europe.

But there's also hope in there because some of these new regulations are including transparency components that require the sharing of some types of data.

I think there's also hope in that, in these efforts as well.

As for the path forward, I think standardizing our data requests and having a better sense of what specific data we're talking about is going to be really helpful.

I think cross-sectoral alliances for equitable data utilisation are one thing we're thinking about. With the media accelerator partnership with Microsoft, there's some hope that these types of partnerships can lead to something; whether that will pan out is hard to know.

Obviously data is becoming even more important in the artificial intelligence space, which depends on data, and that's one of Microsoft's comparative advantages, so we'll see how that works.

We also need a global approach to data transparency and media support. This goes to a point Courtney was making about how policies made in the EU and the Global West, including data transparency policy, might impact other places in positive ways. But we've also seen data protection laws implemented in places where, without the right guardrails, they actually impinge on press freedom.

We need to really think about what we're doing where we are and how that impacts others.

So that is, you know, all a plug. Hopefully we've given you a little teaser of everything that you can find in the annual report. We have copies here that you can pick up after the session, and we also have copies online.

That's all for this portion of our agenda. I think we're going to need to skip the Q&A or combine it, because we have a lot to get to in the next 40 minutes. So I would ask people to hold those questions. As we move to the next section of our agenda, Waqas is going to introduce the member spotlights; maybe we combine those all and hear what some of our DC sustainability members are working on, and then we'll have some open discussion that can include questions to the authors. I'll pass it over to Waqas.

>> WAQAS NAEEM: Thank you, Dan. We'll now invite, one by one, our members who expressed interest in speaking to the session to share work or updates on recent or upcoming initiatives that they or others are taking related to news media and sustainability. In the interest of time we ask all speakers to limit their interventions to three minutes so we can accommodate as many speakers as we can and still leave time for open discussion after this.

We also request our speakers, and this is something that the Dynamic Coalition is really interested in, to share any inputs regarding areas for future research or stakeholder engagement that may be facilitated by the Dynamic Coalition.

First, I believe we have a written message from Michael Markowitz, from a think tank in South Africa whose work resulted in the drafting of global principles for news media and big tech companies. Dan, do we have a written statement?

>> DANIEL O'MALEY: Yes we do. Nick Mana questa (phonetic) from CIMA is going to read it.

>> AUDIENCE: I'm just reading out a message from Michael Markowitz. GIBS is the Gordon Institute of Business Science.

Michael wanted to draw attention to an important event held in July in Johannesburg entitled Big Tech and Journalism: Building a Sustainable Future for the Global South.

That event brought together 70 different people in the industry, from 24 different countries, to discuss solutions to the crisis of sustainability of journalism and its intersection with the role of major tech platforms.

That event focused in particular on news bargaining codes, such as the one in Australia, the one in Canada, and a model law being discussed in Indonesia. The speakers came to a number of interesting conclusions, which he's outlined in his written contribution today.

Owing to limits on time, I'm not going to read all of the conclusions that they reached. But I think the most important finding, or conclusion, to highlight is that they concluded with a statement of principles: the adoption of the Big Tech and Journalism Principles for Fair Compensation. We can probably share the link if someone online has it.

Otherwise, we'll share it with the group later, if you don't have it already.

These principles are not just applicable to the Australian‑style bargaining codes but are intended to be universal, so they serve as a framework for any country seeking to address media sustainability through these mechanisms.

So far, these are endorsed by 101 individuals from 28 countries.

The principles are intended to take forward in three ways that he highlights in his message to the group. First, Michael and the signatories hope these principles can be used as a campaigning document for all stakeholders in lobbying for new mechanisms to address media sustainability through fair compensation.

They're keen to build alliances with networks you may be involved with and other think tanks and Civil Societies to work on this.

Second, they hope to submit the principles as part of the Civil Society filings to the South African market inquiry.

They're using these nationally in reform efforts in South Africa.

And thirdly, they believe the principles and other conference outcomes have highlighted areas for further research. So, for those who are researchers, look to those principles for a continuing research agenda. Thanks.

>> COURTNEY RADSCH: Thanks so much. If anyone in the room wants to learn more, I was there, so I'm happy to answer any questions.

>> WAQAS NAEEM: Thank you, Nick and Courtney. We also shared the link for the principles in the chat. Next up, I believe Ana Cristina, senior programme specialist at UNESCO, is in the room. Ana, if you're able to join us, please go ahead.

>> ANA CRISTINA RUELAS: Hello, everyone. Thank you very much. UNESCO has been developing guidelines for the governance of digital platforms that will be released on the 25th of October this year.

And I think it's important to say a little bit about how these guidelines relate to media sustainability and the safety of journalists.

So the guidelines are affirmative about the need to involve the media and its professionals in whatever kind of regulatory processes there are, independently of whether it's self‑regulation, co‑regulation, or statutory regulation. The guidelines outline that any kind of governance system should promote dialogue between the media and the digital platforms to encourage investment in independent media and to support actions for sustainability, diversity, and plurality.

Third, the guidelines stress the due diligence processes that the platforms should put forward in order to assess the human rights impact of their treatment of independent news publishers' and journalists' content, to ensure equal treatment of news organizations, and to establish guarantees against misuse of reporting rules, especially those designed in bad faith to censor journalists.

But how? We developed another set of consultations specifically to identify the types of data needed in relation to the safety of journalists and media viability, and we will release a policy brief that mentions the specific data asks.

And the different challenges that we are facing when it comes to that access.

And then, after the publication of the guidelines, we're going to start the process of implementation, where we will define, through a multistakeholder network established in different regions, particularly Latin America, Africa, the Arab States, and Asia‑Pacific, a work plan and indicators for follow‑up on these principles. What we want to understand is how this could be made real and operational in the different contexts, considering the different realities, and what is actually needed in each of the regions. Thank you.

>> WAQAS NAEEM: Thank you, Ana. Next up I will request Julius Endert from DW Akademie to share.

>> JULIUS ENDERT: Yes, I'm here. Could you show my slide here in the room? Is that possible? I'm from DW Akademie, part of Deutsche Welle in Germany. This is about AI development. We see AI as a disruptive technology for media. So we started thinking: what's the disruption for our sector, media development?

And it will be just as disruptive for us as it is for media. I think this is clear.

And we started with our own consultations about the question. I think the strongest quote is from Walid, a professor and senior lecturer at Sodertorn University in Stockholm. He said we need to build branches within media development explicitly for AI; otherwise, we will become irrelevant as media organizations ourselves.

So it's about the sustainability of the media sector as a whole.

I think it's good to visualize it. We developed a kind of pilot four‑step approach for AI impact in media development.

So we have to ask ourselves: how can we still have impact in the era of AI, and how can we use AI to have more, and maybe different, impact?

And for us it's crucial, and I learn this every day myself: we need to understand the technology, and have a deeper understanding of the technology, in the first place.

And that also means that we need to build capacity within our own organizations. So we need to have technical expertise. We need to have technical knowledge. We need to test things. Otherwise, we cannot build on AI or mitigate the risks of AI.

And then, on the same level or even more important: we need to analyse and research, and this was the question before, what is the impact on freedom of speech and human rights? There are so many aspects of how AI is influencing freedom of speech, and there's maybe also a philosophical question: do we need to argue for freedom of speech for AI? So this is also a question.

And then at the next level we need to detect the gaps in every field of media development and develop capacities. This includes media viability and sustainability, digitalization, media literacy, journalism education, advocacy, and more, because every field will see the effects of AI. This is also clear. And then at the last level, we need to see what impact we could have with AI and how we can mitigate risks from AI. That means we as a sector also need to develop our own ethics and our own positions, and we need to be more innovative in how we use AI in our own projects with our partners.

And the whole thing, and this is what we hear a lot, AI is so fast in its development, especially generative AI, so this needs to be an iterative process, and we need to do it again and again.

And it's also clear that we can't do it alone as one organization. We need to collaborate and involve our partners, like you from Uganda. I think we are on the same page in developing these kinds of approaches. Yeah, this is what I wanted to contribute. Thank you.

>> WAQAS NAEEM: Thank you, Julius. Thank you for pointing out the opportunity for collaborations and engagements. People from different parts of the world working on the same issues.

Next I'll request Michael Bach, the executive director of the Forum on Information and Democracy, to please share their update.

>> MICHAEL BACH: Good morning, everybody. The Forum on Information and Democracy is the implementing entity of a global process called the International Partnership on Information and Democracy.

The partnership started a few years ago with about a dozen democratic states. We've now reached 51 countries that have signed on; Brazil was the 51st, at the end of August. The forum, as the implementing entity, undertakes work at the interface of research and policy. One of those areas is convening working groups on specifically targeted issues to develop recommendations for states, Civil Society, and other actors advocating for the positive impact of technology on our democratic institutions and the information ecosystem.

Late last year, sorry, earlier this year, we published a report from a working group on pluralism of information in curation and indexation algorithms.

That title is very difficult to roll off the tongue, but essentially the group was comprised of renowned experts from across the world, drawn from networks of Civil Society and academia, who drove the writing of the report and the development of the recommendations, really looking at how to improve the pluralism of the news that we see in our online spaces.

More specifically, they looked at how to give us more control in enhancing the quality and pluralism of the news that is served to us on online services, to give us more transparency and control over how our personal information is used to deliver that content to us.

And to pave the way for more decentralized approaches that may differ from the sort of dominant models that we see in the marketplace.

When a report like that is delivered, we then work through the state signatories, regulatory agencies, people who influence policymakers, and NGOs throughout our networks around the world to influence the kinds of policies that are being developed.

And just a few weeks ago we launched our next policy working group cycle, on artificial intelligence and its impact on our democratic institutions and the information ecosystem. That's moving forward, and a report will be issued in the middle of next year, I think.

Another part of our team, very quickly, works at gathering evidence under the umbrella of the Observatory on Information and Democracy. We just launched a steering committee, of which Courtney is a member. We've only met online, so awkwardly this is the first time we're seeing each other in person. The observatory drives an effort to obtain a common understanding of the state of knowledge, of what we understand about the impacts of technology on democracy and the information ecosystem.

The steering committee will be driving a process of several working groups of which one is around media and the digital age.

That will be a meta‑analysis, delivered towards the end of next year, and we'll be seeking contributions from NGOs with a research profile and from academics.

In terms of suggestions for research areas and things that we need to do, one theme that's really important to me is to ensure that we continue, and do more, to ensure diverse voices, experiences, backgrounds, and disciplines make their way into our policy discussions. It's not enough that these take place in the North, in the typical research centres that we think about. I have spent most of my life in the South, in Southeast Asia. To me, bringing those perspectives to the research centres is particularly important.

And that's a priority for our organization.

So I thank you very much.

>> WAQAS NAEEM: Thank you, Michael. We will now move online to Sabhanaz, who works on human rights and AI at a policy institute. If you can, please share your contribution.

>> SABHANAZ RASHID DIYA: We work to reduce technology gaps. We're powered by human rights professionals who collectively have years of insider knowledge from leadership roles in the largest tech companies and government agencies around the world. Our policy research currently focuses on disinformation through a Global South lens.

I agree a lot with what Courtney mentioned. Our consistent finding across the different bodies of work we do is the discrepancies and systemic barriers in how platforms organize and provide access to data for researchers and journalists in the Global South. If you look at the publicly disclosed records of the largest U.S. tech platforms, close to 80% of all research funding and data access has been provided to elite institutions in North America, Western Europe, and Australia, despite the fact that they represent less than 40% of the world's internet population. This creates an asymmetry that prevents local journalists, researchers, and academics from understanding technology's impact on society, and it ends up reinforcing structural biases against underrepresented communities in these markets. This particularly impacts media. I remember a particular instance during the pandemic when journalists from Bangladesh, Pakistan, and Nepal were denied access to Facebook's data, which prevented them from studying how media would survive during the pandemic and what policies could have countered the negative information ecosystem. So it creates a lot of challenges across multiple layers.

In this forum we talk about regulatory frameworks, and I think Courtney talked about how they're weaponized in many instances. I want to talk about the EU Digital Services Act. Its regulations on data sharing are there to encourage accountability, and there seems to be energy to position it as a gold standard globally. However, it raises critical questions about whether the same framework can be applied in other parts of the world. Seven out of ten countries are deemed to have hostile and oppressive media environments, which means their journalists and media professionals often turn to social and alternative media platforms to express dissent. If such a model is taken around the world, it raises questions, against this backdrop, about whether it would work in different kinds of contexts, and whether we want more government sanction or scrutiny of user or journalist access to data, particularly given our experience with cyber protection laws in many Global South markets. It creates questions around who will have access and how states can be held accountable on fairness and neutrality criteria. And how would we reach such communities in less mature democracies? Therefore, I think our work looking at these questions is important: what kinds of access models would work in Global South contexts? How would they navigate less mature democracies, and what procedural safeguards should be applied when we think about access in different kinds of contexts and jurisdictions? We recommend this for further research. There's a lot of focus these days on solutions and what should be done; our focus is largely on how it should be done.

We believe that building procedural safeguards and more evaluation into the processes of access, as opposed to focusing on what the data is actually saying, will make access a lot more meaningful, neutral, and less politicized when different journalists and media organizations access the data.

And when broad provisions are copied from other jurisdictions, such safeguards will actually provide guardrails in terms of who will get access and whether it's actually serving the interests of journalists and media communities in these markets.

Thank you.

>> WAQAS NAEEM: Thank you. Important points there. Next we have Dan Ramirez on the list, from the centre for freedom of expression in Argentina.

>> DAN RAMIREZ: Thank you very much, everybody.

We've been looking at the issue of media viability, and we started with a basic concern for freedom of expression and the need to have quality journalism as a building block of democratic society.

>> AUDIENCE: We started just trying to learn where the conversation was going. We tried to understand the situation in Latin America. What we basically found was a series of policy areas relevant to the issue of media viability or sustainability, and in Latin America in particular a couple of areas are very important. State subsidies in Latin America are often passed in the form of public advertising, and public advertising is a huge source of revenue for legacy media companies. Government‑run media is also a huge part of the conversation.

The problem that we usually have in Latin America is that government‑run media is not independent. It's part of the government and is used as a sort of partisan tool.

And the other specific issue we found is that across countries, ethical shortcomings in journalism were a huge matter of concern.

And what we found is that all these policy areas were not talking to each other. They're siloed conversations that have really failed to produce consistent policy proposals. What we have identified as a need for future research, advocacy, and engagement is the need to build bridges between those different silos.

In that sense, we started with a hypothesis: perhaps a regional conversation should take place, but because the problems are very specific to every single country, we thought the best way to do it was to produce conversations first at the national level and later move up the ladder to achieve some kind of regional conversation.

But one of the biggest challenges we found is that all these different siloed conversations are crossed by deep and pervasive disagreements between people who don't agree on where the solutions should come from.

Scholars working on media studies, for instance, are not usually willing to speak to media owners. Media owners think that community radio operators are a fringe group who should not be included in the conversation. So this requires policy work, or politics, that's hard to achieve. In our initial conversations, people in different countries basically agreed that our diagnosis is correct. So that's where our future research is going to go.

I do want to mention, because it came up during our conversation today, that we're also working on data access for researchers, because obviously it is a huge part of this conversation. We believe that the DSA offers opportunities, and we're starting to build the case for extending that access to non‑European Union members. That is difficult; it has its challenges. But we believe there's an interest in the Global South in doing that, and we're trying to be part of that conversation as well.

>> DANIEL O'MALEY: Great. Thank you. That actually wraps up our contributions from our members.

So we're now going to open up the floor, either to questions for the presenters from the first part of the session or to something that another member wants to say. So if you just want to come up to the mic, and maybe get in line, then we'll do that.

Also, just to plug: we'll have another annual report next year. We've heard a lot of interesting topics, so we'll have a call for submissions coming out in the next couple of months.

>> AUDIENCE: What a great contribution you guys are making. Thank you. I love the issues especially ‑‑

>> DANIEL O'MALEY: Did you introduce yourself?

>> AUDIENCE: Sagi, an attorney; I work on policy and with businesses in Pakistan. Thank you. Number one, I'll talk very quickly about secret take‑down orders. Think about it: when the government makes a request to take something down, where does it get recorded, and how do people know? It doesn't get recorded. There are transparency reports the tech companies do publish, but they don't tell you who requested it and what happened. That's number one.

The second thing: I think we're all concerned about ChatGPT, and we're saying, AI, god, you shouldn't put this out there; it's dangerous, and we want principles against it. But when the EU put out GDPR and it's being implemented in countries where they're now saying the Data Protection Authorities should be under the ministry of security? I'm worried. Where is the EU to explain to people that this is not about protecting data, it's about protecting privacy: protecting citizens' data against the government, not protecting the government's data. Thank you very much.

>> COURTNEY RADSCH: If I can respond briefly, the Christchurch Call process is advocating for governments to report on take‑down requests for violent extremism.

Such requests have also been used to try to take down investigative journalism and the databases journalists are using to do that work. So thanks for that.

>> AUDIENCE: Hello. This is Handa Lasu from Turkey's observatory. Firstly, a bit of product placement. We create creative methods to generate data when it's not available, and through our research we've identified findings such as Facebook's advertisements being microtargeted roughly 90% to men. We have all these findings that intersect with the issues we've talked about. We've also identified phishing operations created with AI, in which governmental officials are falsely represented, affecting 2 million Turkish citizens, and a lack of governmental action in taking them down. So much is going on. Turkey can be a great space to identify these problems but also to generate great news media. And we have an empowerment programme which tries to address the problems that you pointed out in terms of empowering media. As ex‑Google employees, we also know how search engine optimization works. So we have this new programme.

In regard to the academic research areas that you were asking for, I was wondering if you're interested in investigating how big tech companies coordinate and comply with authoritarian governments? What does that look like? How do you think you can generate data on things that happen behind closed doors? And what could the potential impact be? I think this will be especially important considering that there are more than 50 elections, I think, coming up next year. And finally, we're also going to launch an election monitoring workshop for other investigators who would like to monitor their own elections, since we just had one last year. And thank you so much to all of the speakers. It was great to listen to. Thank you.

>> COURTNEY RADSCH: Definitely, I'm sure all of us are interested in that. I would encourage you to come and give us your card so we can share your information online, at least for your organization.

>> AUDIENCE: Sure.

>> AUDIENCE: Hi. It's Duba from Georgia. I work at Forcep. We help Civil Society with open data solutions. We work in Central Asian countries to boost their data collection, data analysis, and visualization skills. And within one minute I would like to reflect on the relationship between media and open data that we talked a lot about today.

We talked about making data accessible, but I think another issue and question is what we actually do with this data once it's accessible, once it's open, right? We have worked with hundreds of journalists in these countries, and they say that newsrooms are way too dynamic to afford spending two weeks on data investigations and data‑driven stories. They can barely afford it because they are not financially sustainable. Right?

And another issue is that in our target countries, state authorities are disclosing less and less data, so journalists can't make sense of what's happening in the country.

So in response to this, Forcep, with other CSOs, started developing tech solutions that scrape data from websites, news agencies, and social media platforms, so that if the state is not giving journalists data, we can. But now what we see is that our tech solutions are losing value because Twitter's data is gone; the API is closed. We have issues with Facebook's API as well, because it's limited more almost every year. We have issues with TikTok's data as well. I don't like to be a drama queen, but now we have generative AI coming in too. So I think we should of course talk about making data accessible and invest in this, but we should also invest in actually utilising this data for everyday work in media. Thank you.

>> DANIEL O'MALEY: I think that's a great point about data. There's so much data; what kinds of organizations can use it? We probably need to think about it from an ecosystem standpoint: what can media support organizations do as well? Yes, next question?

>> AUDIENCE: I'm Jenna Fung, an amateur policy researcher based in Toronto, Canada. I want to bring up one thing probably covered in one of these research projects already, but I hope the audience gets a basic understanding of the Online News Act in Canada, which requires tech companies like Google and Meta to pay news outlets for posting or linking their content. If I remember correctly, in August 2023 Meta responded, so as to comply with the law, by removing news completely from Facebook and Instagram, and that isn't just domestic news but international news. I am originally from Hong Kong. A lot of news outlets there closed down due to the political tension in Hong Kong over the past couple of years, and I moved to a country where I could access this freedom. So what should be in place to make sure that we can balance out the power that big tech controls? Because they are regulating this privately owned public space that we live on, and we need to make sure their policies are human‑centric, don't jeopardize consumers' right to access information, and ensure the sustainability of the media ecosystem.

And, taking this as an example, how will other jurisdictions with even less comprehensive legal frameworks deal with this kind of backlash from such legislation, and the tension between big tech and government, and ensure that the interests of consumers and the media industry are truly considered and integrated? Thank you.

>> COURTNEY RADSCH: I'll also just note that it's not that Meta can't comply; it's choosing not to pay. It's not a lack of ability to comply. Thank you for that.

>> AUDIENCE: Thank you, everyone. I'm Claire Mohindo from Uganda, working with the African Centre for Media Excellence, a media support organization. We did a study on biometric digital identity programmes and how they've affected media freedom in Uganda. The biggest trouble with collecting data for the study was that the journalists and media owners we interviewed had no idea about the digital biometric identity programmes. While collecting the data, you literally have to explain everything for them to give you the responses you're looking for. We're here to pick up lessons from people working in the Global South: what lessons can we take from you, and what contributions can we walk away with on how you navigate such issues? Especially seeing that in our context people are navigating different issues, issues of media viability and sustainability, and then there's all of this that you have to sort of catch up on, like Julius said.

There's just so much to deal with. The media houses are struggling, and there are all these issues affecting them that they don't even realize are affecting them. Some of our findings showed that, even after we explained what it's all about, that all this mass registration is going on, that the government is storing our data, and that there is ongoing surveillance of journalists, people were still not aware. They're receiving anonymous calls from people threatening them and telling them "you can't run this." They wonder, where did you get this data from? But they did not realize where it came from. So we're here to take lessons from everyone working in the Global South.

>> COURTNEY RADSCH: We're at time, so I'll take a couple of minutes to try to summarize what has been a truly incredible conversation, one that has spanned the entire globe and illustrated why this conversation is needed at the Internet Governance Forum, a global multistakeholder space. We have addressed today the individual‑level issues, whether for journalists, news outlets, or individual organizations: the need to build capacity, to figure out how to transition to the age of AI, while understanding that they have a constrained choice set. They are bound by a logic, by technological infrastructure, by policy infrastructures, over which they often have very little influence or control.

We talked a lot about the global influence of national, or in the case of the EU supranational, policies. You may be an organization working in a developing or low‑income country, but you are forced to comply with rules that are not of your own making, and to figure out how to build an audience, monetize, and create viability and sustainability under them, based on a logic you have no control over and often no ability to even seek remedy against, as we heard in the case of content moderation: the structural inequalities between journalists and media organizations across the world in terms of their ability to obtain remedy. We heard about the importance of data on multiple layers. What data do we need? How do we get it? And what do we do with it once we have it? How can we come up with a global, maybe coordinated, agenda? We heard about the global principles for fair compensation, a similar effort to see, globally, where there is consensus on general principles. I think we need to do that for data; I think we're hearing a call for this. What's the common agenda for the types of data we need? How can we leverage institutions or existing mechanisms like the DSA, for both European and non‑European entities? Side note: as an American, we also don't have any model for that, so we're interested in looking at it.

But then what do we do with that data, as we heard from one of our speakers? Once we have access, how do we, at the media organization level, turn that into improved policies that improve the safety of journalists? How do we improve monetization? What do we do with that data to make it useful? We heard about the importance of working globally with local knowledge, local input, and empirically grounded input, and especially about ensuring that those of us who work in the Global North, or have access to the seats where policies are being made and discussed, represent and create opportunities for engagement and access by those who don't typically get a seat at the table or are not offered a space at the table. I will say that's one of the things I'm really proud of about this Dynamic Coalition and about all of the people in this room and online who are doing that. That is a fundamental core commitment of this Dynamic Coalition. Yes, we do talk about EU and U.S. policies a lot because of their constraining impact, but we are also very interested in what we heard about Uganda and about what is happening in Liberia. As the Internet Governance Forum conveys, there are local issues with global implications and global issues with local implications. So I would invite everyone to please consider joining the Internet Governance Forum Dynamic Coalition on news media and sustainability.

Join our mailing list. Use the mailing list. We hold learning calls, as you heard from Dan. We'll be organizing those over the coming year, and we'll be holding a meeting in the next couple of months to determine what the priorities for the coming year should be. Undoubtedly they will cover generative AI. You already heard about some of the issues we need to think about: how does media development adapt, how do news organizations adapt, and how do we think about AI governance and its implications for news media? I invite you to our session at 1:30, where we'll talk about media governance for the global majority. With that, I thank everyone. Thank you to everyone online. Thank you to Waqas. Thank you to all the presenters, and thank you to my co‑coordinator Dan.

>> DANIEL O'MALEY: Thank you, Courtney. There are a couple of other people we need to thank, those doing the behind‑the‑scenes work: the people who are transcribing our session, the people doing the video recording, and also those who gave us great assistance in the room. We had technical issues that were not caused by them at all, but they really helped us make sure we had a session that was excellent. Thanks for all that background support. And Laura Becana Ball, our Secretariat, you can see her in the middle, she's waving, did a lot of background work to make these things happen. They don't happen out of thin air. We thank you for your hard work and for waking up in the middle of the night in Spain for this session. Thank you all. If you want to connect with us, I'll be standing out in the lobby here and would love to connect. Thank you.

>> WAQAS NAEEM: Thank you, everyone. Have a great day. Bye.

[Applause]