- Session Type (Workshop, Open Forum, etc.): Workshop
- Title: Who is in charge? Accountability for algorithms on platforms
- Date & Time: 12 November, 12.10-13.40
- Organizer(s): Kristina Olausson, ETNO; Lorena Jaume-Palasi, the Ethical Tech Society; Pablo Bello, ASIET; Andrés Sastre, ASIET
- Chair/Moderator: Gonzalo Lopez Barajas, Telefonica
- Rapporteur/Notetaker: Kristina Olausson, ETNO
- List of speakers and their institutional affiliations (Indicate male/female/ transgender male/ transgender female/gender variant/prefer not to answer):
Speaker 1: Oscar Martín González, male, Public Sector, Latin American and Caribbean Group (GRULAC)
Speaker 2: Lorena Jaume-Palasí, female, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Fanny Hedvigi, female, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Pascal Bekono, male, Government, African Group
Speaker 5: Phillip Malloch, male, Private Sector, Western European and Others Group (WEOG)
- Theme (as listed here): Development, Innovation & Economic Issues
- Subtheme (as listed here): INTERNET MARKETS - TELCOS, INTERNET SERVICE PROVIDERS, COMPETITION
- Please state no more than three (3) key messages of the discussion. [150 words or less]
1. The use of algorithms has become increasingly common not only in the private but also in the public sector. There is a clear benefit in terms of assessing large amounts of data. However, challenges such as access to data, differences in legal frameworks and the impact of the mathematical models behind algorithms on the freedom and autonomy of the individual also need to be addressed.
2. Given rapid technological development, we need to ask how well current legislative frameworks are adapted to address human rights when automated decision-assisting systems are used.
3. As the discussions are still at an early stage, the multi-stakeholder model can be used to map and identify risks and challenges and to increase the exchange between different regions.
- Please elaborate on the discussion held, specifically on areas of agreement and divergence. [150 words]
There was overall agreement that “transparency” and “explainability” are two different issues. While transparency was seen as key to ensuring accountability of algorithms, not all actors in the session saw it as sufficient and demanded more active participation by the persons whose data are used in the process.
The session noted that governments and the private sector play an important role in ensuring human rights and ethical principles. There was also broad agreement among the session participants that it is too early to regulate algorithms on platforms and that the current human rights framework is sufficient. However, some participants noted a lack of knowledge about how individuals are impacted.
The private sector took a more positive outlook, looking not only at the challenges but also at the opportunities of algorithms. Algorithms can be an important tool for addressing the SDGs by providing efficiency gains, enabling better analysis of data and creating value for individuals. Companies compete on user trust: convergence and globalization have brought a lot of competition, and users care not only about price and quality but also about whether a brand is trustworthy. These values will help ensure that companies continue to uphold human rights.
- Please describe any policy recommendations or suggestions regarding the way forward/potential next steps.
Start by setting standards for technology and designing technology according to those standards. It should be made clearer what is meant by “responsibility” and “accountability”, as these legal concepts have different meanings in different regions. The human rights framework is sufficient to address the current issues with algorithms, but it should be extrapolated to this new context. This will be key to ensuring trust from users of platforms. Therefore, users should be engaged and consulted when issues addressed by algorithms impact them. Finally, we should not rush into regulation that could hamper innovation.
- What ideas surfaced in the discussion with respect to how the IGF ecosystem might make progress on this issue? [75 words]
The IGF should continue to serve as a forum for the exchange of information and best practices. While the participants concluded that a broad set of stakeholders should remain engaged, their specific roles need to be further discussed. Some of the concrete topics that need continued attention are:
- how can we make the use of algorithms genuinely understandable for all the people involved?
- how can we reconcile transparency with people's intellectual property rights in the private and commercial space?
- what is the role of a government actor, the private sector and others?
- Please estimate the total number of participants.
Approximately 70 people.
- Please estimate the total number of women and gender-variant individuals present.
About 40-45 women.
- To what extent did the session discuss gender issues, and if to any extent, what was the discussion? [100 words]
The session discussed how algorithms can both reinforce and uncover bias in society; gender bias is one example. It was noted that algorithms should therefore be transparent and based on a legal system that respects human rights.