Human Rights & Freedoms
Digital Technologies and Rights to Health
Non-discrimination in the Digital Space
Rights to Access and Information
Catherine Easdon. European Cybersecurity Fellow at European Cyber Conflict Research Initiative (ECCRI); Internet Society Early Career Fellow; Privacy Engineer at Dynatrace. Participating as a member of the technical community. WEOG (Austrian resident).
Ryan Payne. Researcher in Biometric Privacy at Queensland University of Technology; Ryan Payne Design (fashion consulting business); Internet Society Early Career Fellow. Participating as an academic. WEOG (Australian resident).
Catherine Easdon, Ryan Payne
Targets: This talk is primarily aligned with SDG 9, in particular targets 9.1 and 9.c. Software now plays a critical role in providing access to and ensuring the quality, reliability, sustainability, and resilience of our global infrastructure. From our physical infrastructure (for water, energy, and transport, for example) through to our financial markets and even the political and social institutions that shape our societies, software determines what happens, when, and where information flows. It is therefore essential for digital inclusion that rights protections are built into software to ensure equitable access to infrastructure and services. In this talk, we will present how software can be built to meet the varying privacy needs of users around the world, and how such privacy protections lay the foundation for protecting users’ human rights. Without such protections, we cannot achieve universal access to the Internet (SDG target 9.c) and equal rights of access to enabling technology and services (SDG targets 1.4 and 5.b), because Internet surveillance will continue to pose a threat to marginalized groups.
20-minute presentation with interactive Q&A + 10-minute discussion
This Lightning Talk introduces how rights protections can be built into software products by design, with a particular focus on privacy protections and why they lay the foundation for protecting users’ human rights and civil liberties. Such protections are a key building block in our efforts to achieve universal access to the Internet and equal rights of access to technology and services. We will discuss how the contextual privacy needs of users around the world differ and provide concrete examples of design choices and technical changes that can be made to protect these users. Incorporating ‘rights by design’ is an act of translation from the social, legal, and political realm into code and digital infrastructure. We therefore welcome participants from all backgrounds to join the discussion and contribute to our understanding of how we can best protect the rights of all in software. There will be a 20-minute presentation with interactive Q&A, followed by 10 minutes of discussion.
The presented slides are available at https://www.cattius.com/images/rights-by-design.pdf.
Conducting a human rights impact assessment is crucial when developing a new product, feature, or tech policy. As privacy is an enabling right for many other rights, potential privacy harms for individuals are likely to also have human rights implications. These privacy harms can be assessed using established frameworks and taxonomies (Microsoft Harms Modeling Framework, LINDDUN, PLOT4AI, Citron & Solove's privacy harms taxonomy, etc.).
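To make this concrete, the LINDDUN framework mentioned above organizes privacy threats into seven categories (Linkability, Identifiability, Non-repudiation, Detectability, Disclosure of information, Unawareness, and Non-compliance). As an illustrative sketch of our own (not material from the talk), even a very small data structure can help a team record LINDDUN-style threats per data flow and track which remain unmitigated:

```python
# Illustrative sketch: recording LINDDUN-style privacy threats
# against a system's data flows. The DataFlow/PrivacyThreat names
# and the example telemetry flow are hypothetical.
from dataclasses import dataclass, field

# The seven LINDDUN privacy threat categories.
LINDDUN_CATEGORIES = (
    "Linkability",
    "Identifiability",
    "Non-repudiation",
    "Detectability",
    "Disclosure of information",
    "Unawareness",
    "Non-compliance",
)

@dataclass
class PrivacyThreat:
    category: str
    description: str
    mitigation: str = "unmitigated"

    def __post_init__(self):
        # Reject categories outside the LINDDUN taxonomy.
        if self.category not in LINDDUN_CATEGORIES:
            raise ValueError(f"Unknown LINDDUN category: {self.category}")

@dataclass
class DataFlow:
    name: str
    threats: list = field(default_factory=list)

    def unmitigated(self):
        # Threats still awaiting a design change or control.
        return [t for t in self.threats if t.mitigation == "unmitigated"]

# Example: modeling a (hypothetical) login telemetry flow.
flow = DataFlow("login telemetry")
flow.threats.append(PrivacyThreat(
    "Identifiability",
    "Raw IP addresses stored in analytics logs",
))
flow.threats.append(PrivacyThreat(
    "Linkability",
    "Session IDs allow linking events across services",
    mitigation="rotate session IDs per service",
))
print([t.category for t in flow.unmitigated()])  # ['Identifiability']
```

In practice such a record would feed into the human rights impact assessment: each unmitigated threat is a prompt to ask which rights are affected and what design change would remove the harm.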
We call on both technology companies and policymakers to introduce ethics and threat modeling training into their organizations. Every individual involved in the product or policymaking process should reflect on their own moral compass, on the privacy harms experienced by others, and on their "red lines" (what they would refuse to build or legislate), in order to develop the moral imagination necessary to assess the ethical impact of their work.
The aim of this Lightning Talk session was to introduce privacy as an enabling right and how rights protections can be built into technology products and policymaking (“Rights by Design”). An additional goal was to present privacy from a range of perspectives to encourage attendees to challenge their preconceptions about privacy. Based on the feedback received from attendees, the session successfully achieved both aims for the small but highly engaged audience on-site. We were delighted to hear from attendees with experience in privacy and data protection that they had gained new insights from the session.
The session consisted of a 20-minute presentation followed by a Q&A and discussion. We will briefly recap the presentation content here; for the complete slides, please see the link in the session description. To begin, the speakers asked attendees whether they thought privacy mattered, discussed examples of why privacy is important, and defined privacy to ensure everyone was aligned on the concept for a productive discussion. They next outlined the privacy rights granted to individuals under data protection law and discussed how digital privacy is a crucial enabler for other fundamental rights, including examples of how these rights contribute toward achieving the Sustainable Development Goals (SDGs). This message was reinforced by a case study of how biometric privacy - or lack thereof - impacts individuals’ rights. After defining privacy engineering and privacy by design with examples from architecture and software, the speakers introduced Rights by Design as a necessary extension of the Privacy by Design principles. Finally, the presentation concluded with proposals of how Rights by Design could be applied to technology, both in the corporate world and in policymaking.
These proposals prompted a lively discussion after the presentation. As this was the last session of the day in Speaker’s Corner, it was possible to continue the discussion well beyond the planned 10 minutes. Attendees’ contributions focused on the presentation’s two calls to action: to introduce ethics and threat modeling training into organizations working with technology, and to ensure that human rights impact assessments are conducted when developing new technology products, features, or policies. Attendees raised concerns about how feasible ethics training would be in a multicultural organization, given that beliefs about what is right and wrong vary widely, and whether such training should (or even could) be mandated through regulation. The speakers clarified that the goal of such training is not to instill specific values but rather to raise individuals’ ethical awareness, encouraging them to develop their own moral compass and establish personal ethical red lines, for example which product use cases they would refuse to develop software for on moral grounds. On making such training a regulatory requirement, the discussion reached a consensus that this would likely be less effective than multi-stakeholder engagement to make such training a market norm. A market norm, however, would only reach companies, whereas such training is also essential in policymaking fora.
There was considerable interest in the session topic before the session, which led to insightful preliminary discussions on Days 1 and 2 of the conference. We would like to thank all involved - both session attendees and those who joined for discussions beforehand - for their valuable contributions. In particular, we would like to thank the Design Beyond Deception project team from the Pranava Institute, who introduced their research initiative investigating deceptive UI/UX design and educating designers in ethical, human-centered design practices. The team contributed training materials to share with session attendees and with the speakers’ communities after the conference, and described their research into how deceptive design practices differ in the Global South. Together we discussed how such deceptive designs target the highly vulnerable, such as impoverished communities without access to traditional financial services, and how such exploitation might be prevented through legislative and educational efforts.
It was unfortunately not possible to conduct the session in a hybrid format as planned, as the room had no camera or microphone to stream the session. Due to confusion about the assigned room (the talk temporarily had no room assigned in the schedule), we found out too late to make our own recording arrangements. We therefore have no feedback from remote participants to report. However, we have shared the slides online and will open discussions within our communities about the topic. If organizing a session in the future, we would confirm the room assignment and available equipment on the first day of the conference to ensure sufficient time to arrange a hybrid format. This was a key learning for us as session organizers, as we would like to ensure that stakeholders unable to attend the IGF in person can still make a full contribution.