Panel - Auditorium - 60 Min
In the wake of COVID-19, an erroneous dialogue has emerged in multiple governmental and global settings positing a tension between privacy and protection in stemming the spread of the coronavirus. However, where it is assumed that citizens must surrender location, biometric, or medical data to governmental or advertising-based surveillance to combat the spread of COVID-19, key human rights may be violated in the name of what is "best for society" as we move beyond this crisis.
It is time for the advent of Personal Sovereignty -- when the digital environment serves to enhance the interests of humans and their meaningful groupings. By creating tools for citizens, such as data governance frameworks and machine-readable privacy terms for all, society can evolve the logic of a physical passport into a digital framework that places people at the center of their data. Beginning with children, to strengthen the GDPR and Privacy by Design-focused legislation, the creation of such tools allows individuals (or their caregivers) to better understand and influence the collection and use of their (or their children's) data, as well as to access, meaningfully curate, and share their data as they choose. While people may still be tracked by advertising- or government-surveillance-oriented tools, Personal Sovereignty gives all humans a digital voice at an algorithmic level, so they can face the future as empowered and proactive participants in digital democracy.
This proposed Open Forum session will provide an introduction to the mature and extended suite of currently available technologies, communities and standards that can be used to empower Personal Sovereignty to become ubiquitous in the age of the algorithm.
Citing examples from IEEE's Digital Inclusion, Identity, Trust, and Agency program -- including the work of many of IEEE's volunteers, collaborators, and contributing organizations addressing COVID-19 and beyond -- and recommendations from The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems, this session will also introduce current efforts to protect children's data and create trustworthy experiences. It will feature discussion on the nature of data, Artificial Intelligence, and ethics, as well as insights into the human side of broad data collection, sharing, and use, and the consequences of failing to consider diverse users. An interactive Q&A among the lead discussants and the participants, centering on trust, will gather diverse perspectives on what the future of trust in the algorithmic age looks like.
Mr. John Havens, IEEE
Ms. Moira Patterson, IEEE
Dr. Salma Abbasi, eWorldwide Group
Moira Patterson, IEEE
Constance Weise, IEEE
Kristin Little, IEEE
GOAL 3: Good Health and Well-Being
GOAL 10: Reduced Inequalities
GOAL 17: Partnerships for the Goals
The panelists agreed on the following:
- Data is a commodity generated by, and therefore owned by, the individual. As such, we have to demand our rights. Trust has to be earned.
- Enforceable laws are needed, and the general public has to think about how it will give its data to companies. People need to know what is happening to their data, and governments need to protect the people.
- A collective effort by governments, the private sector, civil society, and the technical community is needed to achieve personal sovereignty.
- Data must be seen and owned by us and used with our permission, supported by enforceable laws to help us.
- Standards can play a critical role in scaling solutions, including empowering people through digital literacy frameworks that equip them with the necessary skills.
- Human dignity needs to be at the core of our thinking whereby the technology should serve people's needs and their communities.
- It is possible for companies to build customer trust within a model of data sovereignty.
- Consumer data use that leaves out individuals who do not fit into set profiles is a concern.
- IEEE and IGF are excellent fora in which to discuss the topic of child online protection.
- All actors, including governments, the private sector, the technical community, and civil society, must work together collaboratively to create tools for citizens, such as data governance frameworks and machine readable privacy terms for all, to place citizens at the center of their data and to empower them to advocate for their personal sovereignty.
- AI is being used to measure trends for business, but analysis of trends in health or humanitarian issues will not happen unless driven by citizens; child online protection is one such issue.
- IEEE helps to educate about the crucial role of standards in helping to create these ecosystems and tools for citizens: Standards are building blocks that can make best practices more accessible to all actors in society.
- Currently available technologies, along with related IEEE communities and standards, can be used to empower Personal Sovereignty to become ubiquitous in the age of the algorithm.
- John C. Havens, IEEE
- Dr. Salma Abbasi, eWorldwide Group
- Moira Patterson, IEEE
The IEEE Open Forum “Personal Sovereignty: Digital Trust in the Algorithmic Age” (#42) did not discuss gender issues as the focus was on digital trust and personal sovereignty.
- Webpage: https://standards.ieee.org/events/2020/igf-2020.html
- Blog: https://docs.google.com/document/d/1xPKQhUwpfy-15xrXlkAcK8Tklh4K20GlBKpaOgCx-20/edit#
- IGF Report: Internal link: https://docs.google.com/document/d/1eBRROlCRNTbLxOqzYdsud7-ilLn7jssJkECdJ8e38TQ/edit?ts=5fada779
- This IGF Session Report