Cybersecurity, Trust and Privacy

IGF 2018 LIGHTNING SESSION #8 Always-on and listening: a talk about digital assistants

Subtheme: INTERNET OF THINGS

 

Presenter

Name: Luã Fergus Cruz

Organization: Center for Technology and Society at FGV Law School

Country where Organization is based: Brazil

Stakeholder Group: Civil Society

Regional Group: Latin American and Caribbean Group (GRULAC)

Short Description

IGF 2018 BPF Cybersecurity

2018 Best Practices on Cybersecurity

Wednesday, 14 November, 10:10-11:40 CET, Salle XII

Co-moderators: Markus Kummer (Internet governance & policy consultant) and Kaja Ciglic (Microsoft)

 

Format: 

1. Introduction by the co-moderators (5 minutes)

2. Run-through of this year’s BPF output by Wim Degezelle, BPF Cybersecurity consultant (10 minutes)

IGF MAIN SESSION ON CYBERSECURITY, TRUST & PRIVACY

Cybersecurity and privacy practices that can build trust and ensure growth and prosperity for all

Tuesday, 13 November, 10:00-11:20 (80 minutes), Salle I

IGF 2018 DC Internet of Things: Global Good Practice in IoT: a Call for Commitment

The focus of this year's Open Workshop of the Dynamic Coalition on the IoT is twofold:

  1. To discuss how to increase IoT security from a global good practice, multistakeholder perspective;
  2. To agree on a Statement that invites explicit support for the DC IoT Good Practice Paper.

IGF 2018 LIGHTNING SESSION #16 Convention 108+ in the Digital Era

Theme: Cybersecurity, Trust and Privacy

Subtheme: Data Privacy & Protection

Short Description

IGF 2018 LIGHTNING SESSION #19 Improving the Security of Consumer IoT: A New Code of Practice

As we connect more devices in our homes to the internet, the cybersecurity of these products is now as important as the physical security of our homes.

IGF 2018 WS #421 Algorithmic transparency and the right to explanation

How do individuals seek recourse when they are affected by automated decisions? What are the implications for justice when automated decision-making, such as Artificial Intelligence (AI), Machine Learning (ML), Deep Learning (DL), or an automated script or piece of software, is involved in making or influencing a decision that has a legal or similarly significant effect on a person? Under the EU General Data Protection Regulation (GDPR), individuals or “data subjects” have a “right to explanation” with regard to the reasons behind automated decisions that could significantly affect them. This right arises from a combination of rights afforded to data subjects under the GDPR; in particular, Article 22 states that “the data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. Article 22 is further interpreted by the Article 29 Data Protection Working Party in its Guidelines on Automated Decision-making.

Issues discussed will include:

  1. Algorithmic bias: if algorithms affect our lives, it is important that they are impartial, free of bias, and transparent and understandable.
  2. Algorithmic transparency and the right to explanation: when people are affected by algorithms, there must be an ability to explain why an algorithm has made a decision. How is this achieved in reality when the effects of code are hard to understand, and much automation happens behind proprietary “black boxes” of obscured code?

 

Alex Comninos, Independent Researcher, Civil Society

Imane Bello, Lecturer and Researcher, Sciences Po, Academia

Lorena Jaume-Palasi, Ethical Tech Society, Civil Society

Chinmayi Arun, Assistant Professor of Law at National Law University Delhi, Academia

Joy Liddicoat, University of Otago, Academia

Karen Reilly, Independent, Business and Technical Community
