The right to privacy in the digital world
7 Mar 2019 01:00h
Event report
The side event of the 40th session of the Human Rights Council was organised by the Organisation Internationale pour la Sécurité des Transactions Electroniques (OISTE Foundation). The session was moderated by Mr Carlos Moreira (OISTE Foundation), who explained that the discussions would focus on privacy issues and on the ways human rights could be protected in the digital sphere. He noted that the scope of privacy issues is generally not well understood by laypeople and that technology, and encryption in particular, could provide solutions. He pointed out that digital identities are about much more than credit card numbers or any other specific piece of information: they reflect our digital selves through the gathering of personally identifiable information (PII). Finally, he asked the panellists to define the rights most commonly violated through the denial of online privacy and whether they saw any possible solutions.
Ms Roxana Radu (Graduate Institute of International Studies) highlighted that there are a number of rights at stake in the discussion about digital privacy and that the debate must be tackled in a comprehensive way. She explained that privacy issues have severe consequences for other human rights and that these challenges cannot be addressed without tackling other human rights aspects at the same time. Radu illustrated this fact by explaining that there had been cases in which leaked information led to the killing of people.
The researcher also mentioned the ease with which people can be identified through a variety of readily available data. This, in turn, further exposes vulnerable populations to human rights violations and puts them at risk of being excluded from enjoying the right to education and freedom of speech, among others. She explained that issues with opt-in and opt-out applications are very important to consider, given that people often do not understand what they are signing up for and how their data will be used. Therefore, they cannot provide informed consent.
In terms of artificial intelligence (AI) and the risks it might create, she noted that the technology will both deepen existing privacy issues and create new challenges due to its scale and impact on every aspect of our societies. AI relies on vast amounts of data, and the distinctions between personal and non-personal data are often blurred and unclear. Given the amount of personal data being collected about us, she noted that we are at risk of being constantly tracked and living in a mass surveillance society. According to Radu, privacy-by-design measures should be incorporated into new technologies, as they are a promising way to better protect the right to privacy. Additionally, she supported the enforcement of the data minimisation principle to limit the amount and type of data collected about users.
Ms Wafa Ben-Hassine (Access Now) spoke about the fact that governments worldwide are increasingly adopting digital identification (ID) programmes. They often do so by invoking sustainable development goal (SDG) target 16.9, which states that governments will provide legal identity for all by 2030. However, Ben-Hassine emphasised that the SDG only mentions legal, not digital, identification. That distinction is relevant because, while digital ID can improve the delivery of state services, it raises serious privacy concerns and puts vulnerable populations at risk. Malicious actors and state authorities alike can infiltrate and abuse digital identification systems and undermine people’s right to freedom of expression and other human rights.
She further noted that these systems are rather data-heavy and therefore require safeguards. Ben-Hassine mentioned that the UN Guiding Principles on Business and Human Rights provide good avenues for companies’ compliance with human rights frameworks. According to Ben-Hassine, data is a representation of oneself in a different form, and personal data must, therefore, be treated as such.
Ben-Hassine also explained that people’s digital identities are valuable and are being monetised by companies. She argued that customers should, therefore, be able to give informed consent before handing over their data, and should receive some of its monetary value in return.
Ms Carly Kind (Freelance Human Rights Lawyer) explained that, initially, the Internet allowed for online anonymity and people did not have to identify themselves in order to use it. Nowadays, however, there is a stronger push to link online and offline identities. Kind further noted that there is so much data available that it is easy to identify individuals whether or not they possess digital IDs.
According to Kind, the EU General Data Protection Regulation (GDPR) goes some way towards containing the data collected about us by states and big tech companies. She further said that the regulation has had ripple effects: states such as Brazil, India, and Thailand are adopting or have already adopted regulations that offer privacy protection frameworks similar to the GDPR. However, she warned that some forms of state regulation might be adopted with good intentions but lead to the opposite outcome. She cited Germany’s rule requiring online platform operators to delete hate speech and discriminatory content within 24 hours as an example that caused companies to self-censor, ultimately limiting users’ rights online.
Kind said that there is no need for new human rights frameworks for the online space, since the existing rules offer a scaffolding that also applies to online rights. Kind also pointed out that human rights rules should be enforced through monetary sanctions, given that reputational damage has only limited effects on businesses.
Kind noted that digital literacy is crucial for better respect of human rights. Nonetheless, she pointed out that the biggest producers of digital literacy support materials are big tech companies. She therefore highlighted the need for independent and impartial organisations to produce their own educational products.