A youthful approach to data protection in messaging apps
29 Nov 2022 10:15h - 11:45h
Event report
During the last decade, the world has experienced an increase in connectivity, and one of the groups most impacted has been young people. The internet has enabled a massive number of children and teenagers to exchange messages across borders. This session was a youth-to-youth conversation about policy issues related to messaging platforms such as TikTok, WhatsApp, and Instagram.
The session explored a wide range of issues, including the privacy implications of messaging apps, how to ensure the application of the Declaration of the Rights of the Child, the responsibility of users to secure their data, and the challenges that quantum computers are expected to bring to current encryption methods.
Most chat apps do not permit the use of their service unless access to user data is granted (by checking the “I agree” box); without providing data, the service cannot be accessed. Limited awareness of the implications of this approach leaves most people’s data unprotected. To raise awareness and build a better understanding of privacy and data protection implications, privacy terms need to be redesigned, perhaps using visualisation and other means familiar to young people.
Another solution to manage the ‘take it or leave it’ approach is to change the narrative, so that if one doesn’t agree to the policy, the data sent through the app is limited. Policies and programmes should be put in place to educate youth and children on the dos and don’ts and risks of using chat apps, in particular when it comes to how data is processed.
Many current data protection laws require that data created by a user can be deleted, and most platforms now offer a delete option. But it is unknown whether the data is really deleted. A vital issue here is therefore accountability. Even if a country has a data protection act stipulating the right to refuse to provide data, service providers may not fully adhere to it, and most people are likely unaware that such a right even exists.
Running chat apps requires expensive servers, while most chat apps are offered for free; hence, service providers sometimes use user data to generate revenue. Web 3.0, described by some as a decentralised infrastructure at a low cost, could be used to host a chat app, but this remains a distant solution. Still, decentralised messaging apps can address privacy challenges to some extent.
Regulations to protect children as users of chat apps should be in place, similar to those for television, radio, and other media. The current approach to regulating data rests on the user and his or her ability to restrict the data sent. But children do not know the consequences of sharing data, so we need to explore other kinds of regulations that limit the kind of data collected from children.
New ideas have also appeared, like that of comic contracts, which are becoming more popular in the legal space. Comic contracts are primarily drawings of what is happening, of the legal obligations, and of the legal rights between the parties involved. This also helps to overcome language barriers.
The question is: What is stopping us from developing an environment whereby the privacy policies or the consent forms filled in to access apps can be presented in such a way that is digestible to the users?
While most messaging apps promise end-to-end encryption of data, they still store the data (whether for 30 days or more). A classical computer using common algorithms may need years to decrypt a message; a quantum computer may take only seconds to do the same. Data encrypted with today’s methods – including passwords for online services and credit card numbers – will therefore be left vulnerable once quantum computers become a reality. Robust quantum-resistant algorithms will be needed, and work is underway in this direction.
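The scale of the quantum threat can be illustrated with a back-of-the-envelope calculation (a sketch added here, not part of the session itself): Grover’s quantum search algorithm can find an n-bit symmetric key in roughly 2^(n/2) steps instead of the 2^n steps a classical brute-force search needs, effectively halving a key’s bit strength.

```python
# Toy illustration of Grover's algorithm's effect on symmetric key search.
# A classical exhaustive search over an n-bit key needs about 2**n trials;
# Grover's quantum search needs only about 2**(n/2). So a 128-bit key keeps
# only ~64 bits of effective strength against a quantum attacker, which is
# one reason longer keys and new quantum-resistant algorithms are discussed.

def brute_force_steps(key_bits: int, quantum: bool = False) -> int:
    """Order-of-magnitude work factor for an exhaustive key search."""
    return 2 ** (key_bits // 2) if quantum else 2 ** key_bits

for bits in (128, 256):
    print(f"{bits}-bit key: classical ~2^{bits} steps, "
          f"quantum (Grover) ~2^{bits // 2} steps")
```

Note that a 256-bit key searched with Grover’s algorithm costs roughly as much as a 128-bit key searched classically, which is why doubling symmetric key lengths is one commonly cited mitigation.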
By Mili Semlani