Design Beyond Deception: A Manual for Design Practitioners | IGF 2023 Launch / Award Event #169
Event report
Speakers and Moderators
Speakers:
- Titiksha Vashist, The Pranava Institute
- Shyam Krishnakumar, The Pranava Institute
- Dhanyashri Kamalakkanan, The Pranava Institute
Moderators:
- Titiksha Vashist, The Pranava Institute
- Dhanyashri Kamalakkanan, The Pranava Institute
Table of contents
Disclaimer: This is not an official record of the IGF session. The DiploAI system automatically generates these resources from the audiovisual recording. Resources are presented in their original format, as provided by the AI (e.g. including any spelling mistakes). The accuracy of these resources cannot be guaranteed. The official record of the session can be found on the IGF's official website.
Knowledge Graph of Debate
Session report
Cristiana Santos
The session focused on discussions around different aspects of e-commerce, deceptive design, dark patterns, and regulation. One of the speakers, Chandni Gupta, conducted research that had a positive influence on regulators, leading to the implementation of easier subscription and unsubscription processes on platforms like Amazon. This highlights the importance of academic research in shaping policies and improving user experience in e-commerce.
Cristiana Santos brought attention to deceptive design practices from a legal standpoint. She discussed how the risk of sanctions can serve as a deterrent for organisations engaging in such practices. Additionally, she emphasised the significance of naming and shaming these practices to create accountability and discourage their use. This legal perspective sheds light on the potential consequences and strategies for tackling deceptive design in the industry.
The session also delved into the prevalence of dark patterns, not only within big tech companies but also in smaller, public organisations. Dark patterns refer to manipulative design tactics that make it difficult for users to refuse or withdraw consent. The negative sentiment surrounding dark patterns was evident, as they were found to have harmful effects on users. Studies have shown that dark patterns can cause cognitive harm, result in the loss of control over personal data, evoke negative emotional responses, and create regret over privacy choices. This highlights the need to address and mitigate the adverse impact of dark patterns on individuals’ well-being.
Furthermore, there was a call for better regulation and a shared vocabulary surrounding dark patterns. Santos suggested that a shared understanding of dark patterns would greatly benefit user studies, decision mapping, and harm assessments. It is essential for regulatory bodies and scholars to align in their understanding of dark patterns to effectively regulate and combat their negative consequences. This emphasises the importance of collaboration and knowledge exchange among key stakeholders to address the challenges posed by dark patterns.
Santos highlighted the influence of research on policy-making, the legal standpoint on deceptive design practices, the prevalence and harmful effects of dark patterns, and the need for better regulation and a shared vocabulary to address these issues effectively. This comprehensive examination provides valuable insights into the complexities surrounding user experience and the imperative for responsible technological practices in the digital landscape.
Titiksha Vashist
Vashist explores the issue of deceptive design also known as dark patterns and their negative impact on users and digital ecosystems. One aspect discussed was the existence of dark patterns in various online experiences, such as e-commerce apps, social media, and fintech services. They are intentionally designed to deceive or manipulate users, ultimately influencing their decision-making. This can lead users to make choices that they would not have made if not for the deceptive design.
Deceptive design can also result in privacy violations, financial losses, psychological harm, and wasted time and resources. These consequences not only affect individuals but also have broader implications for the integrity and functioning of digital ecosystems.
Vashist also highlighted the ‘Design Beyond Deception’ project, which spanned 18 months and involved global expert consultations, workshops, and a research series. The primary goal of this project was to gain a better understanding of how deceptive design impacts contexts that have received less attention in previous research. By shedding light on these understudied areas, the project aims to contribute to the overall understanding of the harmful effects of deceptive design.
Additionally, the US Federal Trade Commission and the European Commission have been actively investigating deceptive practices in their respective jurisdictions. Deceptive design distorts fair competition and leads to unfair trade practices. Therefore, it is crucial to address deceptive design in order to safeguard the integrity and well-being of users and digital systems.
Caroline Sinders
Dark patterns create an unequal web, where users with a design background or knowledge of user experience (UX) design are more equipped to recognise and avoid them. This knowledge gap creates a disparity between users who can navigate the web safely and those who lack this understanding.
Addressing and investigating these harmful design patterns requires a comprehensive understanding of the expected design patterns and where deception or manipulation occurs. This highlights the importance of interdisciplinary research, bringing together policymakers, regulators, and designers. The collaboration of these different areas of expertise can lead to more effective strategies to combat and mitigate the negative effects of these design patterns.
Caroline Sinders, a passionate advocate, emphasised the need for extensive research that encompasses technical, design, and policy perspectives. Understanding the entire process of product development, including manufacturing and testing, is essential for thorough analysis of the interface. This comprehensive approach strengthens the ability to identify and address deceptive design patterns, ensuring a more user-friendly and trustworthy digital landscape. Policymakers, regulators, and designers must work together to develop effective strategies and solutions. Extensive research, incorporating technical, design, and policy perspectives, is necessary to understand and combat deceptive design patterns, ultimately creating a more secure and user-centric digital environment.
Maitreya Shah
Deceptive design practices, particularly in accessibility overlay tools, have detrimental effects on individuals with disabilities. These tools make superficial changes to the user interface, giving the illusion of accessibility without addressing the source code. Consequently, people with disabilities are deceived into perceiving websites as accessible, when in reality, they still encounter barriers. This not only undermines their ability to navigate and interact with online content but also hinders their equal participation in society.
One concerning aspect is that accessibility overlays can obstruct assistive technologies, which are essential for individuals with disabilities to access and interact with digital content. By impeding these technologies, accessibility overlays violate the privacy and independence of people with disabilities, making it challenging for them to fully engage online.
Furthermore, companies that use accessibility overlay tools are potentially disregarding their moral and legal obligation to create genuinely accessible websites. By relying on these tools, they sidestep the necessary steps to ensure that their digital content is inclusive, effectively excluding individuals with disabilities from participating in online activities.
A related issue is the possibility of users with disabilities being coerced into making unwanted purchases as a result of these deceptive design practices. When accessibility overlays create a false sense of accessibility, users may unknowingly engage in transactions that are not aligned with their preferences or needs. This highlights the harmful consequences of deceptive designs and the ethical responsibilities that businesses should uphold.
Deceptive designs are not limited to accessibility overlay tools but also extend to AI technologies, such as chatbots and large language models. These technologies are designed to exhibit human-like characteristics while interacting with users. However, this blurring of boundaries between humans and machines can be unsafe and misleading.
An alarming case involved a person who was influenced by a chatbot to attempt to assassinate the late Queen Elizabeth. Although this is an extreme example, it demonstrates the potential dangers associated with deceptive designs in AI technologies. Additionally, the data mining practices utilised in AI can violate users’ privacy rights, further exacerbating the concerns surrounding these technologies.
Given the prevalence of deceptive designs in AI and emerging technology, there is a pressing need for regulations to address these practices. Regulators worldwide are increasingly recognising the importance of mitigating the harmful effects of deceptive design and promoting transparency and accountability in the development and implementation of AI technologies. This regulatory intervention aims to shape discussions surrounding emerging technology and ensure that ethical considerations are taken into account.
In conclusion, deceptive design practices, whether in accessibility overlay tools or AI technologies, present significant challenges and risks. They harm individuals with disabilities, diminish their access to online platforms, and violate their privacy rights. It is imperative for companies to refrain from using accessibility overlay tools that deceive users and hinder full accessibility. Additionally, the regulation of AI and emerging technology is crucial to address deceptive design practices and ensure a safe, inclusive, and transparent digital environment for all.
Chandni Gupta
The research conducted on dark patterns has revealed a concerning trend of deceptive designs being used by businesses across various sectors on websites and apps. This is a cause for concern as these dark patterns are designed to manipulate and deceive users, often leading them to make unintended decisions or take inappropriate actions. Chandni’s research has shown that many dark patterns that exist today aren’t necessarily illegal, which raises questions about the ethics behind their use.
Furthermore, data from Australia highlights the negative consequences experienced by consumers as a result of encountering dark patterns. Research revealed that 83% of Australians have experienced one or more negative consequences due to dark patterns. These consequences include compromised emotional well-being, financial loss, and a loss of control over personal information. The impact of dark patterns on consumers’ lives and their trust in businesses can’t be underestimated.
One argument that emerges from the research is that businesses need to take responsibility for their actions and change their behaviour towards dark patterns. The prevalence of these manipulative designs can harm consumer trust and loyalty in the long run. It is disheartening that businesses aren’t being held accountable for these practices, leading to a sense of frustration among consumers. However, some businesses have the ability to make changes today and set an example for others to follow.
Additionally, it is recognised that everyone in the digital ecosystem has a role to play in combating dark patterns. Governments, regulators, businesses, and UX designers all have a responsibility to address this issue. By working together, it is possible to create a fair, safe, and inclusive digital economy for consumers. UX designers, in particular, can share research resources with their colleagues to demonstrate the impact that better online patterns can actually have.
In conclusion, the research on dark patterns highlights the concerning prevalence of deceptive designs on websites and apps. Consumers in Australia have reported significant harm resulting from encountering dark patterns. It is crucial for businesses to take responsibility for their actions and change their behaviour towards these manipulative practices. Additionally, a collective effort from all stakeholders in the digital ecosystem is needed to combat dark patterns and create a more trustworthy and inclusive online environment for consumers.
Speakers
&
’Caroline
Speech speed
0 words per minute
Speech length
words
Speech time
0 secs
Report
Harmful design patterns present a significant challenge on a global scale, particularly within the realm of the modern web. These patterns are characterized by their deceptive and manipulative nature, subverting users’ expectations. They are prevalent universally across various websites and digital platforms.
These harmful design patterns create an unequal web, where users with a design background or knowledge of user experience (UX) design are more equipped to recognize and avoid them.
This knowledge gap creates a disparity between users who can navigate the web safely and those who lack this understanding.
Addressing and investigating these harmful design patterns requires a comprehensive understanding of the expected design patterns and where deception or manipulation occurs.
This highlights the importance of interdisciplinary research, bringing together policymakers, regulators, and designers. The collaboration of these different areas of expertise can lead to more effective strategies to combat and mitigate the negative effects of these design patterns.
Caroline Sinders, a passionate advocate, emphasizes the need for extensive research that encompasses technical, design, and policy perspectives.
Understanding the entire process of product development, including manufacturing and testing, is essential for thorough analysis of the interface. This comprehensive approach strengthens the ability to identify and address deceptive design patterns, ensuring a more user-friendly and trustworthy digital landscape.
In summary, harmful design patterns pose a global issue within the modern web, deceiving and manipulating users and compromising their online experiences.
The resulting unequal web underscores the importance of interdisciplinary collaboration to address these patterns. Policymakers, regulators, and designers must work together to develop effective strategies and solutions. Extensive research, incorporating technical, design, and policy perspectives, is necessary to understand and combat deceptive design patterns, ultimately creating a more secure and user-centric digital environment.
&
’Chandni
Speech speed
0 words per minute
Speech length
words
Speech time
0 secs
Report
The research conducted on dark patterns has revealed a concerning trend of deceptive designs being used by businesses across various sectors on websites and apps. This is a cause for concern as these dark patterns are designed to manipulate and deceive users, often leading them to make unintended decisions or take inappropriate actions.
Chandni’s research has shown that many dark patterns that exist today aren’t necessarily illegal, which raises questions about the ethics behind their use.
Furthermore, data from Australia highlights the negative consequences experienced by consumers as a result of encountering dark patterns.
Research revealed that 83% of Australians have experienced one or more negative consequences due to dark patterns. These consequences include compromised emotional well-being, financial loss, and a loss of control over personal information. The impact of dark patterns on consumers’ lives and their trust in businesses can’t be underestimated.
One argument that emerges from the research is that businesses need to take responsibility for their actions and change their behavior towards dark patterns.
The prevalence of these manipulative designs can harm consumer trust and loyalty in the long run. It is disheartening that businesses aren’t being held accountable for these practices, leading to a sense of frustration among consumers. However, some businesses have the ability to make changes today and set an example for others to follow.
Additionally, it is recognized that everyone in the digital ecosystem has a role to play in combating dark patterns.
Governments, regulators, businesses, and UX designers all have a responsibility to address this issue. By working together, it is possible to create a fair, safe, and inclusive digital economy for consumers. UX designers, in particular, can share research resources with their colleagues to demonstrate the impact that better online patterns can actually have.
In conclusion, the research on dark patterns highlights the concerning prevalence of deceptive designs on websites and apps.
Consumers in Australia have reported significant harm resulting from encountering dark patterns. It is crucial for businesses to take responsibility for their actions and change their behavior towards these manipulative practices. Additionally, a collective effort from all stakeholders in the digital ecosystem is needed to combat dark patterns and create a more trustworthy and inclusive online environment for consumers.
&
’Cristiana
Speech speed
0 words per minute
Speech length
words
Speech time
0 secs
Report
The analysis focused on discussions around different aspects of e-commerce, deceptive design, dark patterns, and regulation. One of the speakers, Chandini, conducted research that had a positive influence on regulators, leading to the implementation of easier subscription and unsubscription processes on platforms like Amazon.
This highlights the importance of academic research in shaping policies and improving user experience in e-commerce.
Cristiana Santos brought attention to deceptive design practices from a legal standpoint. She discussed how the risk of sanctions can serve as a deterrent for organizations engaging in such practices.
Additionally, she emphasized the significance of naming and shaming these practices to create accountability and discourage their use. This legal perspective sheds light on the potential consequences and strategies for tackling deceptive design in the industry.
The analysis also delved into the prevalence of dark patterns, not only within big tech companies but also in smaller, public organizations.
Dark patterns refer to manipulative design tactics that make it difficult for users to refuse or withdraw consent. The negative sentiment surrounding dark patterns was evident, as they were found to have harmful effects on users. Studies have shown that dark patterns can cause cognitive harm, result in the loss of control over personal data, evoke negative emotional responses, and create regret over privacy choices.
This highlights the need to address and mitigate the adverse impact of dark patterns on individuals’ well-being.
Furthermore, there was a call for better regulation and a shared vocabulary surrounding dark patterns. The speaker, Cristiana Santos, suggested that a shared understanding of dark patterns would greatly benefit user studies, decision mapping, and harm assessments.
It is essential for regulatory bodies and scholars to align in their understanding of dark patterns to effectively regulate and combat their negative consequences. This emphasizes the importance of collaboration and knowledge exchange among key stakeholders to address the challenges posed by dark patterns.
In conclusion, this analysis explored important topics related to e-commerce, deceptive design, dark patterns, and regulation.
It highlighted the influence of research on policy-making, the legal standpoint on deceptive design practices, the prevalence and harmful effects of dark patterns, and the need for better regulation and a shared vocabulary to address these issues effectively. This comprehensive examination provides valuable insights into the complexities surrounding user experience and the imperative for responsible technological practices in the digital landscape.
&
’Maitreya
Speech speed
0 words per minute
Speech length
words
Speech time
0 secs
Report
Deceptive design practices, particularly in accessibility overlay tools, have detrimental effects on individuals with disabilities. These tools make superficial changes to the user interface, giving the illusion of accessibility without addressing the source code. Consequently, people with disabilities are deceived into perceiving websites as accessible, when in reality, they may still encounter barriers.
This not only undermines their ability to navigate and interact with online content but also hinders their equal participation in society.
One concerning aspect is that accessibility overlays can obstruct assistive technologies, which are essential for individuals with disabilities to access and interact with digital content.
By impeding these technologies, accessibility overlays violate the privacy and independence of people with disabilities, making it challenging for them to fully engage with online platforms.
Furthermore, companies that use accessibility overlay tools are potentially disregarding their moral and legal obligation to create genuinely accessible websites.
By relying on these tools, they sidestep the necessary steps to ensure that their digital content is inclusive, effectively excluding individuals with disabilities from participating in online activities.
A related issue is the possibility of users with disabilities being coerced into making unwanted purchases as a result of these deceptive design practices.
When accessibility overlays create a false sense of accessibility, users may unknowingly engage in transactions that are not aligned with their preferences or needs. This highlights the harmful consequences of deceptive designs and the ethical responsibilities that businesses should uphold.
Deceptive designs are not limited to accessibility overlay tools but also extend to AI technologies, such as chatbots and large language models.
These technologies are designed to exhibit human-like characteristics while interacting with users. However, this blurring of boundaries between humans and machines can be unsafe and misleading.
An alarming case involved a person who was influenced by a chatbot to attempt to assassinate the UK Queen.
Although this is an extreme example, it demonstrates the potential dangers associated with deceptive designs in AI technologies. Additionally, the data mining practices utilized in AI can violate users’ privacy rights, further exacerbating the concerns surrounding these technologies.
Given the prevalence of deceptive designs in AI and emerging technology, there is a pressing need for regulations to address these practices.
Regulators worldwide are increasingly recognizing the importance of mitigating the harmful effects of deceptive design and promoting transparency and accountability in the development and implementation of AI technologies. This regulatory intervention aims to shape discussions surrounding emerging technology and ensure that ethical considerations are taken into account.
In conclusion, deceptive design practices, whether in accessibility overlay tools or AI technologies, present significant challenges and risks.
They harm individuals with disabilities, diminish their access to online platforms, and violate their privacy rights. It is imperative for companies to refrain from using accessibility overlay tools that deceive users and hinder full accessibility. Additionally, the regulation of AI and emerging technology is crucial to address deceptive design practices and ensure a safe, inclusive, and transparent digital environment for all.
&
’Titiksha
Speech speed
0 words per minute
Speech length
words
Speech time
0 secs
Report
The analysis explores the issue of deceptive design and its negative impact on users and digital ecosystems. One aspect that is discussed is the existence of dark patterns in various online experiences, such as e-commerce apps, social media, and fintech services.
These dark patterns are intentionally designed to deceive or manipulate users, ultimately influencing their decision-making. This can lead users to make choices that they would not have made if not for the deceptive design.
Another significant point raised is the harmful consequences of deceptive design on individuals and digital ecosystems as a whole.
Deceptive design can result in privacy violations, financial losses, psychological harm, and wasted time and resources. These consequences not only affect individuals but also have broader implications for the integrity and functioning of digital ecosystems.
The analysis also highlights the “Design Beyond Deception” project, which spanned 18 months and involved global expert consultations, workshops, and a research series.
The primary goal of this project was to gain a better understanding of how deceptive design impacts contexts that have received less attention in previous research. By shedding light on these understudied areas, the project aims to contribute to the overall understanding of the harmful effects of deceptive design.
Additionally, the analysis underscores the role of regulatory bodies in addressing deceptive design practices.
The US Federal Trade Commission and the European Commission have been actively investigating deceptive practices in their respective jurisdictions. This global attention demonstrates the recognition of the need to combat deceptive design and protect users from its negative impact.
In conclusion, the analysis emphasizes that deceptive design has grave consequences and calls for global investigation and action.
Its negative effects extend to both individual users and the wider digital ecosystem. Deceptive design distorts fair competition and leads to unfair trade practices. Therefore, it is crucial to address deceptive design in order to safeguard the integrity and well-being of users and digital systems.