Five technology design principles to combat domestic abuse

Measures technologists can take to ensure the products they create are resistant to coercive control

Domestic abuse is all about control, and it doesn't discriminate. Anyone of any gender, race, religion, age or sexual orientation can be a victim or a perpetrator. And, with much of the world in lockdown at home, Covid-19 has only made things worse. Shocking statistics show a significant increase in domestic violence, with lockdown living conditions exacerbating the abuse victims suffer. But not all abuse is physical - it can also be emotional, sexual, financial or psychological.

As society evolves and technology becomes a core part of our everyday lives, the ability to control or be controlled is being aided by the very devices and technology that were intended for our protection. While the control that abusers exert isn't new, the tools they use are.

One particular form of abuse where technology aids abusers is coercive control: a purposeful pattern of behaviour that binds victims in invisible chains, instilling fear and exerting control over every area of their life. The Home Office described this type of behaviour as taking place over a length of time in order "for one individual to exert power, control or coercion over another". Since 2015, the UK has recognised coercive control as a crime, acknowledging that technology can be used to carry out this abuse.

As technology-facilitated abuse rises, it often goes undetected and unreported. What's more, those developing technology for good remain largely unaware of the ways their technology is being turned to abusive and coercive ends. We can no longer remain ignorant of this. We need to assume that attempts will be made to use our technology for malevolent purposes, and do our utmost to design it to be resistant.

Given the complexities of technology-facilitated abuse, there is no simple solution to eliminate it. However, to raise awareness and work towards an industry solution, we have developed the Five Technology Design Principles to Combat Domestic Abuse.

The following five principles act as guidelines for technologists and developers in weighing up the potential risk vs reward for users, helping to ensure the products they produce are resistant to coercive control.

1. Promoting diversity

When designing technology for yourself it's simple - you understand your needs and what outcomes you want it to achieve. However, it's important to recognise that you alone cannot represent all users. Each person has different needs and may use technology in unexpected ways, with potentially unintended consequences.

An example is location data embedded in images. If a parent and child are escaping an abusive partner, and the child doesn't understand the terminology used in the privacy conditions and unknowingly shares an image containing location metadata, the abuser can learn where they are. The creators of such an image-sharing app would likely have used personas to discover use cases, but if those personas didn't include children, they would have failed to identify the associated risks.
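As a minimal sketch of one protective default, an app could strip location metadata before an image is ever shared. The example below uses Python and the Pillow library; the function name and flow are illustrative assumptions, not a prescription from the report.

```python
from PIL import Image  # Pillow

def strip_location_metadata(src_path: str, dst_path: str) -> None:
    """Re-save an image with pixel data only, dropping EXIF metadata
    (including any embedded GPS coordinates) before it is shared."""
    with Image.open(src_path) as original:
        clean = Image.new(original.mode, original.size)
        clean.putdata(list(original.getdata()))  # copies pixels, not metadata
        clean.save(dst_path)

# Usage: share the cleaned copy rather than the original file.
# strip_location_metadata("holiday.jpg", "holiday_shared.jpg")
```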

User profiles are still too limited and miss key issues faced by subsets of the user base. Having a diverse team encourages technologists to consider and understand a diverse range of user habits, enabling them to uncover a more complete set of requirements and identify a wider range of potential issues.

2. Guaranteeing privacy and choice

Account privacy and data security have been topical issues for many years. Yet despite the improvements to user control brought by GDPR, if users aren't aware of this functionality or don't understand how to use it, those improvements are wasted.

An example is social media privacy settings. With a big, green 'accept default' button and phrases like 'advanced settings', it is clear which option is easier and more appealing. If a victim of abuse accepts a friend request and the default settings openly publish the new connection on their profile page for the abusive partner to see, the consequences can be dangerous.

Privacy settings are intended to support users. They need to be transparent about what information is shared so users can continue with confidence. With simple, easy-to-understand settings, users can comfortably make their own choices without the interface nudging them towards a particular option. Periodic prompts to review these settings ensure they can continue to make active and informed decisions.
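As a sketch of what private-by-default could look like in code, the hypothetical settings object below turns every disclosure off unless the user opts in, and builds in a periodic review prompt. The field names are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Private by default: nothing is published without an explicit opt-in.
    publish_new_connections: bool = False   # new "friend" links stay off the profile
    profile_visible_to_public: bool = False
    location_sharing_enabled: bool = False
    review_reminder_days: int = 90          # how often to prompt a settings review

def needs_review(settings: PrivacySettings, days_since_last_review: int) -> bool:
    """Periodic nudge so users keep making active, informed decisions."""
    return days_since_last_review >= settings.review_reminder_days
```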

3. Combatting gaslighting

Gaslighting is a common abuse technique used to psychologically manipulate victims into questioning and doubting their own judgment. By operating internet-enabled devices remotely from their own phone, abusers can manipulate everyday things in the home, such as door locks, passwords and smart appliances. Repeated over time, these techniques create a deeply destabilising environment for the victim, who is unaware that the "truth" is being manipulated.

Technology needs to make it obvious when remote functionality is triggered, making it difficult to obscure or hide gaslighting attempts. These alerts should be clear and conspicuous - faint signals, such as a small LED changing from a breathing to a solid light, are not enough in a busy home environment.

Applications that let users make changes on a shared account or device should offer notifications when those changes are made, helping to expose these gaslighting techniques (as sketched below). While this may seem to contradict the privacy recommendations above, the aim is to empower users to make the decisions that are right for them, rather than making those decisions for them.
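A minimal sketch, assuming a shared smart-home device with several linked accounts: every remotely triggered change is broadcast to all linked users, not just the person who issued it. The types and function names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class RemoteCommand:
    device_id: str         # e.g. "front-door-lock"
    action: str            # e.g. "lock"
    issued_by: str         # account that triggered the change
    issued_remotely: bool  # True if not triggered physically at the device

def apply_command(cmd: RemoteCommand,
                  linked_accounts: List[str],
                  notify: Callable[[str, str], None]) -> None:
    """Apply the change, then alert every account linked to the device,
    not only the person who issued it, whenever it was triggered remotely."""
    # ... perform the device action here ...
    if cmd.issued_remotely:
        message = (f"'{cmd.action}' on {cmd.device_id} "
                   f"was triggered remotely by {cmd.issued_by}")
        for account in linked_accounts:
            notify(account, message)  # push notification, email, in-app alert, etc.
```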

4. Strengthening security and data

Security systems designed for legitimate purposes, such as anti-theft, friend locator and parental control apps, can also be used to control victims. Despite these applications being designed for safety, victims are usually unaware that these dual-use apps can be manipulated in such a way.

Victims of coercive control are at greater risk of harm if information about them is made readily available. This includes data that can seem innocuous. An example is your device battery level, included when you share your location. With that small piece of data in an abuser's hands, the victim can no longer escape the relentless barrage of abuse by turning off their phone and saying their "battery died", as the abuser would know they had lied.

Whilst we are not proposing that no data should be collected, data collection, sharing and storage should be considered pragmatically in view of potential risks. Privacy by design is a requirement of GDPR, so we need to design for privacy as well as security.
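A minimal sketch of data minimisation in a location-sharing feature, assuming battery level is the extra field in question: it is only included when the user explicitly opts in. The payload structure and names are illustrative, not taken from any particular product.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class LocationShare:
    latitude: float
    longitude: float
    # Excluded by default: battery level reveals more than intended,
    # e.g. whether "my battery died" was true.
    battery_percent: Optional[int] = None

def build_share_payload(lat: float, lon: float, battery_percent: int,
                        share_battery_opt_in: bool = False) -> LocationShare:
    """Data-minimised payload: the extra field is included only on explicit opt-in."""
    return LocationShare(
        latitude=lat,
        longitude=lon,
        battery_percent=battery_percent if share_battery_opt_in else None,
    )
```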

5. Making technology more intuitive

In the context of coercive control, connected-home solutions are often installed, operated and understood by a single user - typically the person most confident with technology. When victims of coercive control lack the confidence to navigate all the features of new technology, the abuser's control increases. Using more accessible language in user manuals, and ensuring technology is intuitive and can be understood by all - regardless of technical confidence - limits its use as a weapon of control.

Making technology safer

Technology brings huge benefits. By being more aware, technologists have a real opportunity to create products that resist coercive control and contribute to eliminating domestic abuse in wider society. By opening up conversations and adopting a mindful approach to design, we can ensure our technology resists being used for harm, making it inherently safer.

To learn more, read the full 'Coercive Control Resistant Design' report.

Lesley Nuttall is an accelerated value leader at IBM Security Expert Labs and the author of IBM's paper on technology-facilitated domestic abuse and why we need technologists to act.