Whether we are designing technology or a piece of content, each project must be built on, and improved through, the values of agency, hope, empathy, solidarity and trust.
We must make brave and bold choices that prioritise the physical and emotional safety of users, especially if they have been denied this safety at many points in their lives. Whether it is the interface of our platform or the service blueprint, safety by design should be the default.
Abuse, inequality and oppression strip people of agency by taking power, and control of the narrative, away from the survivor. By honouring survivors' wishes about how their stories are told and used, we can create an affirming experience. This requires seeking informed consent from survivors at every step and providing information, community and material support. Users should be central to their own path to recovery, and to the design of the interventions that support it.
The world as it currently exists is not just. Systems are set up to work for dominant groups and do not do justice to people's differing needs. As such, all of our interventions need to be designed with inclusion and accessibility in mind. Survivors are not a homogeneous group and will not all benefit from the same types of support. We must consider how position, identity, vulnerabilities, experiences, knowledge and skills shape trauma and recovery, and create solutions that leave no one behind.
Privacy is a fundamental right, and the stigma, victim-blaming and shame associated with GBV make the need for privacy even greater. A survivor's personal information, including their trauma story (data, images, videos or statements), must be kept secure and must not be disclosed unless the survivor chooses to disclose it. At the same time, we should remove unnecessary obstacles between users and the information and help they require.
We must build accountability into the systems that enable and facilitate harm, and into the interventions addressing it. This includes being open and transparent about what is being done, how and why, and creating and nourishing constructive feedback loops that catalyse change. It also means communicating openly about what is working and what isn't. To build trust, this communication should be clear and consistent. Accountability requires change that is sustainable and long-term, not a one-off.
There is no single-issue human. To do justice to the complexity of human experience, we need to suspend assumptions about what a user might want or need, and account for selection and confirmation bias. Harms manifest in different and disproportionate ways for people living at the intersection of multiple oppressions; these lived realities must be recognised, and we should never assume a 'one-size-fits-all' approach.
Too often, the power to make decisions is concentrated in the hands of a few. Instead, power must be distributed more widely among communities and individuals who are impacted the most by tech abuse. Interventions should be co-designed and co-created with survivors.
Abuse can leave us feeling hopeless. Users do not need to be unexpectedly and constantly reminded of their own traumas or struggles. Yet the language, images and processes we use can trigger memories of abusive experiences, prioritise sensationalism over healing, or shock for the benefit of an audience rather than the survivor themselves. Interventions should be designed to be an oasis for users: empathetic, warm and soothing, and motivating people to both ask for and embrace the help on offer. They should validate survivors' experiences as we seek out collaborative solutions and offer hope for the future.