Abusers are using AI and digital technology to attack and control women, charity warns


Domestic abusers are increasingly using AI, smartwatches and other technology to attack and control their victims, a domestic abuse charity says.

A record number of women abused and controlled through technology were referred to Refuge’s specialist services in the last three months of 2025, with 829 women involved in the most complex cases, an increase of 62%. Referrals of women under 30 also rose by 24%.

Recent cases include abusers using wearable technology such as smartwatches, Oura rings and Fitbits to track and stalk women, disrupting their lives through smart home devices that control lights and heating, and using AI spoofing apps to impersonate people.

Emma Pickering, head of the tech-facilitated abuse team at Refuge, said: “Time and again, we see what happens when devices hit the market without proper consideration of how they could be used to harm women and girls. It is currently too easy for criminals to access and weaponize smart accessories, and our frontline teams are seeing the devastating consequences of this abuse.

“It is unacceptable to think of the safety and well-being of women and girls as an afterthought once technology has been developed and distributed. Their safety must be a fundamental principle shaping both the design of wearable technology and the regulatory framework around it.”

Refuge said smart accessories are too easy to access and weaponize and women’s safety needs to be factored into their design.

Meena, a survivor supported by Refuge, took her smartwatch with her in her rush to escape her abuser. He used the watch’s linked cloud accounts to track her and locate her emergency accommodation.

“[It] was extremely shocking and frightening. Knowing that my location was being tracked without my consent, I suddenly felt exposed and vulnerable. It created a feeling of constant paranoia; I couldn’t relax, sleep properly, or feel settled anywhere because I knew my activities were not private,” she said.

Despite the police returning the device to Meena, she was located at her next shelter by a private investigator hired by her abuser, with technological tracking suspected. She reported the violations to the police but was told that no crime had been committed as she had “come to no harm”.

She said: “I was repeatedly asked to move for my own safety, rather than the technology being dealt with directly or the smartwatch being confiscated from him. Each move made me feel more unsettled and displaced.

“Overall, the experience left me feeling vulnerable, unheard, and responsible for managing a situation that was completely out of my control. It showed me how technological abuse can quietly and powerfully escalate coercive control, and how easily survivors can be left to shoulder the emotional and practical burden when the system doesn’t fully understand or respond to it.”

Abusers are also increasingly using AI tools to manipulate survivors, Pickering said. For example, they may alter video of the survivor so that she appears intoxicated, allowing them to tell social services that “she is behaving erratically again, slurring her speech, has a drinking problem” and is therefore an unfit mother or a risk to herself and others. “We’ll see more of this as these video tools and applications advance,” Pickering said.

Pickering said she has also heard of AI tools being used to create fraudulent documents that appear authentic, such as job offers or legal summonses, which could be sent to survivors to trick them into believing they are in debt, or to lure them to the same location as their abuser.

Pickering also feared that medical technology would increasingly be misused in the coming years, for example by abusers manipulating insulin delivery through connected diabetes devices, which could be fatal.

She urged the government to take action on technology-enabled and online crimes, including providing more funding to develop and train digital investigation teams. “They want short-term wins; they don’t want to think about long-term investment in this area, but if we don’t do that we’ll never move forward,” she said.

She also wants to see the technology industry held accountable for failing to ensure that devices and platforms are designed and function in ways that are safe for vulnerable people.

“Ofcom and the Online Safety Act don’t go far enough,” she said.

A government spokesperson said: “Tackling all forms of violence against women and girls, including where it occurs online or is facilitated by technology, is a top priority for this government.

“Our new VAWG strategy sets out how the full power of the state will be deployed online and offline. We are working with Ofcom to set out how online platforms must tackle the disproportionate abuse women and girls face online.”
