A reflection on the CyberDIVA conference at Aston University, examining cyber violence against women and girls, the fragmentation of the UK response ecosystem, and the architectural incentives shaping harm in modern digital environments. The article connects operational realities to broader structural questions around platform design, AI integration, economic alignment and the need for systemic accountability in an increasingly asymmetric web.
Contents
- 1. Introduction: You Wait Ages for One Event…
- 2. Opening the Day
- 3. Dark Web, Toolkits and Policy Reality
- 4. Mapping the Ecosystem
- 5. If It Feels Real, It Is Real
- 6. The Pipeline of Harm
- 7. The Sweet Bobby Reality
- 8. The Panel: Are Our Responses Fit for Purpose?
- 9. Coordination and Capability
- 10. The Immersive Challenge and Close
- 11. And Then We Ran to BT
- 12. Conclusion: Why This Day Matters
- 13. Epilogue: Architecture, Asymmetry and Institutional Imagination
1. Introduction: You Wait Ages for One Event…
Last Wednesday (25th of Feb) was slightly absurd. I had three significant cyber events land on the same day.
- One was CyberASAP’s end-of-year showcase in Canary Wharf: academic spin-outs, serious innovators, serious commercialisation work. Nothing specifically to do with women in cyber, but important nonetheless.
- The second was a Cyber Women CIC event at BT’s Three Snow Hill offices in Birmingham.
- The third was CyberDIVA at Aston University Conference Centre, focused squarely on online harms affecting women and girls.
You cannot exist in three places at once. For about an hour, Andy, Ryan and I collectively felt like Schrödinger’s cat, attending and not attending multiple conferences until observed.
In the end, I chose CyberDIVA for the day, and then we hot-footed it across Birmingham afterwards to make the drinks reception at BT, CyberWomen Groups C.I.C., and the Cyber PATH celebration. Jumping from conference to conference was chaotic, but it was definitely worth it.
2. Opening the Day
CyberDIVA is an Innovate UK–funded programme led by Dr Anitha Chinnaswamy at Aston University, in partnership with Prof Deborah Leary OBE and Ben Leary at Forensic Pathways, with support from DSIT and West Midlands Police.
The day was compered by PJ Ellis, of “Wit + Grit”, and hosted by the irrepressible Dr Anitha Chinnaswamy.
Professor Osama Khan opened the conference, setting the tone from Aston’s leadership level. Dr Anitha followed with the project overview and research context, grounding the day in what CyberDIVA is actually trying to build: not just commentary, but capability.
This was not a performative awareness session. It was a practical, cross-sector look at how cyber violence against women and girls actually operates, and what we are realistically doing about it.
3. Dark Web, Toolkits and Policy Reality
John Thornton gave an introduction to the dark web that was refreshingly grounded in operational reality rather than Hollywood mythology. Ben Leary then launched the CyberDIVA toolkit: part education, part protective guidance, part early-intervention capability.
Daljinder Mattu provided a regional policy perspective from DSIT, outlining how cybersecurity and online harms are being viewed at the government level within the Midlands.
At this point, the shape of the problem was clear: a lot is happening across sectors, but it does not always feel joined up.
4. Mapping the Ecosystem
Andrew Briercliffe’s keynote laid that fragmentation bare.
His slides mapped the UK online harms ecosystem in a way that made something obvious: a lot is going on, but very little of it feels coherently mapped.
We have tens of thousands of schools, 45 police forces plus the NCA, multiple government departments, age verification providers, research labs, charities, NGOs, moderation services, task forces and regulators. It looks impressive on paper.
But when you ask who is doing what, what works, what doesn’t work, and where the duplication or gaps sit, the answer is far less clear.
We only know what we know. We do not know what we don’t know. And that is not a comfortable place to be when harm is scaling.
5. If It Feels Real, It Is Real
After coffee, Nina Jane Patel’s fireside conversation was one of the most human parts of the day.
There is still a cultural undercurrent that suggests online harm is somehow less serious because it is “not physical”. But if an experience triggers fear, stress, shame, withdrawal or behavioural change, then it is real in every way that matters to the nervous system.
The body does not sit there and say, “Ah, this is only digital. I will not produce cortisol”.
Children and young women are growing up in environments where digital and physical life are fully intertwined. Pretending that one is less real than the other is not only inaccurate, it is dangerous.
Nina also made a point that resonated with me: banning technology is not a long-term strategy. The battlefield is digital. Removing young people from it does not prepare them for it. It simply delays their exposure without increasing their resilience.
6. The Pipeline of Harm
Amit Singh Kalley followed with a session heavy on data and clarity.
He described harm not as a single event, but as a pipeline:
- Sexualisation.
- Humiliation.
- Extortion.
- Silence.
That sequence explains why sextortion cases escalate so fast. It explains why image-based abuse feels impossible to contain. And it explains why “just report it” is often a deeply unsatisfying answer.
One slide on revenge porn outcomes was particularly sobering. Reports are increasing. Re-uploads are relentless. A very small proportion lead to meaningful charges. Takedown is not the same as resolution, especially when content is copied, mirrored and weaponised elsewhere.
The system does respond. But it does not respond at the speed or scale that the harm operates.
7. The Sweet Bobby Reality
Harkirat Kaur Assi’s spotlight on the Sweet Bobby investigation added a different dimension.
Online abuse is not always strangers shouting into the void. It can be intimate, prolonged and psychologically sophisticated. It can sit inside trust networks. It can evolve long before the victim even understands what is happening.
That complexity makes simplistic advice dangerously inadequate.
Harkirat’s story was nuanced, painful, stretched over many years, and layered in multiple ways. I recommend you go and engage with her podcast to learn more.
8. The Panel: Are Our Responses Fit for Purpose?
After lunch, the panel tackled the core question: are our responses to online abuse fit for purpose? The short answer? Not yet.
On the panel were Shaila Parvez, Andrew Briercliffe and I, with hosting from the indomitable Prof Deborah Leary OBE.
I found myself returning to something I’ve been thinking about for a while: online harm today is shaped by platform architecture.
- Identity systems.
- Visibility mechanics.
- Algorithmic amplification.
- Frictionless account creation.
- Weak repeat-offender suppression.
- Engagement incentives designed around retention.
- The increasing interplay between AI systems, agentic automation and people to produce “sticky” environments.
- Reduced friction that makes participation easier, and prevention harder.
Companies like Discord, Snapchat, et al., are not neutral utilities. Their design choices influence behaviour at scale. Recent legal scrutiny of major platforms over addiction and child protection makes it increasingly difficult to deny (I explored this dynamic previously in Snapchat’s Settlement Is Not the Story: The End of ‘We’re Just Platforms’).
Many of these environments present as open platforms, but function operationally as semi-closed or pseudo-private ecosystems. Discord servers, private Snapchat groups and ephemeral messaging systems create fragmented micro-communities that are socially porous but structurally difficult to observe. They feel open to users, but are governance-light in practice. That combination makes them particularly compatible with automation and agentic AI systems that can amplify, simulate participation, or recombine content with minimal resistance.
Harm in these environments is not always loud. It is often subtle, layered and iterative. Visibility can be weaponised quietly. Reputation can be reshaped incrementally. The increasing interplay between AI systems and human users compresses the time between experimentation and impact.
This is not anti-technology. It is not moral panic. It is a recognition that incentives shape systems, and systems shape outcomes. If we build for engagement, retention, speed, and scale above all else, we should not be shocked when harm also moves at speed and scale and is even more subtle.
Reduced friction then compounds the effect. When account creation, content sharing, recombination and amplification are frictionless, harms do not just scale: they combine. AI-assisted systems accelerate that combination further, allowing harassment, manipulation or humiliation to be generated, replicated and redistributed at negligible cost.
At the end of the day, Dr Anitha Chinnaswamy also highlighted the work of GRIIT (Gender Equity Research and Inclusive Innovation in Technology), a collaborative forum connecting academia, industry and policy to build shared capability across cyber, intelligence and safeguarding.
9. Coordination and Capability
One thing that was clear throughout the day is that harm does not sit neatly inside a single organisation. It crosses policing, safeguarding, schools, mental health services, platforms and regulators.
Through the West Midlands Cyber Hub, we are trying to stitch some of that together. Government, policing, innovators, big tech, academia and community organisations need to be in the same conversation, not parallel ones.
Policy is only meaningful if it becomes protection. Otherwise, it remains a well-written document on a shelf.
10. The Immersive Challenge and Close
The Regional Cyber Crime Unit then ran an immersive cyber challenge, grounding the day back in operational thinking. It was a reminder that behind the theory and policy conversations sit investigators dealing with real cases, real victims and real evidential constraints.
The reflections and close brought the themes together: complexity, coordination, and the need to be honest about what we still do not understand. The immersive challenge reinforced that cybersecurity is not only technical: it is investigative reasoning, collaboration and ethical judgment under constraint.
11. And Then We Ran to BT
After CyberDIVA wrapped, we quite literally ran across Birmingham to make the Cyber Women CIC reception at BT’s Snow Hill offices.
Different crowd. Different energy. Same underlying theme: how do we celebrate women in cyber while also confronting the reality that the online environment is often structurally hostile to them?
I met students who are stepping into this field with their eyes open. I spoke to founders building solutions rather than commentary. And yes, the Coventry CIC crew have volunteered to help with the D&D night we’re launching alongside the Hub in April, which is perhaps the most wholesome side-plot of the entire day.
12. Conclusion: Why This Day Matters
CyberDIVA was not abstract.
The ecosystem is complex.
The harm pipeline is predictable.
The legal outcomes lag the scale of abuse.
Deepfakes are accelerating image-based exploitation.
Coordination is still immature.
If we are serious about celebrating and protecting women online, we have to move beyond slogans and into system design, enforcement capability and measurable accountability.
The web we built in the early days was idealistic and open, but there was always an undercurrent of heavyweight commercialisation, layered over technologies developed by the military for the military. It is a tension I explored recently in The Web’s Odd Couple: Tim Berners-Lee, Marc Andreessen and the Yin Yang of the Early Internet. The web we inhabit now is highly optimised and economically driven. Platforms are not malicious; they are economically aligned with optimisation. Retention drives revenue. Engagement drives retention. Addictive mechanics are not accidental features: they are optimisation outcomes.
Recent legal scrutiny and admissions from major platforms around addictive design mechanics reinforce this point. I examined similar structural pressures in Navigating the Legal Storm: Meta’s Ethical Dilemma in Protecting Children Online, where compliance obligations collide with engagement economics. If attention is the product and time-on-platform drives value, then friction reduction and behavioural reinforcement are not incidental: they are strategic. That reality does not imply intent to harm, but it does explain why safety mechanisms that reduce engagement face structural resistance.
If safety interventions reduce engagement without regulatory or market counterweight, they are structurally disadvantaged. Legislative attempts to rebalance this dynamic (such as those discussed in The Senate’s Latest Quest for Social Media Accountability) illustrate how difficult it is to realign incentives once optimisation regimes are entrenched. The question we are now forced to confront is whether we are willing to rebalance those incentives before the integration between human behaviour, AI systems and platform infrastructure becomes even more asymmetrical.
13. Epilogue: Architecture, Asymmetry and Institutional Imagination
That asymmetry is not just about content moderation. It is about architecture. Automation now generates scale at negligible cost. AI systems can simulate participation, accelerate visibility and shape conversational environments. Yet consequence (reputational, emotional and legal consequence) is still borne by humans.
In other words, we are building systems where machines generate density, but people absorb impact.
Humans also provide the legitimacy layer. Empathy, reassurance, trust, moderation, de-escalation and social glue are still overwhelmingly human functions. Machine systems can optimise, accelerate and simulate, but they cannot absorb emotional consequence. As integration deepens, human warmth increasingly functions as stabilising infrastructure within environments optimised for scale.
That is the core argument behind my forthcoming series, Hard-Wired Wetware, and the Asymmetric Integration Model (AIM). AIM examines how optimisation regimes increasingly centralise control while distributing affective and reputational risk outward to participants. When engagement is the metric and persistence is the goal, human emotional labour becomes part of the operating layer of the system, whether we design for that outcome or simply allow it to emerge. In practice, that means platforms scale interaction mechanically, but moderation, empathy, reassurance and recovery still fall to people.
CyberDIVA illustrated the real-world manifestation of that imbalance. Women and girls are often the first to experience the sharp edge of architectural asymmetry, because abuse exploits exactly those optimisation dynamics: visibility, virality, anonymity, amplification and permanence.
If we do not understand the structural integration between human behaviour and machine systems, we will continue to respond downstream, after harm has already been inflicted. This builds directly on themes I outlined in Structuring Cyberpsychology: From Foundations to Practice, where I argued that we must move beyond surface moderation and into behavioural systems thinking.
If we are serious about getting ahead of harm rather than responding to it, we need to exercise institutional imagination. AI compresses innovation cycles. Abuse patterns will not look like yesterday’s cases. Regulation and enforcement must anticipate combinatory risk, not merely catalogue historical examples.
Institutions move on legislative and budgetary cycles measured in years; machine systems iterate in days. That tempo mismatch alone guarantees lag unless we design differently.
That is where my next set of articles is heading.
But Wednesday was a reminder that this is not abstract theory. It is already playing out in real lives, in real investigations, and in real classrooms across the UK.
The toolkit is live. The conversations are happening. The architecture is still being written.
And we would be foolish to treat it as anything less than that.