Gathering as an Act of Solidarity
A deep dive into the Digital Futures Gathering 2025. By Hannah Lichtenthäler

Shorter days, less sunlight, and fall foliage provided the backdrop for our 2025 Digital Futures Gathering focusing on digital violence[1]. At a time when justice, support, and safety in digital and offline spaces are more urgent than ever, we created a space to address these increasingly pressing issues with the intention to move beyond the state of urgency into a space of imagination and creation.
When people arrived, we noticed how worn down many were by political backlash, shrinking civil society funding, the rise of anti-gender and anti-rights movements, and the challenging daily work with survivors. With everyone carrying that level of fatigue, gathering – the very act of being in one room together – felt almost radical in itself. In our SUPERRR tradition, this wasn’t a conventional conference: there were no keynotes and no panel discussions. Instead, we used a participatory methodology so that the two days were shaped by the needs and desires of the people in the room.
Multiple Things can be True at the Same Time
With this mindset, we did not focus on outcomes, but started by grounding ourselves in the space and activating our imagination muscles, using different futuring methods that Ouassima Laabich, our facilitator and founder of Muslim Futures, wove into the agenda. One of these methods was called “Connecting our Ancestry”: in small groups, participants shared an insight about a teacher or ancestor who influenced their life and work – and also thought about which ancestor they imagine themselves becoming one day. By taking a much longer timescale as our default, we were able to step out of the everyday grind and instead think about the much bigger impact we might have on the world, and how to get there – allowing participants to enter a space in which they could imagine possible futures together, connecting personal experience with collective visions.
The focus of the first day was to hold multiplicities – multiple truths, multiple discomforts, multiple ideas of tech and care – and to move away from the inevitability narrative that Big Tech pushes upon us. In self-organised sessions, participants tackled what matters in working with survivors of gender-based violence (GBV) – ranging from reimagining reporting systems that don’t retraumatise survivors, to co-designing survivor-centred tools, to naming how digital rights work is inseparable from sex workers’ safety. Projects included the chatbot Maya by myprotectify from Germany, designed by a survivor of domestic violence to support other survivors of GBV; the Survivor AI tool by the global non-profit Chayn, which supports survivors of image-based sexualised violence in writing take-down letters; and the WhatsApp-based chatbot “Viktoria”, implemented by the Polish feminist NGO feminoteka to support survivors of GBV.
One thing that was emphasised in many of the discussions was that reporting systems as they exist now are like navigating a maze, using up time and resources that could otherwise go to supporting survivors. Even though platforms like Meta, Snapchat, or TikTok may check compliance boxes under digital policies like the EU Digital Services Act (DSA), the reporting often feels like running in circles for survivors and advocates.
In a workshop held by Elizabeth Ávila Gonzalez from bff (Federal Association of Rape Crisis Centres and Women’s Counselling Centres in Germany), participants moved beyond critiquing this status quo and activated their futuring muscles to collaboratively imagine feminist reporting futures grounded in transparency and care. Using a method called “News from the Future”, participants came up with news headlines from the year 2035, including: “Finally committed: cyber violence against women ceased to exist”, “Visa and Mastercard de-partnered with major social platforms who allow IBA/NCII to exist”, and “Emergency reporting: with live people, interoperability between platforms”.
Another pattern was equally striking: digital rights and frontline services too often work in silos. The discussions about online harms inevitably tackled children’s rights, criminalisation of sex work, platform bans, and access barriers. Yet the actors working on these issues rarely seem to cross paths with each other. It became clear that building those bridges and opening dialogues between them is a necessity if we are to develop approaches that work for the most marginalised.
Care was not only a central motif in technical terms, but the essence of the gathering itself. From making space for people’s needs to building flexibility into the agenda and offering on-site childcare, we tried our hardest to think about care at multiple levels in the event design as well as in the agenda. Day one ended in a lot of gratitude for the space offered, and left us with food for thought – and many new and profound connections – for the next day.
The Myth of the Silver Bullets
On day two we dived deeper into tech itself. Sessions like “AI and Social Justice” surfaced some of the key tensions around AI: tools like chatbots designed to help survivors find resources meet a real need, but also introduce new risks.
One of the key issues in building chatbots is that the infrastructure needed to build AI is owned by a small group of people and companies. The discussion also raised questions about governments funding AI instead of on-the-ground services, which reinforces the danger of chatbots replacing those services. Another issue is that, without extremely careful attention, chatbots reproduce discriminatory narratives and information from the data they are trained on – so they need to be developed with anti-racist frameworks built in to reduce risks for vulnerable communities. Frontline services clearly need funding, but given the finite resources available, participants expressed concern that their frontline work might be replaced with AI chatbots.
This all too often unspoken tension was one of the common themes of the gathering: as with any technology, AI is neither a silver bullet nor inherently damned. When policymakers prioritise technologies like AI instead of supporting chronically underfunded social welfare and support systems, such automated systems do more harm than good, because they risk marginalising the already marginalised even further.
Conflict, Care, and Uncomfortable Alliances
Another commonality across discussions was the reality that conflict lives within movements as much as between them. A session on uncomfortable alliances explored tensions between movements – for instance around race, trans* rights, or global majority experiences – and the emotional labour that so often remains unnoticed within such spaces. What the discussions revealed was that conflict is not a disruption: examined with care, it can provide clarity and turn into resilience. But making that a reality requires an investment of time, care, and effort, and a willingness to compromise.
At the same time, the session on learning from feminist movements of the 1960s and 1970s reminded us that mutual aid, quiet organising, and education in kitchens have historically been sources of transformative change.
Translating Hopes and Dreams into Policy
In the sessions addressing policy issues, two things were emphasised the most: policy must be trauma-informed, and it must be grounded in lived experience. All too often, rushed urgency without care replicates harm – in contrast to approaches that centre personal stories and experience as valid forms of evidence and, as a result, allow for a more meaningful understanding of the actual issue at hand. For example, Chayn’s trauma-informed approach to supporting survivors of GBV creates a space for people to receive support in culturally appropriate ways, without worrying that reporting their experiences might lead to retraumatisation.
Child Protection Without Tech Solutionism
The controversial topic of child online safety and Child Sexual Abuse Material (CSAM) attracted a lot of attention among participants, with standing room only for the session run by Dr. Kate Sim from COSPR (Children’s Online Safety and Privacy Research). Current frameworks – such as the EU CSA regulation, widely discussed as “chat control”, which presses for client-side scanning and grooming classifiers as technological solutions against CSAM, or the age verification systems with which various countries try to restrict children’s access to social media and other platforms as protection against online violence and dangerous content – treat harm as content to be removed. But harm is relational, as many debates at the gathering articulated.
As Dr. Sim emphasised, what is often missing in the conversations about CSAM is that the vast majority of sexualised violence against children happens within their social circle.[2] Without addressing the structural conditions that enable systems of abuse, CSAM detection and tech surveillance won’t put an end to the danger, but rather risk creating new ones. This reframing challenges prevailing policy frames that foreground tech solutions as the answer.
What We Took Home
The gathering allowed for deep connection, shared language and experiences, and collective accountability. People left energised, feeling seen, heard, and less alone. The joy many expressed in our closing circle and in coffee breaks was found not in solutions, but in solidarity. In a world constantly expecting us to respond to crises, the Digital Futures Gathering dared to be boldly different in taking a pause to think collaboratively, to feel together, and to build from what is already there.
Political change does not happen overnight – but with our batteries recharged and dear connections made that are meant to last, we stepped out of Berlin Global Village’s Miriam Makeba hall into the autumnal cold with a changed mindset about our digital futures, and with new relationships and ideas to make those futures a reality.
__________________________________________________
[1] By digital violence, we refer to any form of harm and/or violence – individual, structural, or institutional – enabled or amplified through digital infrastructures (internet and communication technology, ICT), disproportionately impacting women, girls, LGBTQIA+ people, racialised communities, disabled people, children, and/or otherwise marginalised communities. It includes direct violence, systemic exclusion, data-based harm, and the reproduction of violence through economic and technical systems.
[2] In each national context, the numbers vary slightly, depending on the respective criminal codes, which data has been collected and by whom, etc. The Centre for Expertise on Child Sexual Abuse provides data for England and Wales. In their research based on reported cases of child sexual abuse, they show that “Almost half of all child sexual abuse offences reported to the police in England and Wales in 2021/22 took place in the family environment. That means the abuse was by parents, siblings, grandparents or anyone considered ‘one of the family’.” In Germany, the Federal Criminal Police Office (BKA) publishes numbers on child sexual abuse each year, reflecting those reported cases that fall under the criminal code; estimates of unreported cases (the “dark figure”) are always higher than reported numbers. The Independent Federal Commissioner Against Child Sexual Abuse, the National Council against Sexual Violence against Children and Young People, and scientific research all reflect similar numbers: the majority of sexualised violence against children and young people happens within the social environment. (source in German). In the US context, the national anti-sexual-violence network RAINN (Rape, Abuse and Incest National Network) highlights the scope of the issue: 93% of survivors under 18 know their abuser.