Stefanie Felsberger is a researcher at the Minderoo Centre for Technology and Democracy (MCTD), part of the University of Cambridge’s Centre for Research in the Arts, Social Sciences and Humanities (CRASSH). Her research explores the intersections of technology, data, gender and power, grounded in participatory methodologies and feminist theory. Her doctoral work focused on menstrual tracking technologies and how they intersect with questions of privacy, health, control and digital economies.
“Who owns and controls our intimate data is still an open question.”
When data meets the body
Two different paths brought me to this work. After completing my master’s, I was working as a researcher at the Access to Knowledge for Development Centre at the American University in Cairo. We were exploring how different regimes of knowledge production intersect with economic development and with global and societal inequalities. At the same time, I was personally grappling with menstrual health issues – severe cramps, pain, and everything that comes with it. I’d been using period tracking apps and was struck by how little transparency there was about how they worked and what data they were collecting.
I had already been thinking about surveillance and data privacy, especially in the context of ‘big data’, which was the buzzword at the time. I was concerned about how these apps commodified something so intimate, and how easily that data could be shared or sold. At the same time, I was reading a book that explored the history of reproductive knowledge – how women used to hold that knowledge, and how it was taken from them and medicalised. The book even talked about witch trials as part of that history.
Then I came across a ‘smart cup’, a menstrual product that collects data. And I remember thinking: how is this product supposed to fill the huge gap left by centuries of erasure? That was a pivotal moment. It pushed me to look more closely at period tracking apps – not just from a feminist critique of tech, but also from a deeply personal place.
Tech for people, not the other way around
The MCTD is an independent group of researchers at the University of Cambridge, within CRASSH – the Centre for Research in the Arts, Social Sciences and Humanities. We think critically about how power is distributed in digital societies and ask how technology can work for people, not the other way around.
Right now, I work on a large Horizon Europe project that aims to develop trustworthy AI tools for organisations tackling disinformation. My research focuses on how people perceive and use these technologies, and what trust means in these contexts. I also spend a lot of time collaborating across our research consortium – organising meetings, sharing insights with partners, giving talks, attending conferences and facilitating discussions about where meaningful intervention is possible.
Cycles, stories and data justice
The research that shaped my PhD was carried out during the COVID lockdown, with 30 participants in Austria. I used a participatory action research framework, which treats participants as experts in their own lives and tries to work with them, not just study them.
I also applied a data justice lens, which shifts the focus away from the technology itself and towards the social histories and power structures that technology interacts with. In this case, I was looking at how menstrual tracking apps connect to long-standing histories of menstrual stigma, reproductive control and the commodification of knowledge.
This kind of data holds enormous value because decisions about fertility and reproduction ripple out to affect everything from career paths and housing choices to family planning and personal well-being. Protecting it isn’t just about privacy — it’s about safeguarding the autonomy and futures of those who share it.
The wider digital economy runs on user data as a source of value. So, I wanted to understand how people navigate these platforms, how they think about the data they’re sharing, and what trade-offs they’re making — consciously or unconsciously.
Not just data – it’s about dignity
What surprised me was the huge response I got. So many people wanted to talk to me. Everyone I met who has a period either asked if their app was safe or shared something about their own experience – pain, joy, confusion, or frustration.
I had come to this topic from a concern about data privacy and commercial surveillance. But most of my participants came to their app through a different route – they were trying to understand their bodies. Many had felt dismissed by doctors, gone undiagnosed, or just wanted to make sense of their cycles. For them, the value these apps offered was much more tangible than the potential risks, which felt distant or abstract.
I realised this applied to me too – I used an app, even though I knew the risks. I tried to pick the safest one, but I still used it. And when I spoke to others, I saw the same: they weren’t necessarily uninformed that data was collected — they were making reasoned, context-based decisions, but they lacked knowledge of how that data collection was connected to potential risks and harms.
One thing that stood out was how differently people thought about medical versus commercial research. Many were open to their data being used in medical studies because they believed it could help others. But they didn’t trust commercial entities with the same data, even if both had access.
Consent shouldn’t be a click in the dark
This research fits into my broader interest in how data is accessed, commodified and used in a digital economy where user information is the product.
At the start, I held the assumption that people using these apps simply didn’t care about data privacy. But the more I spoke to users, the more I realised that wasn’t true. They knew data was being collected and were weighing the trade-offs in context; what they lacked was clear information about how that collection translates into concrete risks and harms.
I think we need to question that binary: either you care about privacy or you don’t. Instead, it’s about what users value, what they have access to, and how they assess harm. The way these apps – and tech platforms more generally – are structured makes meaningful consent almost impossible. You can opt in or out, but the options are limited, and the alternatives don’t always meet people’s needs.
Power, profit and periods
For me, the biggest challenge is the economic model that underpins the digital economy. While period apps differ in how they generate income – some rely on premium subscriptions, others sell products like temperature-tracking rings – they often operate within the same extractive logic.
User data is treated as a commodity in data capitalism, but users have very little say in how it’s used. Yes, there are privacy settings, but we could be doing so much more to build systems where people have agency. Right now, the idea of consent rests on individual consumer choices – whether or not to use a product. But that doesn’t amount to real control, especially when the alternatives are limited or the app feels essential.
More broadly, we’re operating in an ecosystem where a handful of tech billionaires have amassed enormous power, shaping the systems and values that govern our digital lives. Challenging that structure is difficult, but necessary.
Next steps and new directions
I’m excited about taking the work from my PhD and developing it further. I want to build frameworks for trust, especially in areas like FinTech and menstrual health apps.
I also want to explore how unsafe the internet can feel for women and gender-diverse people – how online harassment, surveillance and misinformation affect their ability to learn about their bodies, access support and make informed decisions.
And honestly, one of the things I loved most about my research was just talking to people about their cycles. Hearing what they’d learned, what they struggled with, what brought them joy – those conversations have stayed with me. That’s the kind of work I want to keep doing.