Writing by Erandhi Mendis // photograph by Kelli McClintock
Every three months or so, I seem to have a conversation with a man (at work, at a party, at dinner – anywhere) about the fear of being female. It comes up organically – how so much of life in our bodies is spent thinking about safety. Where someone else would walk home after an evening catch up without a second thought, I am calculating which route has more streetlights and foot traffic. Can I get a friend to drive me back? Can I take a taxi? What risks does taking a taxi pose? Can I be on the phone to someone? Who knows where I am?
The concept of “safe” has often plagued my existence in a myriad of ways, most of all in that I now think about my own caution without really thinking about it. Reckoning with a world that may never feel completely secure has shaped my thought patterns and decisions to the point of incognito operation. My friends and I often discuss this, as the unfortunate truth is: more often than not when we catch up, one of us will have experienced something that evokes fear. We will discuss it, get mad, get sad, and then before we’ve even taken another breath to work out a solution – another experience is quickly tallied upon the cumulative history of women simply existing.
The frequency of unsavoury interactions feels magnified amongst a technological revolution – particularly one that has changed the way the majority of people meet.
Dating at the best of times is difficult. But with the advancement of the digital landscape comes a plethora of complications. An increased paradox of choice makes it harder to discern genuine connections, and an abundance of potential partners can lead to a “grass is always greener” mindset.
But, as a woman navigating the realm of dating in 2023, the overarching concern is often the abundance of risk that comes with online anonymity – and often the false sense of security that can exist from behind a screen in the comfort of your own home. Whether it’s verifying the authenticity of profiles, conducting thorough background checks, or meeting in public spaces – there are a million small things that I know my friends and I do on autopilot – we have been raised to assume concern, and rightly so. For the same number of people who use these platforms with genuine intentions, there are always an equal share who exploit the digital space.
Last year, I had someone exhibit low-grade ‘stalkeresque’ behaviour on a dating app. I did my digital due diligence and then reported them. I have no way of knowing if they were removed from the platform, given a warning or perhaps nothing at all eventuated from it. My scepticism led me to think all reports from these kinds of apps just land in a big pile and people continue on with their day. How much can a dating app really keep you safe?
But a few months ago, someone I know was forcibly removed from a dating app. They hadn’t been messaging anyone, and their swiping patterns could not have resembled a bot. It seemed like a bizarre glitch. We rationalised it by the fact that we were travelling – surely this was an accident? They were offered no explanation – and most interestingly, no appeal process. Given this can happen to people who barely use the app, it’s a hard-line approach – and in some ways, that caused me more concern. Much like how apps such as Instagram will remove content based on algorithms rather than human review, the immediacy and lack of transparency of dating app reports deepened my distrust. I wouldn’t rely on a machine-learning detective to review abuse in real life. But online abuse occurs in such colossal numbers that, within our current system, it doesn’t appear feasible for qualified humans to reasonably review each case that gets reported – let alone the ones that never make it that far.
My friend’s app reactivated briefly upon their return to Australia – a sign that it had been a glitch. Not for long, though: a day or two later, the login function no longer worked. While I am aware that technology is always evolving and far from perfect, it made me think about how we use these platforms with the assumption that they can reflect how we expect to be treated in real life. Discussing this led other people to confide their experiences with censorship and shadow-banning. The number of women and queer people who are excised from apps they barely use is demonstrative of a different kind of tech-based abuse. So long as the custodians of the digital world remain largely hidden and imprecise, regular people existing on the web can simply become collateral in a larger moderation strategy. At the end of the day, dating apps wear the same invisibility cloaks that traditional social media behemoths do: algorithms make decisions – not humans.
On the other hand, despite the obvious flaws in a digital morality system that largely exists to keep humans safe (and help them find genuine connections), if the numbers are anything to go by, a hard-line, automated approach to online dating reports might help more than it hinders.
Last year the Australian Institute of Criminology published a report with some horrifying (albeit not surprising) figures. The study surveyed 10,000 adults, examining a range of tech-facilitated sexual violence: verbal abuse, unwanted pictures, threatening language, in-person violence and more. The findings showed more than 70% of respondents had experienced online sexual violence, harassment or aggression. Given dating app usage has increased exponentially over the past few years, the stats are particularly devastating.
The concept of stranger danger was introduced to me as a child exploring MSN and Myspace. Largely unmoderated virtual chat worlds such as Club Penguin, Habbo Hotel and RuneScape created formative experiences around what it meant to be anonymous online. Back then, I don’t think the concept of a “report” button even existed. Would we have even known how to use it?
Today you can report people on all the big apps, but I was still curious: why, when people get kicked off, aren’t they given a reason? Doesn’t it make sense to reprimand and educate? Or, if a mistake has been made, to provide an avenue for appeal?
It turns out there is a very obvious reason: to keep the person who reports safe.
If dating apps told potentially harmful people exactly why they were reported, it is very likely that through deduction they would be able to figure out who reported them. Dating apps don’t know when you’ve met up or whether you’ve exchanged details – popular apps such as Tinder, Hinge or Bumble have no way of knowing how many potential suitors have your personal contact details or address. What is stopping someone dangerous from coming to your home even after you’ve reported them? So the apps err on the side of caution: nobody gets to know why they’ve been reported, in order to protect the reporter.
From a practical standpoint, this makes a lot of sense. A mobile application is designed with user experience in mind, and the safety of the user is key. But there are obvious holes in what happens to the report: if it is serious, can an algorithm discern criminal behaviour? Does the responsibility fall on the app or the reporter to share online sexual violence with law enforcement? Would it make a difference, given the structure of our judicial system?
I came away from most of these conversations with more questions than I had answers. Much like in the real world, it’s obvious that there is no single piece of policy or legislation that can protect users online, be it from harassment or shadowbanning. Do I wish we lived in a world where there was equal effort to educate and rehabilitate harmful behaviour as there was to offer safety solutions and reporting functionality? If only. Am I still pleased we’ve graduated from the days of Habbo Hotel? Yes.
The thing is, you don’t know what you don’t know. I recently learned that Tinder had partnered with WESNET – the national peak body for specialist women’s domestic and family violence services – on the rollout of a new Dating Safety Guide. If you’re a Tinder user, you may have even seen an in-app safety campaign roll out on your device. WESNET has a long history of advocating for clients who are subject to tech-facilitated abuse, and in many ways Tinder Australia’s approach of considered collaboration seems unique in the landscape. All dating app providers should be in conversation with the services their users engage with in a crisis. What’s more, Tinder was open to hearing my thoughts (good and bad) on ways we can improve experiences – this felt somewhat revolutionary; a no-holds-barred discussion centred entirely on how to make online dating safer, with the singular goal of hearing about diverse experiences.
It might be a broad assumption, but I’d bet most users do not know about all the features available to them. I quizzed a few of my peers who use dating apps – they were all pleasantly surprised to learn that an app had more than even one safety feature. In the same way that I assume risk in the real world, I approach the digital space with a grain of salt. While I hope a universe that doesn’t require such high levels of vigilance exists one day, for now, the fact that dating app companies are actively partnering with frontline organisations and thinking about safety is a comforting thought.
Chanel Contos, sexual consent advocate and the founder of Teach Us Consent, said the guide isn’t just a reminder to be respectful online, “it also makes sure users know that there are safety features there to support and protect them, and that we encourage their use. This is an important initiative that addresses pertinent problems and helps educate Australians on dating decorum.”
Contos, alongside WESNET and Tinder, informed the final output of the guide, which can be accessed online.
The question in my mind still stands: how much can technology feasibly keep us safe? On a planet where more than 2 million women and counting have experienced sexual violence, it feels like a herculean task to safeguard hidden corners of the internet when we haven’t yet managed to make the world feel comfortable to walk in.
But, many hands make light work – and it is a cause worth working on. With the advancements in artificial intelligence and natural language processing, it’s imperative that we drive awareness of what is available to us should we want to date online, what tools are on offer and what safety features exist. Hand in hand, we should be critiquing the way all mobile app functionality works because there isn’t necessarily a human reviewing activity behind a screen. Is something functional if it can remove people who are doing no harm in the first place? We should be discussing data commodification, digital footprints and the fact that machines can only work with what we have given them – so if and when we give them power to police, we should have a robust understanding of how they wield that.
One day I hope I will make peace with the way caution is built into womanhood – that eventually, instead, I will write about the sweeping progress of unlearning abuse culture. That young men may grow up knowing right from wrong, and that 18-year-olds who seek connection on the internet will only have positive experiences. It’s a utopia for the cynic in my mind, but until then: I am glad dating apps have safety features, yet concerned that powerful algorithms are cagey and inconsistent. Multiple things have to hold true in the complicated world we live in. Amongst all of it, my sincerest hope is that you get to match with people far more than you ever have to report them.