The non-consensual sharing of intimate images on Telegram is just the tip of the iceberg

A recently published article on BBC News has shone a light on the growing and disturbing trend of women’s nude photographs being shared without consent on the instant messaging app Telegram. Whilst the sharing and dissemination of intimate images without the consent of all parties involved is, sadly, not a new phenomenon, the fact that it is becoming increasingly common – and across a multitude of different platforms – highlights that this issue will only get worse unless there is a concerted effort from these platforms to tackle it.

The sharing of intimate images without consent – more colloquially known as “revenge porn” – has, in recent years, come to be recognised as a specific form of abuse (“image-based sexual abuse”, or “IBSA”). Whilst there are many reasons somebody might non-consensually share images of another person beyond simply “revenge”, the consequences for the victim are often the same regardless: feelings of fear, shame, anxiety and extreme distress are all common. Following the non-consensual taking and sharing of intimate images, one woman told the BBC: “I can’t recover. I see therapists twice a week… They say there is no progress so far. They ask if I can forget it, and I say no.”

Whilst it may be easy to view IBSA in isolation and as being unrelated to the wider commercial sex industry and the exploitation therein, with a little digging the connections between the two become abundantly clear.

Sara, who was also interviewed by the BBC, explains the impact on how people perceived her following the non-consensual sharing of images: “They made me feel like I was a prostitute because [they believed] I’d shared intimate pictures of myself. It meant I had no value as a woman.”

This view of women within the commercial sex industry as being totally without “value”, or as being “less than human”, is common. Within prostitution specifically, for example, research demonstrates that the very act of purchasing sexual access is linked to drastically reduced empathy for the women involved, as well as to the idea that once sexual access has been purchased, the woman is obligated to do whatever is demanded of her. These attitudes stand in direct opposition to the idea that any sexual relationship should be based on mutuality and respect and, most importantly, on the recognition of the other person as fundamentally deserving of that mutuality and respect – something almost never afforded to women within the sex trade.

This dehumanisation permeates our culture far beyond the immediate confines of the sex trade. A substantial body of research highlights that the attitudes and mindsets encouraged by consuming pornography do in fact “trickle down” into everyday interactions between users and their peers. Viewing pornography encourages men to view women as disposable, and is linked to increased support for the ideas that underpin rape culture and rape myths, as well as to a higher likelihood of committing acts of sexual violence.

The non-consensual sharing of intimate images has become a thriving marketplace, where abuse and exploitation are traded for profit. As the BBC article explains: “Disturbingly, while we were investigating these groups, an account from Russia also tried to sell us a folder containing child abuse videos for less than the price of a coffee.”

Profiting from illegal images and videos has become big business for exploiters across the world, particularly given the largely unregulated nature of the internet, including instant messaging apps such as Telegram. As the BBC reports, Telegram does not have a dedicated policy to tackle the non-consensual sharing of intimate images, but its terms of service make users agree “not to post illegal pornographic content on publicly viewable Telegram channels, bots, etc”.

Coupled with the almost entirely unregulated nature of the wider online sex industry, this is a recipe for disaster. The UK Government has thankfully started to recognise this issue, and has made a concrete commitment to tackling online harms through the upcoming Online Safety Bill. But whilst there have been many positive steps – including reintroducing plans for online age verification for adult websites – there is a danger that, unless the Government recognises the commercial pornography industry specifically as a hotbed of harm and exploitation, any other steps will simply be a drop in the ocean.

It is vital that the Government tackles the sprawling and dangerous online commercial sex industry as a problem deserving of its own focus; furthermore, social media sites and instant messaging apps must recognise the role they play in facilitating illegal and abusive content. It is unconscionable that these tech giants deflect all responsibility with proclamations about wishing to protect their users’ privacy while having little to no safeguarding in place to prevent such criminal activity. What about the privacy of the women and children whose abuse is profited from? Are they not deserving of the same concern and protection? At CEASE, we say they are; and looking to the future, combating this behaviour must be not only a concern, but a priority.