Caught slipping: Cybercriminals outsmarting legislation on ‘revenge porn’
- Social media sites like Facebook are better prepared to tackle image-based criminal behaviour than the law itself.
- The cost of cybercrime to targeted organisations has increased 72% in the last five years, according to Accenture Security.
- The ‘revenge porn’ law in the UK hasn’t been updated since 2015.
When it comes to advocating for targets of ‘revenge porn’ or image-based sexual abuse, “the civil law in the UK is ahead of the criminal law,” says David McClenaghan, Head of Abuse at Bolt Burdon Kemp. Speaking at a roundtable video series held by the specialist law firm, McClenaghan explains that the criminal law is woefully inadequate in helping survivors of image-based sexual abuse to come forward.
Every act of image-based sexual abuse can cause harm
Section 33 of the Criminal Justice and Courts Act 2015 covers the disclosure of private sexual photographs and films with intent to cause distress. As McClenaghan explains, in 2015 his firm established a precedent in the civil law – that the nature of some acts, including image-based sexual abuse, is such that it is simply not necessary to prove that the perpetrator intended to cause distress. In certain cases, distress to the target of the abuse is always implicit. The firm’s precedent-setting case meant the civil law became better geared towards the lived experience of victims than the criminal law as it stands today.
Cybercriminals more tech savvy than cyber legislation
But even the civil law is no match for cybercriminals. The Ninth Annual Cost of Cybercrime Study, published by Accenture and the Ponemon Institute, found that cyberattacks are evolving rapidly, with the cost of cybercrime to targeted organisations increasing 72% in the last five years. The average cost of cybercrime was $13 million (£9.85 million) in 2018. Looking at individual countries, the United States, Japan, Germany and the United Kingdom spent the most on recovering from cybercrime, with the United Kingdom seeing the highest year-on-year growth, at 31%.
Image-based sexual abuse has become an overwhelmingly digital crime. In 2018, the Law Commission stated that reform of the law is needed to protect victims from online abuse, with the current law inadequate to protect people harmed by abusive online communications, social media harassment and misuse of private images. The current law, it argued, lacks coherence, denies targets of the abuse their basic rights and fails to reflect their lived experiences.
With legislation flailing, cybercriminals are thriving. As Jeff Bezos’ recent battle with the National Enquirer made clear, our private photos and text messages are only private up until the point at which criminals choose to pursue us. Consider also the criminal possibilities of ‘deepfake’ pornography, an AI-powered crime in which a real person’s likeness is used to create a virtual simulation, often in a pornographic context. While celebrities have traditionally been the targets of this ‘fake pornography’ crime, the software is becoming more accessible, and everyday people are finding themselves in the crosshairs.
Social media: where crime is born – and fought
These technological advances – as well as the popularity of social media – mean that there are more ways than ever for a person to commit an act of image-based sexual abuse and to terrorise their targets. Social media platforms are now used both to spread private images of an individual without their consent and to anonymously contact the individual for blackmail, extortion or other threatening behaviour. The law urgently needs to change to reflect these new ways of committing crimes.
Plus, with social media platforms being used to commit crimes, it makes sense that the crimes should be fought there too. Fortunately, Facebook, Twitter and Instagram – and search engines like Google – have taken notice. In 2015, Google introduced a feature allowing users to request that their images be removed from search results. In 2017, Facebook added a reporting function specifically for users to report nude photos of themselves, and in 2019 it developed a machine learning and AI-powered tool to detect nude or near-nude images on both Facebook and Instagram and flag them for review before they are posted. All are encouraging steps that recognise a burgeoning digital threat.
The rapid advancement and evolution of technology should be no excuse when it comes to ensuring that the targets of image-based sexual abuse are protected in a way that befits any victim of any crime. A concerted effort and an all-encompassing approach are needed when updating legislation and deciding next steps for the proper protection of, and justice for, targets of image-based sexual abuse.
https://www.boltburdonkemp.co.uk/adult-abuse/revenge-porn-is-sexual-abuse/