Bumble makes cyberflashing detection tool available as open-source code


Cyberflashing is the act of sending non-consensual nude photos to someone’s phone. It is an act of digital sexual violence that is often downplayed by terms such as “unsolicited dick pics”. Dating app Bumble has been fighting the violation on its own platform while campaigning for the act to be made illegal in the UK and US.

In 2019, Bumble launched its artificial intelligence ‘Private Detector’ tool, which alerts users when they receive an obscene photo and automatically blurs the image.

Now the dating app is making a version of the tool available to the wider tech community.


So how exactly does the tool work? The AI detects when an obscene image has been sent. The photo is then automatically blurred, and the user can choose to view, delete, or report the image. The feature launched on both Bumble and Badoo in response to the rise of cyberflashing.
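For a concrete sense of that flow, here is a minimal, hypothetical sketch in Python: a classifier scores the incoming photo, and anything above a threshold is blurred before the recipient decides what to do with it. The function names, threshold, and blur radius below are illustrative assumptions, not Bumble’s actual Private Detector code.

```python
# Hypothetical sketch of the detect -> blur -> user-choice flow described above.
# The classifier stub, threshold, and blur radius are illustrative assumptions,
# not Bumble's Private Detector implementation.
from PIL import Image, ImageFilter

LEWD_THRESHOLD = 0.5   # assumed decision threshold, for illustration only
BLUR_RADIUS = 30       # assumed blur strength, for illustration only


def predict_lewd_probability(image: Image.Image) -> float:
    """Placeholder for a trained image classifier returning P(image is lewd).

    A real implementation would run a trained model here; this stub always
    flags the image so the blur path can be exercised.
    """
    return 1.0


def handle_incoming_photo(path: str) -> tuple[Image.Image, bool]:
    """Return the image to display and whether it was flagged and blurred."""
    image = Image.open(path)
    if predict_lewd_probability(image) >= LEWD_THRESHOLD:
        # A heavy Gaussian blur hides the content until the recipient
        # explicitly chooses to view, delete, or report it.
        return image.filter(ImageFilter.GaussianBlur(radius=BLUR_RADIUS)), True
    return image, False
```

The point of the sketch is simply the blur-then-ask pattern; in the production feature the scoring is done by a trained model rather than a stub.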

Research for a British government report from March found that 76% of girls aged 12-18 had received unsolicited nude images from boys or men. Bumble’s own research found that almost half of people aged 18-24 have received a non-consensual sexual image, and 95% of people under 44 in England and Wales say more should be done to stop cyberflashing.


Other apps have also taken steps to combat this form of image-based sexual violence. In 2017, OkCupid announced that all users would be required to take an anti-harassment pledge stating that they would not send unsolicited nude photos on the app. More recently, Mashable reported that Instagram is working on a nudity protection feature.

Bumble has made a version of Private Detector widely available on GitHub so that other technology companies can adapt it and build their own features to improve online safety and accountability in the fight against abuse and harassment.

Rachel Haas, VP of Member Safety at Bumble, said in a statement, “Open-sourcing this feature is about standing firm in our belief that everyone deserves healthy and fair relationships, respectful interactions, and friendships online.”

Bumble has campaigned against cyberflashing in the US and UK for the past few years. In 2019, the app’s founder and CEO Whitney Wolfe Herd helped pass a law in Texas (HB 2789) making the sending of non-consensual nude images a punishable offence. Since then, the dating app has helped push through similar bills in Virginia (SB 493) and California (SB 53).

In England and Wales, Bumble has campaigned for the criminalisation of cyberflashing, and in March 2022 the government announced that it would become a criminal offence under new laws to be introduced, with perpetrators facing up to two years in prison.

It’s time for the wider tech community to take this violation more seriously and do more to combat it.
