The company, which launched as a female-focused dating app and has since expanded into networking for friends and jobs, announced Wednesday that it plans to introduce a feature in June that uses artificial intelligence to flag inappropriate images sent through direct messages. Recipients will be able to choose how to respond: view the image, block it, or report it to Bumble.
The image in question will be blurred, as all images sent through private messages on Bumble are. The company enabled photo sharing four years ago and requires recipients to hold down an image in order to view it. That decision was intended as a guardrail, giving recipients an explicit choice before any photo appears clearly in a chat. Bumble also watermarks all photos with the sender's image, a bid to hold people accountable for the images they share.
But Bumble's existing safeguards apparently aren't enough to stop the spread of lewd images, even though nudity and pornography are banned from its platform.
Bumble CEO and cofounder Whitney Wolfe Herd told CNN Business that the new feature, called Private Detector, is a "gesture" to show that the company is "desperately trying to build safety products to engineer a more accountable internet, not just talk about it."
The dating app landscape is crowded and competitive. Bumble has fought to differentiate its service by catering to women, addressing their pain points with online dating apps and introducing features intended to mitigate those issues.
Bumble was cofounded by Wolfe Herd and Russian billionaire Andrey Andreev in response to Wolfe Herd's experience at Tinder, where she was also a cofounder. As on Tinder, Bumble users swipe through prominent photos inside the app to find matches. But Bumble requires women to initiate the conversation when a match is made. Bumble has embraced its roots as a women-first product, letting that identity inform decisions about what behavior is and is not appropriate on its app. For example, it has banned mirror selfies and photos that show only children, and it has even publicly shamed a user who exhibited misogynistic behavior.
With 55 million users worldwide, Bumble said it has 5,000 content moderators fielding 10 million photos per day. The new system uses artificial intelligence trained to detect specific types of images, the same approach Bumble has used to ban images of guns on the platform since March 2018. At the time, Bumble said gun violence was not in line with its values and that weapons therefore don't belong on its platform. The move came in response to the mass shooting at a high school in Parkland, Florida, earlier that year.
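Bumble hasn't described how Private Detector works under the hood, but the flow the article outlines can be sketched in a few lines: a classifier scores each incoming photo, and images above a confidence threshold are flagged and handed to the recipient with the view, block, or report choice. Everything in the sketch below is hypothetical, including the stubbed `lewdness_score` classifier, the `FLAG_THRESHOLD` value, and the function names.

```python
# A minimal sketch, assuming a confidence-scored image classifier; this
# is not Bumble's actual code, and the threshold and names are invented.
from dataclasses import dataclass

FLAG_THRESHOLD = 0.95  # hypothetical confidence cutoff for flagging


@dataclass
class DeliveredPhoto:
    blurred: bool                      # every DM photo arrives blurred
    flagged: bool                      # True if the classifier flags it
    recipient_options: tuple[str, ...]


def lewdness_score(image_bytes: bytes) -> float:
    """Stand-in for a trained image classifier.

    A production system would run a neural network here and return the
    model's confidence that the image is inappropriate.
    """
    return 0.0  # placeholder score


def deliver_dm_photo(image_bytes: bytes) -> DeliveredPhoto:
    score = lewdness_score(image_bytes)
    if score >= FLAG_THRESHOLD:
        # Flagged photos stay blurred, and the recipient chooses what
        # happens next: the view/block/report options the article
        # describes for Private Detector.
        return DeliveredPhoto(True, True, ("view", "block", "report"))
    # Unflagged photos are still blurred until the recipient holds
    # them down, matching Bumble's existing hold-to-view guardrail.
    return DeliveredPhoto(True, False, ("view",))
```

The notable design choice, at least as the article describes it, is that the model never removes anything on its own; it only changes which options the recipient is offered.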
The latest announcement comes as Wolfe Herd advocates for the passage of legislation in Texas that would make nonconsensual sharing of lewd images a misdemeanor offense. The bill was introduced in the Texas House of Representatives earlier this year. Bumble, founded in 2014, is based in Austin, Texas.
Wolfe Herd argues that Bumble can do its part to try to protect people from viewing lewd photos, but its reach only goes so far.
"We cannot promise you that if we ban someone on Bumble, that they will not go use another system," she said, in speaking of the importance of legislation criminalizing indecent exposure online.
The Private Detector feature will also be added to global dating apps Badoo, Chappy, and Lumen, which were founded by Andreev.