Artificial intelligence will soon weed out any NSFW photos a match sends you on Bumble. The dating app, which requires women to make the first contact, said it is launching a "Private Detector" to warn users about lewd images.

Bumble CEO Whitney Wolfe Herd and Andrey Andreev, CEO of the parent company whose apps include Bumble, Badoo, Chappy and Lumen, made the announcement Wednesday in a press release.

Beginning in June, all images sent on Bumble and the other apps will be screened by the AI-assisted Private Detector. If a photo is suspected of being lewd or inappropriate, users will be warned before opening it and given the option to view the image, block it, or report it to moderators.
Apr-25-2019, 11:49:46 GMT