Facebook launches new artificial intelligence tool to help “revenge porn” victims

NEWS

Facebook is launching a new artificial intelligence tool that helps detect and delete intimate pictures and videos posted without the subject’s consent.

The machine learning tool will detect and take down intimate pictures and videos automatically, relieving victims of the burden of reporting them. Until now, inappropriate uploads had to be flagged by Facebook users before content moderators would review them.

For the service to be able to identify any unauthorized uploads, users will have to send their own intimate images to Facebook.
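In earlier pilots of this program, Facebook described converting submitted images into digital fingerprints ("hashes") for matching rather than keeping the photos themselves; the company has not published its actual algorithm. Purely as an illustration of the general idea, the toy "average hash" below shows how a perceptual hash lets a platform compare a new upload against stored fingerprints: visually similar images produce similar hashes, so near-copies can be caught without storing the original image.

```python
# Illustrative sketch only -- not Facebook's algorithm. A perceptual
# "average hash": each bit records whether a pixel is brighter than the
# image's mean brightness, so a re-upload with minor edits still matches.

def average_hash(pixels):
    """Hash a 2D grid of grayscale values (0-255) into an integer."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means visually similar images."""
    return bin(h1 ^ h2).count("1")

# A tiny 4x4 "image", a slightly brightened near-copy, and a very
# different image. The near-copy hashes to the same fingerprint; the
# different image lands far away.
original  = [[ 10,  10, 200, 200],
             [ 10,  10, 200, 200],
             [200, 200,  10,  10],
             [200, 200,  10,  10]]
near_copy = [[ 12,  12, 205, 205],
             [ 12,  12, 205, 205],
             [205, 205,  12,  12],
             [205, 205,  12,  12]]
different = [[200,  10, 200,  10],
             [ 10, 200,  10, 200],
             [200,  10, 200,  10],
             [ 10, 200,  10, 200]]
```

Real systems use far more robust fingerprints (and larger thumbnails), but the matching step is the same: compare the hash of each new upload against the stored fingerprints and escalate close matches to human review.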

Many users are reluctant to share their intimate photos with the social media giant because of its history of privacy failures. The tool is Facebook’s latest attempt to rid the platform of abusive content. Some of the company’s content moderators have reported developing post-traumatic stress disorder from reviewing such material.

The company’s new system is designed to find and flag pictures automatically, then send them to human moderators for review.

Social media sites across the board have had trouble containing abusive user-uploaded content, from violent threats to inappropriate photos.

Facebook has been heavily criticized both for letting inappropriate posts stay up too long and for deleting images of artistic or historical value.

Facebook has been expanding its moderation efforts, and the company hopes the new technology will help detect inappropriate posts.

The new technology, which will be used on both Facebook and Instagram, is trained on pictures Facebook has previously confirmed were “revenge porn.”

The technology also recognizes nearly nude photos in context: for example, a lingerie shot coupled with derogatory text, suggesting the uploader intended to embarrass or seek revenge on another user.
