I have a Rails 5 app that allows users to upload images to their profiles using the new ActiveStorage framework, with AWS S3 as the storage service.
I've been searching for a way to detect inappropriate content / explicit images in the uploads so I can prevent users from displaying them on their profiles, but I'm not sure how to accomplish this.
I don't want to have to moderate the uploads manually. I know there are ways to let users "flag as inappropriate", but I would prefer to not allow explicit content to be uploaded at all.
I figure the best solution would be for the Rails app to detect the explicit content and substitute a placeholder image for the user's inappropriate image.
One idea is AWS Rekognition's image moderation. Has anybody successfully implemented a solution to this problem?
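Here is roughly what I had in mind, sketched with the `aws-sdk-rekognition` gem's `detect_moderation_labels` call against the uploaded blob's bytes. The helper names (`moderate_upload`, `explicit?`) and the confidence thresholds are just my own assumptions, not a tested implementation:

```ruby
# Assumes `gem "aws-sdk-rekognition"` in the Gemfile (Rails loads it
# via Bundler, so no explicit require is needed in app code).

# Pure decision step: true if any moderation label Rekognition returned
# meets our confidence threshold. Labels respond to #name and #confidence.
def explicit?(labels, min_confidence: 80.0)
  labels.any? { |label| label.confidence >= min_confidence }
end

# Ask Rekognition to scan an ActiveStorage blob.
# `blob.download` returns the stored file's contents as a binary string.
def moderate_upload(blob, client: Aws::Rekognition::Client.new)
  response = client.detect_moderation_labels(
    image: { bytes: blob.download },
    min_confidence: 60 # only return labels Rekognition is >= 60% sure about
  )
  explicit?(response.moderation_labels)
end
```

The idea would be to call `moderate_upload` from a callback or background job after attachment, and if it returns `true`, swap in the placeholder instead of the user's image. I haven't verified the cost or latency implications of calling Rekognition on every upload.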