0 votes

I have a Rails 5 app that allows users to upload profile images using the new ActiveStorage, with AWS S3 as the storage service.

I've been searching for a way to detect inappropriate content / explicit images in the uploads so I can prevent users from displaying them on their accounts, but I'm not sure how to accomplish this.

I don't want to have to moderate the uploads myself. I know there are ways to let users "flag as inappropriate", but I would prefer not to allow explicit content to be uploaded at all.

I figure the best solution would be for the Rails app to detect the explicit content and put in a placeholder image instead of the user's inappropriate image.

One idea was AWS Rekognition. Has anybody successfully implemented a solution for this problem?
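Something along the lines of the sketch below is what I'm imagining, using the aws-sdk-rekognition gem against the object ActiveStorage puts in S3 (the `ExplicitContentChecker` class, the bucket env var, the confidence threshold, and the placeholder path are just placeholders I made up, not working code):

```ruby
# Gemfile: gem "aws-sdk-rekognition"
require "aws-sdk-rekognition"

# Hypothetical service object; class name and threshold are placeholders.
class ExplicitContentChecker
  MIN_CONFIDENCE = 75.0

  def initialize(client: Aws::Rekognition::Client.new)
    @client = client
  end

  # True if Rekognition returns any moderation labels for the S3 object.
  def explicit?(bucket:, key:)
    resp = @client.detect_moderation_labels(
      image: { s3_object: { bucket: bucket, name: key } },
      min_confidence: MIN_CONFIDENCE
    )
    resp.moderation_labels.any?
  end
end

# After the upload, check the stored object and swap in a placeholder.
checker = ExplicitContentChecker.new
if checker.explicit?(bucket: ENV["S3_BUCKET"], key: user.avatar.key)
  user.avatar.purge
  user.avatar.attach(
    io: File.open(Rails.root.join("app/assets/images/placeholder.png")),
    filename: "placeholder.png",
    content_type: "image/png"
  )
end
```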

2 Answers

0 votes

Why does it have to be so complex? Most websites simply allow users to flag content as inappropriate, and an admin then reviews it and removes it if needed. That would take very little time to implement. If you find it inadequate after you've built it, you can try something else, but I wouldn't jump straight to the most complex and expensive answer to the problem.
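As a rough sketch of how little that takes (the model, column, and controller names here are just examples, not something from your app):

```ruby
# Example migration: add a "flagged" state to whatever model owns the image.
class AddFlaggedToProfileImages < ActiveRecord::Migration[5.2]
  def change
    add_column :profile_images, :flagged, :boolean, default: false, null: false
  end
end

# Any signed-in user can flag an image; admins review flagged ones later.
class FlagsController < ApplicationController
  def create
    image = ProfileImage.find(params[:id])
    image.update(flagged: true)
    redirect_back fallback_location: root_path, notice: "Thanks, we'll review it."
  end
end

# In the view, hide flagged images behind a placeholder until reviewed:
# image.flagged? ? image_tag("placeholder.png") : image_tag(image.file)
```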

0 votes

You can implement an AWS Lambda function in Ruby that is triggered every time an image is uploaded to S3. That Lambda function can then call AWS Rekognition. You can read about how to use AWS Rekognition to detect unsafe images here: https://docs.aws.amazon.com/rekognition/latest/dg/procedure-moderate-images.html.
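A rough sketch of what that Lambda handler could look like with the Ruby runtime (the confidence threshold and the object-tagging reaction are just examples, not the only way to react):

```ruby
# lambda_function.rb - triggered by S3 "ObjectCreated" events
require "aws-sdk-rekognition"
require "aws-sdk-s3"

REKOGNITION = Aws::Rekognition::Client.new
S3 = Aws::S3::Client.new

def handler(event:, context:)
  event["Records"].each do |record|
    bucket = record["s3"]["bucket"]["name"]
    key    = record["s3"]["object"]["key"]

    resp = REKOGNITION.detect_moderation_labels(
      image: { s3_object: { bucket: bucket, name: key } },
      min_confidence: 75  # example threshold
    )

    next if resp.moderation_labels.empty?

    # Example reaction: tag the object so the Rails app knows to hide it
    # or replace it with a placeholder.
    S3.put_object_tagging(
      bucket: bucket,
      key: key,
      tagging: { tag_set: [{ key: "moderation", value: "unsafe" }] }
    )
  end
end
```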

I've used Rekognition before, but for a different use case. You can find out more here: https://github.com/johannesridho/aident