NSFW Recognition

Content moderation solution for identifying NSFW (Not Safe For Work) sexual images


About

This solution analyzes images and classifies them into two distinct categories using AI-powered technology.

Not Safe For Work

The algorithm has recognized potentially offensive content in the image, which might not be appropriate for viewing in public places or at work. The solution provides a confidence level (as a percentage) to indicate how certain it is that the content is NSFW.

Safe For Work

The algorithm hasn't flagged any inappropriate content in the image, and it is highly likely that the image conforms to your community guidelines regarding sexual or similar content. The solution provides a confidence percentage to indicate the level of certainty that the content is SFW.

Demo

Try this solution out right now! Select one of the pictures below or upload your own image.

Bubble.io plugin integration

Step-by-step guide

Step 1: Install the plugin via the “Plugins” section in the development portal of your application.

Step 2: To add the “Check image for NSFW” action to your Bubble workflow, simply navigate to the “Plugins” section in the workflow editor and select the desired item from the list.

Step 3: Set the image source. You can choose to set a static image if you prefer, but it is more common to set a dynamic image source.

Example: The typical case is to use FileUploader’s value as the source for a “Dynamic image”.

Step 4 (optional): Adjust the strictness of the NSFW algorithm as desired. You can set values within the range of 0.0 (least strict) to 1.0 (most strict). You can explicitly set the strictness value (1.0 is the default) or use a value from a dynamic source, e.g., a Slider input.


For more details, see the “About Strictness” section below ↓

Step 5: Retrieve and use the returned values.

Example: Use the “NSFW probability” to update a Text element through a Custom state.


For more details, see the “Returned values” section below ↓

About “Strictness”

The term NSFW is not well defined: the same content may or may not be appropriate depending on context. For example, a photo of a woman in a bikini may be considered either SFW or NSFW. To cover these various scenarios, the API provides a strictness query parameter that controls how strict the algorithm should be.

By default, the algorithm is as strict as possible (strictness = 1.0), so even a photo of a woman in a bikini is classified as NSFW. If a looser policy better suits your needs, you can reduce the strictness, down to a value of 0.0.
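
For illustration, here is how the strictness parameter could be passed in a direct HTTP call, as a minimal Python sketch (it assumes the requests library and uses the same demo endpoint and sample image as the curl snippet at the end of this page):

import requests

IMAGE_URL = "https://storage.googleapis.com/api4ai-static/samples/nsfw-1.jpg"

# Compare how the same image is scored under different strictness settings;
# 'strictness' is the query parameter described above.
for strictness in (0.0, 0.5, 1.0):
    response = requests.post(
        "https://demo.api4ai.cloud/nsfw/v1/results",
        params={"strictness": strictness},
        files={"url": (None, IMAGE_URL)},  # multipart form field, like curl -F
    )
    print(f"strictness={strictness}: {response.json()}")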

Returned values

The “Check image for NSFW” action returns a set of values which can be used to obtain processing results or to handle errors:

  • nsfw (number) – The NSFW probability, ranging from 0.0 (safe) to 1.0 (not safe). A special negative value of -1.0 is used to indicate a processing error.

  • success (yes/no) – A boolean flag indicating whether the processing finished successfully.

  • message (text) – A message explaining the processing status.
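
These values imply a simple handling convention, sketched below in Python (the dictionary and its values are hypothetical; in a Bubble workflow you would read the values from the action's result instead):

def handle_result(result: dict) -> None:
    # Documented convention: check 'success' first and treat a negative
    # 'nsfw' value (-1.0) as a processing error; otherwise use the probability.
    if not result["success"] or result["nsfw"] < 0.0:
        print(f"Processing failed: {result['message']}")
        return
    # The 0.5 decision threshold is an illustrative choice, not part of the API.
    label = "NSFW" if result["nsfw"] >= 0.5 else "SFW"
    print(f"{label} (probability {result['nsfw']:.2f})")

handle_result({"nsfw": 0.93, "success": True, "message": "OK"})  # hypothetical values
handle_result({"nsfw": -1.0, "success": False, "message": "Failed to process image."})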

Looking for direct HTTP API integration?

Just copy and paste this snippet into your command-line terminal to try the API on your computer:
$ curl -X "POST" \
    "https://demo.api4ai.cloud/nsfw/v1/results" \
    -F "url=https://storage.googleapis.com/api4ai-static/samples/nsfw-1.jpg"
Or try the web sample right in the JSFiddle code playground.
Explore more code examples for various programming languages.