NSFW Recognition

Content moderation solution for identifying NSFW (Not Safe For Work) sexual images


About

This solution analyzes images and classifies them into two distinct categories using AI-powered technology.

Not Safe For Work

The algorithm has recognized potentially offensive content in the image, which may not be appropriate for viewing in public places or at work. The solution provides a confidence level (as a percentage) indicating how certain it is that the content is NSFW.

Safe For Work

The algorithm hasn't flagged any inappropriate content in the image, and it is highly likely that the image conforms to your community guidelines regarding sexual or similar content. The solution provides a confidence percentage indicating how certain it is that the content is SFW.
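On the application side, the reported probability can be turned into one of the two labels with a simple threshold. A minimal sketch (the function name and the 0.5 threshold are illustrative choices, not part of the API):

```python
def classify(nsfw_probability: float, threshold: float = 0.5) -> tuple:
    """Map an NSFW probability in [0.0, 1.0] to a label plus a confidence percentage."""
    if nsfw_probability >= threshold:
        return ("NSFW", round(nsfw_probability * 100, 1))
    return ("SFW", round((1.0 - nsfw_probability) * 100, 1))

# e.g. classify(0.92) -> ("NSFW", 92.0)
```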

Demo

Try this solution out right now! Select one of the pictures below or upload your own image.

Bubble.io plugin integration

Step-by-step guide

Screenshot of a computer screen showing a webpage with instructions to install a plugin named API4AI NSFW Image Recognition, with blue arrows pointing to the 'Add plugins' button at the top and the instructions, and text labels saying 'Install plugin', 'Read instructions', and 'Ask a question'.

Step 1: Install the plugin via the “Plugins” section in the development portal of your application.

Screenshot of a user interface showing a menu with options such as Account, Navigation, Data, Email, Payment, Analytics, Element Actions, Plugins, and Custom Events. An arrow with text instructs to open the 'Plugins' section, and another arrow indicates to check the image for an NSFW action.

Step 2: To add the “Check image for NSFW” action to your Bubble workflow, simply navigate to the “Plugins” section in the workflow editor and select the desired item from the list.

A screenshot of a software interface showing options to set an image source, including uploading a static image, selecting a dynamic image, or searching for external sources.

Step 3: Set the image source. You can choose to set a static image if you prefer, but it is more common to set a dynamic image source.

Screenshot of a software interface showing options for setting a file uploader state, with an example of setting the value as 'Dynamic image' and a blue arrow pointing to the text.

Example: The typical case is to use FileUploader’s value as the source for a “Dynamic image”.

Screenshot of a software interface for checking images for NSFW content, showing controls for adjusting algorithm strictness, with a blue arrow pointing to the strictness slider and instructional text explaining how to set the values between 0.0 and 1.0.

Step 4 (optional): Adjust the strictness of the NSFW algorithm as desired. Values range from 0.0 (least strict) to 1.0 (most strict). You can set the strictness value explicitly (1.0 is the default) or take it from a dynamic source, e.g., a Slider input.


For more details, see below ↓

A computer screen showing a user interface for setting a state, with dropdown menus and a text box displaying instructions to use 'NSFW probability' to update the Text element via Custom state.

Step 5: Retrieve and use the returned values.

Example: Use the “NSFW probability” to update a Text element through a Custom state.


For more details, see below ↓

About “Strictness”

The term NSFW is not well defined: the same content may or may not be appropriate depending on context. For example, a photo of a woman in a bikini may be considered either SFW or NSFW. To cover these different scenarios, we introduce a strictness query parameter that controls how strict the algorithm should be.

Woman in a teal bikini with sunglasses standing outdoors with palm trees in the background.

strictness = 1.0 (default)

A woman in sunglasses and a bikini standing outdoors near palm trees.

strictness = 0.0

By default, the algorithm is as strict as possible (strictness 1.0), so even a photo of a woman in a bikini is considered NSFW. You can reduce the strictness (down to 0.0) if that better suits your needs.
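For direct HTTP calls, the strictness is passed as a query parameter on the request URL. A sketch of building such a URL for the demo endpoint (the parameter name `strictness` comes from the text above; the helper function is illustrative):

```python
from urllib.parse import urlencode

BASE_URL = "https://demo.api4ai.cloud/nsfw/v1/results"

def results_url(strictness: float = 1.0) -> str:
    """Build the results URL with a strictness parameter (0.0 = least strict, 1.0 = default)."""
    if not 0.0 <= strictness <= 1.0:
        raise ValueError("strictness must be within [0.0, 1.0]")
    return f"{BASE_URL}?{urlencode({'strictness': strictness})}"
```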

Returned values

The “Check image for NSFW” action returns a set of values that can be used to obtain processing results or to handle errors:

  • nsfw (number) – The NSFW probability, ranging from 0.0 (safe) to 1.0 (not safe). The special negative value -1.0 indicates a processing error.

  • success (yes/no) – A boolean flag indicating whether the processing finished successfully.

  • message (text) – A message explaining the processing status.
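A small sketch of how these three values might be consumed after the action runs. The dictionary shape below is an assumption used only for illustration; in a Bubble workflow you would read the action's result fields directly:

```python
def describe_result(result: dict) -> str:
    """Turn the action's returned values into a human-readable status line."""
    nsfw = result.get("nsfw", -1.0)
    # A missing success flag or the special -1.0 value both signal a processing error.
    if not result.get("success") or nsfw < 0.0:
        return f"Processing failed: {result.get('message', 'unknown error')}"
    return f"NSFW probability: {nsfw:.2f}"
```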

Looking for direct HTTP API integration?

Just copy and paste this snippet into your command-line terminal to try the API on your computer:
curl -X "POST" \
  "https://demo.api4ai.cloud/nsfw/v1/results" \
  -F "url=https://storage.googleapis.com/api4ai-static/samples/nsfw-1.jpg"
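The same request can be sketched in Python using only the standard library. Note that curl's -F flag sends multipart form data, while the sketch below uses URL-encoded form data, which is an assumption; the structure of the returned JSON is also not documented here:

```python
import json
import urllib.request
from urllib.parse import urlencode

API_URL = "https://demo.api4ai.cloud/nsfw/v1/results"

def check_image(image_url: str) -> dict:
    """POST an image URL to the NSFW endpoint and return the parsed JSON response."""
    data = urlencode({"url": image_url}).encode()
    request = urllib.request.Request(API_URL, data=data, method="POST")
    with urllib.request.urlopen(request, timeout=30) as response:
        return json.load(response)

if __name__ == "__main__":
    sample = "https://storage.googleapis.com/api4ai-static/samples/nsfw-1.jpg"
    print(check_image(sample))
```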
Or try the web sample right in the JSFiddle code playground.
Explore more code examples for various programming languages: