This tool is an open-source machine learning image filter, built around Stable Diffusion's safety checker, designed to quickly and reliably flag images that contain content that is not safe for work (NSFW).
It is powered by open-source models and works with any image, not just AI-generated ones. The tool provides a simple user interface: users can upload or drag and drop PNG or JPG images up to 50MB in size.
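The upload constraints above (PNG or JPG, at most 50MB) can be sketched as a small validation function. The function name and exact checks here are assumptions for illustration, not the tool's actual implementation:

```python
import os

MAX_UPLOAD_BYTES = 50 * 1024 * 1024  # the 50MB limit described above
ALLOWED_EXTENSIONS = {".png", ".jpg", ".jpeg"}

def validate_upload(filename: str, size_bytes: int) -> bool:
    """Return True if the file looks like an acceptable upload."""
    ext = os.path.splitext(filename)[1].lower()
    return ext in ALLOWED_EXTENSIONS and 0 < size_bytes <= MAX_UPLOAD_BYTES
```

A real service would also sniff the file's magic bytes rather than trust the extension, but the size and format gate is the core of it.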
The underlying safety checker is poorly documented, but it can be adapted with a few small changes. The tool was built by @m1guelpf, an experienced software developer, and makes it easy to identify NSFW content in images so that it can be avoided or removed.
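One common "few changes" modification is to swap the pipeline's safety checker for a stand-in with the same call shape. This is a hedged sketch assuming the diffusers-style interface, where the checker receives the decoded images (plus CLIP features) and returns the images together with per-image NSFW flags; the function below is illustrative, not the library's own code:

```python
def permissive_safety_checker(images, clip_input=None, **kwargs):
    # Drop-in replacement mirroring the (images, has_nsfw_flags)
    # return shape of Stable Diffusion's safety checker: it passes
    # every image through unchanged and flags nothing as NSFW.
    return images, [False] * len(images)
```

With a diffusers-style pipeline one might then assign `pipe.safety_checker = permissive_safety_checker`; tightening the checker instead (stricter thresholds, extra concepts) follows the same replace-the-callable pattern.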