Detecting NSFW images with CoreML

February 18, 2019

You're making an app that lets people upload images, perhaps even set a profile photo, and you don't want to moderate inappropriate images by hand. Luckily, with Xcode 10 and iOS 11 or later you can use machine learning to detect whether a photo is inappropriate (NSFW) in only a few lines of code.

Let's get started:

  1. Download the Nudity .mlmodel from the Nudity-CoreML project
  2. Drag the .mlmodel file anywhere into your Xcode project
  3. Add this UIImage extension to convert images into pixel buffers:
import UIKit
import CoreVideo

public extension UIImage {
    /// Renders the image into a CVPixelBuffer so it can be fed to the Core ML model.
    var pixelBuffer: CVPixelBuffer? {
        let width = Int(self.size.width)
        let height = Int(self.size.height)

        let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                     kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue] as CFDictionary
        var pixelBuffer: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                         kCVPixelFormatType_32ARGB, attrs, &pixelBuffer)
        guard status == kCVReturnSuccess, let buffer = pixelBuffer else {
            return nil
        }

        CVPixelBufferLockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0))
        // Make sure the buffer is unlocked on every exit path, including the early return below.
        defer { CVPixelBufferUnlockBaseAddress(buffer, CVPixelBufferLockFlags(rawValue: 0)) }

        let pixelData = CVPixelBufferGetBaseAddress(buffer)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        guard let context = CGContext(data: pixelData, width: width, height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                      space: rgbColorSpace,
                                      bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue) else {
            return nil
        }

        // Flip the coordinate system: Core Graphics puts the origin at the bottom left,
        // UIKit at the top left.
        context.translateBy(x: 0, y: CGFloat(height))
        context.scaleBy(x: 1.0, y: -1.0)

        UIGraphicsPushContext(context)
        self.draw(in: CGRect(x: 0, y: 0, width: width, height: height))
        UIGraphicsPopContext()

        return buffer
    }
}
  4. Check if your image contains adult content:
public func checkAdultContent(image: UIImage) {
    // Core ML only works on iOS 11 and above, unfortunately :(
    guard #available(iOS 11.0, *) else {
        return
    }

    let model = Nudity()

    // Convert the image to a pixel buffer and try to run a prediction on it.
    guard let buffer = image.pixelBuffer,
          let result = try? model.prediction(data: buffer) else {
        fatalError("Prediction failed!")
    }

    // Play with the 0.75 threshold as you please.
    if let confidence = result.prob["NSFW"], confidence > 0.75 {
        // Not safe for work!!
    } else {
        // Safe for work!!
    }
}
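
If you want to see how the check slots into an actual upload flow, here is a minimal sketch. It assumes the Nudity model and the pixelBuffer extension from the steps above; uploadProfilePhoto and handleSelectedPhoto are hypothetical names standing in for your own upload and photo-picker code:
import UIKit

// Hypothetical placeholder for your own upload logic.
func uploadProfilePhoto(_ image: UIImage) {
    // e.g. POST the image data to your backend here.
}

// Gate the upload on the NSFW check before anything leaves the device.
@available(iOS 11.0, *)
func handleSelectedPhoto(_ image: UIImage) {
    guard let buffer = image.pixelBuffer,
          let result = try? Nudity().prediction(data: buffer) else {
        // If the check can't run, fall back to manual moderation or reject the upload.
        return
    }

    if let confidence = result.prob["NSFW"], confidence > 0.75 {
        // Block the image and let the user know it was rejected.
    } else {
        uploadProfilePhoto(image)
    }
}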

Thanks to Nudity-CoreML for providing the model.