AAObnoxiousFilter is a profanity filter for images written in CoreML and Swift. It's a lightweight framework that can detect inappropriate content in a UIImage with a single line of code.



Example
To run the example project, clone the repo, and run pod install from the Example directory first.


Requirements
  • iOS 11.0+
  • Xcode 10.0+
  • Swift 4.2+


Installation
AAObnoxiousFilter can be installed using CocoaPods, Carthage, or manually.


CocoaPods
AAObnoxiousFilter is available through CocoaPods. To install CocoaPods, run:

$ gem install cocoapods

Then create a Podfile with the following contents:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '11.0'

target '<Your Target Name>' do
    pod 'AAObnoxiousFilter'
end

Finally, run the following command to install it:

$ pod install


Carthage
To install Carthage, run (using Homebrew):

$ brew update
$ brew install carthage

Then add the following line to your Cartfile:

github "EngrAhsanAli/AAObnoxiousFilter" "master"

Run carthage update to build the framework, then drag the built AAObnoxiousFilter.framework into your Xcode project.

Finally, import the library in all files where you use it:

import AAObnoxiousFilter

Manual Installation

If you prefer not to use either of the dependency managers mentioned above, you can integrate AAObnoxiousFilter into your project manually by adding the files in the Classes folder to your project.

Getting Started

Detect the Image

if let prediction = image.predictImage() {
    // A prediction greater than 0.5 means the image likely contains inappropriate content
} else {
    // The image could not be processed for prediction (e.g. invalid image data)
}
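A minimal sketch of how the prediction might be used in practice, assuming predictImage() returns an optional score in 0...1 where values above 0.5 indicate inappropriate content, as described above. The isAppropriate helper, its 0.5 default threshold, and the display function are illustrative names, not part of the library's API:

```swift
import UIKit
import AAObnoxiousFilter

extension UIImage {
    /// Illustrative helper (not part of AAObnoxiousFilter): returns true when the
    /// image is judged safe to show, false when it is judged inappropriate, and
    /// nil when the image could not be processed for prediction.
    func isAppropriate(threshold: Double = 0.5) -> Bool? {
        guard let prediction = predictImage() else {
            return nil // image was not valid for prediction
        }
        return prediction <= threshold
    }
}

// Usage sketch: only show an image that passes the filter.
func display(_ image: UIImage, in imageView: UIImageView) {
    switch image.isAppropriate() {
    case .some(true):
        imageView.image = image
    case .some(false):
        imageView.image = nil // blocked: likely inappropriate content
    case .none:
        imageView.image = nil // prediction failed; fail closed
    }
}
```

Failing closed (treating an unprocessable image the same as an inappropriate one) is a design choice here; depending on your app you may prefer to surface the error to the caller instead.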