# infinitered_nsfwjs
**Repository Path**: flt2012/infinitered_nsfwjs
## Basic Information
- **Project Name**: infinitered_nsfwjs
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 0
- **Created**: 2019-09-06
- **Last Updated**: 2020-12-19
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
Client-side indecent content checking
A simple JavaScript library to help you quickly identify unseemly images; all in the client's browser. NSFWJS isn't perfect, but it's pretty accurate (~90% from our test set of 15,000 test images)... and it's getting more accurate all the time.
Why would this be useful? [Check out the announcement blog post](https://shift.infinite.red/avoid-nightmares-nsfw-js-ab7b176978b1).
The library categorizes image probabilities in the following 5 classes:
- `Drawing` - safe for work drawings (including anime)
- `Hentai` - hentai and pornographic drawings
- `Neutral` - safe for work neutral images
- `Porn` - pornographic images, sexual acts
- `Sexy` - sexually explicit images, not pornography
The demo is a continuous deployment source - Give it a go: http://nsfwjs.com/
## How to use the module
With `async/await` support:
```js
import * as nsfwjs from 'nsfwjs'
const img = document.getElementById('img')
// Load the model from the default hosted location.
// See the `load` section below for hosting the model files on your own site.
const model = await nsfwjs.load()
// Classify the image
const predictions = await model.classify(img)
console.log('Predictions: ', predictions)
```
Without `async/await` support:
```js
import * as nsfwjs from 'nsfwjs'
const img = document.getElementById('img')
// Load the model from the default hosted location.
// See the `load` section below for hosting the model files on your own site.
nsfwjs.load().then(function(model) {
model.classify(img).then(function(predictions) {
// Classify the image
console.log('Predictions: ', predictions)
})
})
```
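Each call to `classify` resolves to an array of `{ className, probability }` entries covering the five classes listed above. As a sketch, a simple moderation gate might sum the unsafe-class probabilities against a threshold — the helper name and the `0.7` threshold here are illustrative, not part of the NSFWJS API:

```js
// `predictions` is the array resolved by model.classify(), shaped like:
//   [{ className: 'Porn', probability: 0.8 }, ...]
const UNSAFE_CLASSES = ['Hentai', 'Porn', 'Sexy']

// Sum the probabilities of the unsafe classes and compare to a threshold.
function isProbablyNSFW (predictions, threshold = 0.7) {
  const unsafeScore = predictions
    .filter(p => UNSAFE_CLASSES.includes(p.className))
    .reduce((sum, p) => sum + p.probability, 0)
  return unsafeScore >= threshold
}
```

Tune the threshold (and which classes count as unsafe) to your own tolerance — for example, some sites treat `Sexy` as acceptable.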
## API
#### `load` the model
Before you can classify any image, you'll need to load the model. You should use the optional first parameter and load the model from your website, as explained in the install directions.
```js
const model = await nsfwjs.load('/path/to/model/directory/')
```
If you're using a model that needs an image of dimension other than 224x224, you can pass the size in the options parameter.
```js
const model = await nsfwjs.load('/path/to/different/model/', { size: 299 })
```
**Parameters**
- Optional URL to the folder containing `model.json`.
- Optional object with a `size` property giving the image dimension your model expects.
**Returns**
- A promise that resolves to a ready-to-use NSFWJS model object.
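Because `load` fetches the model weights over the network, it is worth calling it once and reusing the resolved model for every classification. A minimal sketch of that pattern — the `getModel` helper is ours, not part of the NSFWJS API:

```js
// Cache the model-loading promise so the weights are fetched only once,
// even if getModel() is called concurrently from several places.
let modelPromise = null

function getModel (loader) {
  if (!modelPromise) {
    modelPromise = loader() // e.g. () => nsfwjs.load('/model/')
  }
  return modelPromise
}
```

In an app you would call `await getModel(() => nsfwjs.load('/model/'))` before each classification; only the first call actually downloads the weights.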
#### `classify` an image
This function can take any browser-based image element (`<img>`, `<video>`, `<canvas>`) and returns an array of the most likely predictions and their confidence levels.
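When you only need the single most likely class, a small helper can pick it out of the predictions array. This sketch assumes the `{ className, probability }` shape shown above and does not rely on any particular ordering of the array:

```js
// Return the single most likely prediction from model.classify() output.
// Assumes a non-empty predictions array.
function topPrediction (predictions) {
  return predictions.reduce((best, p) =>
    p.probability > best.probability ? p : best
  )
}
```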