A free, open-source, privacy-focused browser extension that blocks "not safe for work" (NSFW) content, built with TypeScript and TensorFlow.js.
v1.0.0 released!
Images are now hidden when a page loads and become visible only once they are classified as safe. NSFW images remain hidden.
Performance improvements.
Bug fixes.
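The hide-then-reveal behavior described above can be sketched as below. This is a minimal illustration, not the extension's actual code: the class names and threshold are assumptions (the labels match those commonly used by NSFW image classifiers for TensorFlow.js, but the real values may differ).

```typescript
// One prediction returned by the image classifier.
type Prediction = { className: string; probability: number };

// Hypothetical threshold and label set: treat an image as NSFW if any
// unsafe class scores at or above this probability.
const NSFW_THRESHOLD = 0.7;
const UNSAFE_CLASSES = new Set(['Porn', 'Hentai', 'Sexy']);

// An image starts hidden; it is revealed only if this returns true.
function isSafe(predictions: Prediction[]): boolean {
  return !predictions.some(
    (p) => UNSAFE_CLASSES.has(p.className) && p.probability >= NSFW_THRESHOLD
  );
}
```

In the browser, the content script would set each `<img>` to hidden as the page loads, run the model on it, and restore visibility only when `isSafe` returns true.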
What's new?
Bug fixes.
Performance improvements.
The lag between loading a web page and running inference has been reduced. Images are passed to the model as soon as the page loads, and as the user scrolls, newly loaded images are passed to the model as well.
Bug fixes.
The warnings shown when the extension is run in developer mode are caused by issues in TensorFlow.js; they will be patched in an upcoming TensorFlow.js release and do not affect the functioning of the model.
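The "classify on load and on scroll" flow above can be sketched as a small queue that forwards each newly discovered image to the classifier exactly once. This is an assumed design for illustration; the names (`ImageQueue`, `enqueue`) are hypothetical and the extension's actual scheduling may differ.

```typescript
// Callback that sends one image URL to the model for inference.
type Classify = (url: string) => void;

class ImageQueue {
  // URLs already forwarded, so each image is classified only once.
  private seen = new Set<string>();

  constructor(private classify: Classify) {}

  // Called once when the page loads, and again whenever scrolling
  // brings new <img> elements into the DOM.
  enqueue(urls: string[]): void {
    for (const url of urls) {
      if (this.seen.has(url)) continue; // skip duplicates
      this.seen.add(url);
      this.classify(url);
    }
  }
}
```

In a content script, an `IntersectionObserver` or `MutationObserver` would feed newly visible images into `enqueue` as the user scrolls.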
This is the first beta release.
The basic extension is available for Google Chrome only.
Released for testing on multiple websites, with bug fixes and feature enhancements to follow.