From anywhere you can type, query and stream the output of an LLM (or any other script).
See the assets to download and install this version.
Full Changelog: https://github.com/jasonjmcghee/plock/compare/v0.1.2...v0.1.3
Full Changelog: https://github.com/jasonjmcghee/plock/compare/v0.1.1...v0.1.2
On macOS, this requires running: `xattr -c plock.app`
- I haven't done code signing yet, so I recommend building it yourself for now.
Install Ollama and make sure to run `ollama pull openhermes2.5-mistral`, or swap the model out in the code for something else.
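The setup steps above can be sketched as a short script. The bundle name `plock.app` and the model name come straight from these notes; the `command -v` guards are my own addition so the commands simply skip on machines without `xattr` or `ollama`.

```shell
# Setup sketch for plock, based on the release notes above.
APP="plock.app"
MODEL="openhermes2.5-mistral"

# macOS only: clear the quarantine attribute so the unsigned app will launch.
if command -v xattr >/dev/null 2>&1; then
  xattr -c "$APP"
fi

# Pull the default model the notes mention (skipped if ollama is absent).
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"
fi
```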
Windows users must use a custom script, which can be set in the settings file.
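The notes don't show the settings format here, so as a purely illustrative sketch (every key and value below is my assumption, not the actual schema — check the repository for the real format), a Windows override might look like:

```json
{
  "custom_script": ["powershell", "-File", "query-llm.ps1"]
}
```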
Full Changelog: https://github.com/jasonjmcghee/plock/compare/v0.1.0...v0.1.1
- `Ctrl / Cmd + Shift + .`: Replace the selected text with the output of the model.
- `Ctrl / Cmd + Shift + /`: Feed whatever is on your clipboard as "context" and replace the selected text with the output of the model.
- `Escape`: Cancel the ongoing stream.