Logseq plugin to integrate with Ollama
A plugin to integrate Ollama with Logseq
Model configuration per block
Ollama offers many different models to choose from for various tasks. This feature configures the model on a per-block basis, and the property is also applied to the block's immediate children when context menu commands are used on them. The properties are named after Ollama's generate API; currently, only the model property is used. Add
ollama-generate-model:: model_name
at the end of a block to specify the model to use for that block and its immediate children.
Write a SciFi story of Shanghai 2050.
ollama-generate-model:: deepseek-llm:7b-chat
Currently, three context menu commands are affected by this property.
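To illustrate what the property does, here is a hypothetical sketch of how a per-block model could be resolved (falling back from a block to its parent) and turned into a payload for Ollama's generate API. The function names, the fallback model, and the resolution order are assumptions for illustration, not the plugin's actual code; only the property name and the "model"/"prompt" payload fields come from the text above.

```python
# Hypothetical sketch of per-block model resolution; the real plugin's
# internals may differ. Payload field names follow Ollama's generate API.
DEFAULT_MODEL = "llama2"  # assumed fallback, not specified by the plugin docs

def resolve_model(block_props, parent_props=None):
    """Pick the model from the block's own properties, then its parent's."""
    for props in (block_props, parent_props or {}):
        if "ollama-generate-model" in props:
            return props["ollama-generate-model"]
    return DEFAULT_MODEL

def build_generate_payload(block_text, block_props, parent_props=None):
    """Assemble a JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": resolve_model(block_props, parent_props),
        "prompt": block_text,
    }

payload = build_generate_payload(
    "Write a SciFi story of Shanghai 2050.",
    {},  # the child block itself has no model property...
    {"ollama-generate-model": "deepseek-llm:7b-chat"},  # ...but its parent does
)
print(payload["model"])  # deepseek-llm:7b-chat
```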
ollama-logseq-config
The plugin also reads the page ollama-logseq-config
to add more context menu commands. The page should be a Markdown page with the following format.
ollama-context-menu-title:: Ollama: Extract Keywords
ollama-prompt-prefix:: Extract 10 keywords from the following:
Each block with these two properties will create a new context menu command after restarting Logseq. The prefix is added in front of the block's text when the command is invoked on the block.
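The mechanism described above can be sketched as follows. This is an illustrative guess at the logic, not the plugin's actual implementation; the function names are invented, while the two property names come from the config format shown above.

```python
# Hypothetical sketch of how ollama-logseq-config blocks could map to
# context menu commands; names are illustrative, not the plugin's code.
def parse_command_block(block_props):
    """Turn one config block's properties into a (title, prefix) command."""
    title = block_props.get("ollama-context-menu-title")
    prefix = block_props.get("ollama-prompt-prefix")
    if title is None or prefix is None:
        return None  # blocks missing either property define no command
    return (title, prefix)

def build_prompt(prefix, block_text):
    """The prefix is prepended to the block's text to form the prompt."""
    return f"{prefix} {block_text}"

cmd = parse_command_block({
    "ollama-context-menu-title": "Ollama: Extract Keywords",
    "ollama-prompt-prefix": "Extract 10 keywords from the following:",
})
title, prefix = cmd
print(title)
print(build_prompt(prefix, "Logseq is a knowledge management tool."))
```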
If you have any feature suggestions, feel free to open an issue.
If this plugin helps you, I'd really appreciate your support. You can buy me a coffee here.