# discollama

## Dependencies
```shell
python3 -m pip install poetry
poetry install
```
## Run discollama.py

```shell
poetry run python discollama.py
```
> Note: You must set up a Discord bot and set the `DISCORD_TOKEN` environment variable before `discollama.py` can access Discord.
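A typical launch might look like the following; the token value is a placeholder, and the real token comes from your bot's application page in the Discord Developer Portal:

```shell
# Placeholder token -- substitute the token from your own Discord application.
export DISCORD_TOKEN="your-bot-token"
poetry run python discollama.py
```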
`discollama.py` requires an Ollama server. By default, it connects to `127.0.0.1:11434`, but this can be configured with the command-line parameters `--ollama-host` and `--ollama-port`.
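For example, to point the bot at an Ollama server running on another machine (the address below is hypothetical):

```shell
# Connect to a remote Ollama server instead of the local default.
poetry run python discollama.py --ollama-host 192.168.1.10 --ollama-port 11434
```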
The default LLM is `llama2`, but this can be configured with `--ollama-model`. A custom personality can be created by changing the `SYSTEM` instruction in the `Modelfile` and running `ollama create discollama -f Modelfile`. The resulting model can then be referenced in `discollama.py` with `--ollama-model discollama`, e.g. `poetry run python discollama.py --ollama-model discollama`.
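As a sketch, a `Modelfile` with a custom personality could look like the following; the `SYSTEM` text here is only an illustrative example, not the prompt shipped with this repository:

```
FROM llama2
SYSTEM """
You are discollama, a helpful assistant that answers questions in a Discord server.
Keep replies brief and friendly.
"""
```

After editing the `Modelfile`, rebuild the model with `ollama create discollama -f Modelfile` and pass `--ollama-model discollama` when launching the bot.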