discollama

Run an AI-powered Discord bot from the comfort of your laptop.
discollama is a Discord bot powered by a local large language model running on Ollama.

Dependencies

  • Docker and Docker Compose

Run discollama.py

DISCORD_TOKEN=xxxxx docker compose up

Note: You must set up a Discord bot and set the DISCORD_TOKEN environment variable before discollama.py can access Discord.
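Instead of passing the token inline, you can keep it in a .env file next to compose.yaml; Docker Compose reads that file automatically for variable interpolation. This is a sketch that assumes compose.yaml references DISCORD_TOKEN as a variable:

# .env (assumed to sit in the same directory as compose.yaml)
DISCORD_TOKEN=xxxxx

Then start the bot with docker compose up as above.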

discollama.py requires an Ollama server. Follow the steps in the jmorganca/ollama repository to set up Ollama.
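If you run Ollama manually, a typical sequence is to start the server and pull the model the bot expects (mike/discollama is the default named below; whether the bot pulls it automatically is not guaranteed, so pulling it yourself is a safe assumption):

ollama serve
ollama pull mike/discollama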

By default, it connects to the Ollama server at 127.0.0.1:11434, which can be overridden with the OLLAMA_HOST environment variable.
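For example, to point the bot at an Ollama server on another machine (the address below is illustrative, and this assumes compose.yaml forwards OLLAMA_HOST into the container in the same host:port format as the default):

OLLAMA_HOST=192.168.1.10:11434 DISCORD_TOKEN=xxxxx docker compose up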

Note: Deploying this on Linux requires updating the container's network configuration and OLLAMA_HOST so the container can reach the Ollama server running on the host.
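One common approach on Linux (shown as a sketch, not taken from this repository's compose.yaml, and the discollama service name is assumed) is to map the Docker host into the container and point OLLAMA_HOST at it:

services:
  discollama:
    extra_hosts:
      # expose the Docker host inside the container (Docker 20.10+)
      - "host.docker.internal:host-gateway"
    environment:
      # value format assumed; use whatever discollama.py expects
      - OLLAMA_HOST=host.docker.internal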

Customize discollama.py

The default LLM is mike/discollama. A custom personality can be added by changing the SYSTEM instruction in the Modelfile and running ollama create:

ollama create mymodel -f Modelfile
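As a minimal sketch (the base model and system prompt here are placeholders, not the contents of this repository's Modelfile), a Modelfile only needs a FROM line and a SYSTEM instruction:

# Modelfile (illustrative)
FROM llama2
SYSTEM You are a cheerful assistant that answers every question with a short haiku.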

The model the bot uses can then be changed in compose.yaml:

environment:
  - OLLAMA_MODEL=mymodel
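After editing compose.yaml, restart the stack so the new environment takes effect (same command as before; Docker Compose recreates the container when its configuration changes):

DISCORD_TOKEN=xxxxx docker compose up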

See jmorganca/ollama for more details.

Activating the Bot

Discord users can interact with the bot by mentioning it in a message to start a new conversation, or by mentioning it in a reply to one of its previous responses to continue an ongoing conversation.
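For example (the mention name depends on how the Discord application was registered; @discollama is assumed here):

@discollama why is the sky blue?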