Run an AI-powered Discord bot from the comfort of your laptop.
 
 
discollama

discollama is a Discord bot powered by a local large language model backed by Ollama.

Dependencies

  • Docker and Docker Compose

Run discollama.py

DISCORD_TOKEN=xxxxx docker compose up

Note: You must set up a Discord bot and set the DISCORD_TOKEN environment variable before discollama.py can access Discord.

discollama.py requires a running Ollama server. Follow the steps in the jmorganca/ollama repository to set up Ollama.

By default, it connects to 127.0.0.1:11434, which can be overridden with the OLLAMA_HOST environment variable.

Note: Deploying this on Linux requires updating network configurations and OLLAMA_HOST.
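The host lookup described above can be sketched in a few lines (a minimal illustration of the env-var fallback, assuming standard environment-variable handling; the actual code in discollama.py may differ):

```python
import os

# Fall back to the local Ollama server when OLLAMA_HOST is unset.
host = os.getenv("OLLAMA_HOST", "127.0.0.1:11434")
print(f"connecting to Ollama at {host}")
```

Setting `OLLAMA_HOST=ollama.local:11434` before `docker compose up` would redirect the bot to that server instead.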

Customize discollama.py

The default LLM is mike/discollama. A custom personality can be added by changing the SYSTEM instruction in the Modelfile and running ollama create:

ollama create mymodel -f Modelfile
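For illustration, a Modelfile with a custom SYSTEM instruction might look like the following (the base model and personality text here are placeholders, not the actual contents of mike/discollama):

```
FROM llama2
SYSTEM You are a friendly Discord bot. Keep replies short and conversational.
```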

The model the bot uses can then be changed in compose.yaml:

environment:
  - OLLAMA_MODEL=mymodel

See jmorganca/ollama for more details.

Activating the Bot

Discord users can interact with the bot by mentioning it in a message to start a new conversation, or by replying to one of its previous responses to continue an ongoing conversation.
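The activation rule above can be sketched as a small predicate (illustrative only; the helper name and signature are hypothetical, not taken from discollama.py):

```python
def should_respond(bot_id, mentioned_ids, replied_to_author_id=None):
    """Return True when the bot should handle a message.

    A mention of the bot starts a new conversation; a reply to one of
    the bot's earlier responses continues the existing one.
    """
    return bot_id in mentioned_ids or replied_to_author_id == bot_id


# A message mentioning the bot starts a conversation.
print(should_respond(42, {42, 7}))
# A reply to the bot's own message continues it.
print(should_respond(42, set(), replied_to_author_id=42))
# Anything else is ignored.
print(should_respond(42, {7}, replied_to_author_id=9))
```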