Introduce a comprehensive README file detailing the project's purpose, features, environment variables, usage instructions, project structure, and how it operates. This addition enhances documentation and provides clarity for users and developers.
# Auto-Boot Ollama Host

A Docker-based service that automatically starts and shuts down an Ollama host based on log patterns from the [Paperless AI](https://github.com/clusterzx/paperless-ai) container.

## Overview

This project monitors the [Paperless AI](https://github.com/clusterzx/paperless-ai) container logs and automatically starts a remote Ollama service on a Windows host with a dedicated graphics card when specific error patterns are detected. The Windows host is powered on via Wake-on-LAN, the Ollama service is started, and after the task completes the service is stopped and the host is shut down again.

## Features

- **Wake-on-LAN**: Powers on the Windows host with the dedicated graphics card via WOL
- **Automatic Ollama Start**: Starts the Ollama service on the Windows host
- **Desktop Session Detection**: Prevents interruptions during active user sessions
- **Automatic Shutdown**: Stops the service and shuts down the host after completion
- **Energy Efficiency**: The host runs only when needed and is shut down automatically afterwards
- **Modular Architecture**: Clean separation of functionality into Lua modules
## Environment Variables

The following environment variables must be set in Komodo:

### SSH Configuration

```bash
SSH_USER=                                 # Username for SSH connection
SSH_PUBLIC_KEY="[[SSH_PUBLIC_KEY_RTX]]"   # Public SSH key
SSH_PRIVATE_KEY="[[SSH_PRIVATE_KEY_RTX]]" # Private SSH key
```

**Syntax Note**: The `[[VARIABLE_NAME]]` syntax references secrets defined under Settings → Variables in Komodo. They are replaced at runtime with the actual key values.
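
If no key pair exists yet, one way to create a dedicated one and feed it into those variables (the file name below is only an example):

```bash
# Generate a dedicated ed25519 key pair for this service (example file name).
ssh-keygen -t ed25519 -f ./auto-boot-ollama-host_key -C "auto-boot-ollama-host"
# Paste the contents of auto-boot-ollama-host_key.pub into SSH_PUBLIC_KEY_RTX
# and auto-boot-ollama-host_key into SSH_PRIVATE_KEY_RTX in Komodo.
```

The public key also has to be authorized for `SSH_USER` on the Windows host.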
### Wake-on-LAN

```bash
WOL_MAC=    # MAC address of the target host for WOL
```
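
The scripts send the magic packet themselves at runtime. For a quick manual test from another machine on the same network, the common `wakeonlan` utility can be used (the broadcast address below is a placeholder for your subnet):

```bash
# Send a Wake-on-LAN magic packet to the Windows host (example broadcast address).
wakeonlan -i 192.168.1.255 "$WOL_MAC"
```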
### Additional Configuration

Additional environment variables can be viewed in `scripts/config.lua`.
## Usage

### Docker Compose

```bash
docker-compose up -d
```
### Direct with Docker

```bash
docker build -t auto-boot-ollama-host .
docker run -d --name auto-boot-ollama-host auto-boot-ollama-host
```
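
To check that the monitor is running, you can follow the container's own logs (container name as in the `docker run` example above):

```bash
docker logs -f auto-boot-ollama-host
```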
## Project Structure

```text
├── README.md                     # This file
├── Dockerfile                    # Docker image definition
├── compose.yaml                  # Docker Compose configuration
├── .dockerignore                 # Docker ignore rules
└── scripts/                      # Lua scripts
    ├── README.md                 # Detailed script documentation
    ├── auto-boot-ollama-host.lua # Main script
    ├── config.lua                # Configuration management
    ├── utils.lua                 # Utility functions
    ├── network.lua               # Network functions
    ├── ssh.lua                   # SSH operations
    ├── ollama_manager.lua        # Ollama service management
    └── session_check.lua         # Windows desktop session detection
```
## How It Works

1. **Log Monitoring**: The script continuously monitors the logs of the [Paperless AI](https://github.com/clusterzx/paperless-ai) container (a conceptual sketch follows this list)
2. **Pattern Detection**: When `ERROR_PATTERN` is detected, the start-up sequence for the Windows host with the dedicated graphics card is triggered
3. **Session Check**: Before anything is started, the script checks whether a user is logged into the desktop (see the second sketch below)
4. **Wake-on-LAN**: A WOL magic packet is sent to the Windows host to power it on
5. **SSH Connection**: After the host has booted, an SSH connection to the Windows host is established
6. **Service Start**: The Ollama service is started on the Windows host via SSH
7. **Task Execution**: The Windows host executes the Ollama tasks using the dedicated graphics card
8. **Finish Pattern**: When `FINISH_PATTERN` is detected, the Ollama service is stopped
9. **Shutdown**: The Windows host is automatically shut down to save energy
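
Steps 1, 2, and 8 can be pictured as a log-watching loop. The following is only a conceptual shell sketch; the real implementation lives in `scripts/auto-boot-ollama-host.lua`, and the container name and actions shown here are assumptions:

```bash
# Conceptual sketch -- not the actual implementation (which is written in Lua).
# "paperless-ai" as the container name and the echoed actions are assumptions.
docker logs -f paperless-ai 2>&1 | while read -r line; do
  case "$line" in
    *"$ERROR_PATTERN"*)
      echo "Error pattern seen: wake the host, then start Ollama via SSH" ;;
    *"$FINISH_PATTERN"*)
      echo "Finish pattern seen: stop Ollama, then shut the host down" ;;
  esac
done
```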
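
The session check in step 3 can be reproduced manually. A minimal sketch, assuming SSH access is already set up and using the hypothetical placeholder `WINDOWS_HOST` for the host's address; the actual check is implemented in `scripts/session_check.lua`:

```bash
# quser lists interactive Windows sessions and returns a non-zero exit code
# when nobody is logged on.
if ssh "$SSH_USER@$WINDOWS_HOST" quser >/dev/null 2>&1; then
  echo "Active desktop session detected"
else
  echo "No interactive desktop session"
fi
```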
## Prerequisites

- Docker and Docker Compose
- Windows host with a dedicated graphics card (for the Ollama computations)
- SSH access to the Windows host
- Wake-on-LAN support on the Windows host
- NSSM (Non-Sucking Service Manager) on the Windows host for service management (see the example below)
- Ollama installation on the Windows host
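
The scripts expect Ollama to be controllable as a Windows service. One possible way to register it with NSSM, run in an elevated prompt on the Windows host (service name and installation path are only examples; adjust them to your setup):

```bash
# Register Ollama as a Windows service named "ollama" (example name and path).
nssm install ollama "C:\Program Files\Ollama\ollama.exe" serve
nssm start ollama
# The auto-boot scripts later start and stop this service remotely over SSH.
```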
## Configuration

Detailed configuration options can be found in `scripts/README.md`.