Auto-Boot Ollama Host

A Docker-based service that automatically starts and shuts down an Ollama host based on log patterns from the Paperless AI container.

Overview

This project monitors the Paperless AI container's logs and automatically starts a remote Ollama service on a Windows host with a dedicated graphics card when specific error patterns are detected. The Windows host is powered on via Wake-on-LAN and the Ollama service is started; after the task completes, the service is stopped and the host is shut down again.

Features

  • Wake-on-LAN: Powers on the Windows host with a dedicated graphics card via WOL
  • Automatic Ollama Start: Starts the Ollama service on the Windows host
  • Desktop Session Detection: Prevents interruptions during active user sessions
  • Automatic Shutdown: Stops the service and shuts down the host after completion
  • Energy Efficiency: Host runs only when needed and is automatically shut down
  • Modular Architecture: Clean separation of functionalities

Environment Variables

The following environment variables must be set in Komodo:

SSH Configuration

SSH_USER=                    # Username for SSH connection
SSH_PUBLIC_KEY="[[SSH_PUBLIC_KEY_RTX]]"   # Public SSH key
SSH_PRIVATE_KEY="[[SSH_PRIVATE_KEY_RTX]]" # Private SSH key

Syntax Note: The [[VARIABLE_NAME]] syntax references secrets defined in Settings → Variables of Komodo. These are replaced at runtime with the actual key values.
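
As an illustration, the variables above could be wired into the container via compose.yaml roughly like this. This is a sketch only — the service name and the exact variable set are assumptions; the repository's actual compose.yaml is authoritative:

```yaml
services:
  auto-boot-ollama-host:
    build: .
    restart: unless-stopped
    environment:
      SSH_USER: ${SSH_USER}                      # Username for SSH connection
      SSH_PUBLIC_KEY: "[[SSH_PUBLIC_KEY_RTX]]"   # Resolved by Komodo at runtime
      SSH_PRIVATE_KEY: "[[SSH_PRIVATE_KEY_RTX]]" # Resolved by Komodo at runtime
      WOL_MAC: ${WOL_MAC}                        # MAC address of the target host
```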

Wake-on-LAN

WOL_MAC=                     # MAC address of the target host for WOL

Additional Configuration

Additional environment variables are documented in scripts/config.lua.

Usage

Docker Compose

docker-compose up -d

Direct with Docker

docker build -t auto-boot-ollama-host .
docker run -d --name auto-boot-ollama-host auto-boot-ollama-host

Project Structure

├── README.md                    # This file
├── Dockerfile                   # Docker image definition
├── compose.yaml                 # Docker Compose configuration
├── .dockerignore                # Docker ignore rules
└── scripts/                    # Lua scripts
    ├── README.md               # Detailed script documentation
    ├── auto-boot-ollama-host.lua # Main script
    ├── config.lua              # Configuration management
    ├── utils.lua               # Utility functions
    ├── network.lua             # Network functions
    ├── ssh.lua                 # SSH operations
    ├── ollama_manager.lua      # Ollama service management
    └── session_check.lua       # Windows desktop session detection
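
The session detection in scripts/session_check.lua is written in Lua; as a rough illustration of the idea, the hypothetical Python helper below parses the output of the Windows `quser` command (as it might be collected over SSH) and reports whether any desktop session is in the Active state. This is not the project's implementation:

```python
def has_active_session(quser_output: str) -> bool:
    """Heuristically check `quser` output from the Windows host for an
    active desktop session.

    NOTE: hypothetical helper for illustration only; the project's actual
    detection logic lives in scripts/session_check.lua.
    """
    for line in quser_output.splitlines()[1:]:  # skip the header row
        columns = line.split()
        if "Active" in columns:  # quser marks logged-in sessions as Active
            return True
    return False

sample = (
    " USERNAME              SESSIONNAME        ID  STATE   IDLE TIME  LOGON TIME\n"
    ">alice                 console             1  Active      none   1/1/2025 9:00 AM"
)
print(has_active_session(sample))  # True: an active desktop session exists
```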

How It Works

  1. Log Monitoring: The script continuously monitors logs of the Paperless AI container
  2. Pattern Detection: When ERROR_PATTERN is detected, the startup sequence for the Windows host with its dedicated graphics card is triggered
  3. Session Check: Before proceeding, the script checks whether a user is already logged into the Windows desktop, to avoid interrupting an active session
  4. Wake-on-LAN: A WOL packet is sent to the Windows host to power it on
  5. SSH Connection: After booting, an SSH connection to the Windows host is established
  6. Service Start: The Ollama service is started on the Windows host via SSH
  7. Task Execution: The Windows host executes Ollama tasks using the dedicated graphics card
  8. Finish Pattern: When FINISH_PATTERN is detected, the Ollama service is stopped
  9. Shutdown: The Windows host is automatically shut down to save energy
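
The Wake-on-LAN step above boils down to broadcasting a "magic packet": six 0xFF bytes followed by the target MAC address repeated 16 times. The project implements this in scripts/network.lua (send_wol); the Python sketch below only illustrates the packet format — the broadcast address and UDP port 9 are common conventions, not values taken from this repository:

```python
import socket

def build_magic_packet(mac: str) -> bytes:
    """Return the 102-byte WOL magic packet: 6 x 0xFF, then the MAC 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("MAC address must be exactly 6 bytes")
    return b"\xff" * 6 + mac_bytes * 16

def send_wol(mac: str, broadcast: str = "255.255.255.255", port: int = 9) -> None:
    """Broadcast the magic packet as UDP; the target NIC wakes the machine."""
    packet = build_magic_packet(mac)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(packet, (broadcast, port))
```

The MAC address itself comes from the WOL_MAC environment variable described above.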

Prerequisites

  • Docker and Docker Compose
  • Windows host with a dedicated graphics card (for Ollama computations)
  • SSH access to the Windows host
  • Wake-on-LAN support on the Windows host
  • NSSM (Non-Sucking Service Manager) on the Windows host for service management
  • Ollama installation on the Windows host

Configuration

Detailed configuration options can be found in scripts/README.md.