Ollama API Module for Duso

Access local LLMs through Ollama's OpenAI-compatible API.

Setup

  1. Install Ollama
  2. Start Ollama: ollama serve
  3. Pull a model: ollama pull mistral (or any model you want)
  4. Use in Duso:

     ollama = require("ollama")
     response = ollama.prompt("What is Ollama?")
     print(response)

Quick Start

ollama = require("ollama")

// One-shot query
response = ollama.prompt("Explain machine learning simply")

// Multi-turn conversation
chat = ollama.session({
  system = "You are a helpful coding assistant",
  model = "mistral"
})

response1 = chat.prompt("How do I write a loop?")
response2 = chat.prompt("Can you show an example?")
print(chat.usage)
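The difference between the two styles above: ollama.prompt() is stateless, while a session carries the conversation history between turns. A sketch of what that implies (behavior inferred from the Quick Start; actual replies depend on the model):

// Stateless: each prompt() call starts a fresh conversation.
ollama.prompt("My name is Ada.")
ollama.prompt("What is my name?")   // the model has no memory of "Ada"

// Stateful: a session keeps history, so follow-ups can refer back.
chat = ollama.session({ model = "mistral" })
chat.prompt("My name is Ada.")
chat.prompt("What is my name?")     // history lets the model answer "Ada"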

Endpoint

Default: http://localhost:11434

To use a different host/port, modify the endpoint in ollama.du.
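For example, to point the module at a remote Ollama instance, the change is a one-line edit. A hypothetical sketch (the variable name below is assumed; check the module source for the real definition):

// In ollama.du (assumed variable name). Ollama serves its
// OpenAI-compatible API under the /v1 path.
endpoint = "http://192.168.1.50:11434/v1/chat/completions"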

Available Models

Run ollama list to see what you have installed. Popular choices include mistral, llama3, gemma, and phi3.

Pull new models: ollama pull <model-name>

No API Key Required

Ollama runs locally - no authentication needed.

Configuration Options

Same as OpenAI module - see openai.md for full reference.
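Since the options mirror the OpenAI module, OpenAI-style settings should pass through unchanged. A sketch, assuming the module forwards options such as temperature as described in openai.md:

chat = ollama.session({
  system = "You are a helpful coding assistant",
  model = "mistral",
  temperature = 0.2   // assumed to be forwarded; see openai.md
})
print(chat.prompt("Name one use for a local LLM."))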

Key differences:

  - No api_key is needed; Ollama requires no authentication.
  - Model names refer to locally pulled Ollama models (e.g. "mistral"), not OpenAI model IDs.
  - Requests go to the local endpoint (http://localhost:11434) rather than OpenAI's servers.

See Also

  - openai.md: full configuration reference shared with this module
  - Ollama project documentation: https://github.com/ollama/ollama