add Compose how-to page for Docker Model Runner support with Compose #22392

66 changes: 66 additions & 0 deletions content/manuals/compose/how-tos/model-runner.md
@@ -0,0 +1,66 @@
---
title: Use Docker Model Runner
description: Learn how to integrate Docker Model Runner with Docker Compose to build AI-powered applications
keywords: compose, docker compose, model runner, ai, llm, artificial intelligence, machine learning
weight: 111
params:
  sidebar:
    badge:
      color: green
      text: New
---

{{< summary-bar feature_name="Compose model runner" >}}

Docker Model Runner can be integrated with Docker Compose to run AI models as part of your multi-container applications.
This lets you define and run AI-powered applications alongside your other services.

## Prerequisites

- Docker Compose v2.35 or later (you can verify this as shown below)
- Docker Desktop 4.41 or later
- Docker Desktop for Mac with Apple Silicon or Docker Desktop for Windows with NVIDIA GPU
- [Docker Model Runner enabled in Docker Desktop](/manuals/desktop/features/model-runner.md#enable-docker-model-runner)
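
To check that your Docker Compose version meets this requirement, run the following command (the output shown is only an example):

```console
$ docker compose version
Docker Compose version v2.35.0
```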

## Provider services

Compose introduces a new service type called `provider` that lets you declare platform capabilities required by your application. For AI models, use the `model` type to declare model dependencies.

Here's an example of how to define a model provider:

```yaml
services:
  chat:
    image: my-chat-app
    depends_on:
      - ai_runner

  ai_runner:
    provider:
      type: model
      options:
        model: ai/smollm2
```

Notice the dedicated `provider` attribute in the `ai_runner` service.
This attribute specifies that the service is a model provider and lets you define options such as the name of the model to use.

There is also a `depends_on` attribute in the `chat` service.
This attribute specifies that the `chat` service depends on the `ai_runner` service, so the `ai_runner` service starts before the `chat` service and model information can be injected into it.

## How it works

During the `docker compose up` process, Docker Model Runner automatically pulls and runs the specified model.
It also sends Compose the model tag name and the URL to access the model runner.

This information is then passed to services that declare a dependency on the model provider.
In the previous example, the `chat` service receives two environment variables prefixed by the service name:

- `AI_RUNNER_URL` with the URL to access the model runner
- `AI_RUNNER_MODEL` with the model name to pass in requests to that URL

This lets the `chat` service interact with the model and use it for its own purposes.
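
For illustration, here's a minimal sketch of how the `chat` service might call the model using these variables, assuming `AI_RUNNER_URL` points at an OpenAI-compatible API (the exact endpoint path is an assumption for this example):

```console
$ curl "$AI_RUNNER_URL/chat/completions" \
    -H "Content-Type: application/json" \
    -d "{\"model\": \"$AI_RUNNER_MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello!\"}]}"
```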

## Reference

- [Docker Model Runner documentation](/manuals/desktop/features/model-runner.md)
2 changes: 2 additions & 0 deletions data/summary.yaml
@@ -105,6 +105,8 @@ Compose mac address:
  requires: Docker Compose [2.23.2](/manuals/compose/releases/release-notes.md#2232) and later
Compose menu:
  requires: Docker Compose [2.26.0](/manuals/compose/releases/release-notes.md#2260) and later
Compose model runner:
  requires: Docker Compose [2.35.0](/manuals/compose/releases/release-notes.md#2350) and later, and Docker Desktop 4.41 and later
Compose OCI artifact:
  requires: Docker Compose [2.34.0](/manuals/compose/releases/release-notes.md#2340) and later
Compose replace file: