@@ -20,7 +20,7 @@ spec:
  - name: key
    value: "mykey"
  - name: model
-   value: claude-3-5-sonnet-20240620
+   value: '${{DAPR_CONVERSATION_ANTHROPIC_MODEL}}'
  - name: cacheTTL
    value: 10m
```
@@ -34,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for Anthropic. | `"mykey"` |
-| `model` | N | The Anthropic LLM to use. Defaults to `claude-3-5-sonnet-20240620` | `claude-3-5-sonnet-20240620` |
+| `model` | N | The Anthropic LLM to use. Defaults to `claude-3-5-sonnet-20240620` (configurable via `DAPR_CONVERSATION_ANTHROPIC_MODEL` environment variable). | `${{DAPR_CONVERSATION_ANTHROPIC_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
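
The placeholder above can also be replaced by a literal model name. A minimal sketch of a complete component pinned to the documented default, rather than resolved from the environment variable (the component name and key value here are illustrative):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: anthropic
spec:
  type: conversation.anthropic
  metadata:
  - name: key
    value: "mykey"
  # Pin the model explicitly instead of resolving it from
  # DAPR_CONVERSATION_ANTHROPIC_MODEL; this value is the documented default.
  - name: model
    value: claude-3-5-sonnet-20240620
  - name: cacheTTL
    value: 10m
```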

## Related links
@@ -0,0 +1,42 @@
---
type: docs
title: "GoogleAI"
linkTitle: "GoogleAI"
description: Detailed information on the GoogleAI conversation component
---

## Component format

A Dapr `conversation.yaml` component file has the following structure:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: googleai
spec:
  type: conversation.googleai
  metadata:
  - name: key
    value: "mykey"
  - name: model
    value: '${{DAPR_CONVERSATION_GOOGLEAI_MODEL}}'
  - name: cacheTTL
    value: 10m
```

{{% alert title="Warning" color="warning" %}}
The above example uses secrets as plain strings. It is recommended to use a secret store for the secrets, as described [here]({{< ref component-secrets.md >}}).
{{% /alert %}}
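
As a sketch of that recommendation, the `key` can be pulled from a secret store by swapping the plain `value` for a `secretKeyRef` and pointing `auth.secretStore` at a configured store. The store, secret, and key names below are placeholders, not values defined by this component:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: googleai
spec:
  type: conversation.googleai
  metadata:
  # Read the API key from a secret store instead of embedding it as a plain string.
  - name: key
    secretKeyRef:
      name: googleai-secret   # name of the secret (placeholder)
      key: api-key            # key within that secret (placeholder)
  - name: model
    value: '${{DAPR_CONVERSATION_GOOGLEAI_MODEL}}'
  - name: cacheTTL
    value: 10m
auth:
  secretStore: mysecretstore  # name of a configured secret store component (placeholder)
```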

## Spec metadata fields

| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for GoogleAI. | `"mykey"` |
| `model` | N | The GoogleAI LLM to use. Defaults to `gemini-1.5-flash` (configurable via `DAPR_CONVERSATION_GOOGLEAI_MODEL` environment variable). | `${{DAPR_CONVERSATION_GOOGLEAI_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |

## Related links

- [Conversation API overview]({{< ref conversation-overview.md >}})
@@ -20,7 +20,7 @@ spec:
  - name: key
    value: mykey
  - name: model
-   value: meta-llama/Meta-Llama-3-8B
+   value: '${{DAPR_CONVERSATION_HUGGINGFACE_MODEL}}'
  - name: cacheTTL
    value: 10m
```
@@ -34,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for Huggingface. | `mykey` |
-| `model` | N | The Huggingface LLM to use. Defaults to `meta-llama/Meta-Llama-3-8B`. | `meta-llama/Meta-Llama-3-8B` |
+| `model` | N | The Huggingface LLM to use. Defaults to `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B` (configurable via `DAPR_CONVERSATION_HUGGINGFACE_MODEL` environment variable). | `${{DAPR_CONVERSATION_HUGGINGFACE_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |

## Related links
@@ -20,7 +20,7 @@ spec:
  - name: key
    value: mykey
  - name: model
-   value: open-mistral-7b
+   value: '${{DAPR_CONVERSATION_MISTRAL_MODEL}}'
  - name: cacheTTL
    value: 10m
```
@@ -34,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for Mistral. | `mykey` |
-| `model` | N | The Mistral LLM to use. Defaults to `open-mistral-7b`. | `open-mistral-7b` |
+| `model` | N | The Mistral LLM to use. Defaults to `open-mistral-7b` (configurable via `DAPR_CONVERSATION_MISTRAL_MODEL` environment variable). | `${{DAPR_CONVERSATION_MISTRAL_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
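
Because `cacheTTL` uses Go's duration syntax, values such as `300ms`, `45s`, or `1h30m` are all valid. A minimal sketch widening the cache window (the component name and key value are illustrative):

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: mistral
spec:
  type: conversation.mistral
  metadata:
  - name: key
    value: mykey
  - name: model
    value: '${{DAPR_CONVERSATION_MISTRAL_MODEL}}'
  # Go duration string: keep cached prompts for one and a half hours.
  - name: cacheTTL
    value: 1h30m
```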

## Related links
@@ -0,0 +1,39 @@
---
type: docs
title: "Ollama"
linkTitle: "Ollama"
description: Detailed information on the Ollama conversation component
---

## Component format

A Dapr `conversation.yaml` component file has the following structure:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: ollama
spec:
  type: conversation.ollama
  metadata:
  - name: model
    value: '${{DAPR_CONVERSATION_OLLAMA_MODEL}}'
  - name: cacheTTL
    value: 10m
```

{{% alert title="Warning" color="warning" %}}
The above example uses secrets as plain strings. It is recommended to use a secret store for the secrets, as described [here]({{< ref component-secrets.md >}}).
{{% /alert %}}

## Spec metadata fields

| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `model` | N | The Ollama LLM to use. Defaults to `llama3.2:latest` (configurable via `DAPR_CONVERSATION_OLLAMA_MODEL` environment variable). | `${{DAPR_CONVERSATION_OLLAMA_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |

## Related links

- [Conversation API overview]({{< ref conversation-overview.md >}})
@@ -20,7 +20,7 @@ spec:
  - name: key
    value: mykey
  - name: model
-   value: gpt-4-turbo
+   value: '${{DAPR_CONVERSATION_OPENAI_MODEL}}'
  - name: cacheTTL
    value: 10m
```
@@ -34,7 +34,7 @@ The above example uses secrets as plain strings. It is recommended to use a secr
| Field | Required | Details | Example |
|--------------------|:--------:|---------|---------|
| `key` | Y | API key for OpenAI. | `mykey` |
-| `model` | N | The OpenAI LLM to use. Defaults to `gpt-4-turbo`. | `gpt-4-turbo` |
+| `model` | N | The OpenAI LLM to use. Defaults to `gpt-5-nano` (configurable via `DAPR_CONVERSATION_OPENAI_MODEL` environment variable). | `${{DAPR_CONVERSATION_OPENAI_MODEL}}` |
| `cacheTTL` | N | A time-to-live value for a prompt cache to expire. Uses Golang duration format. | `10m` |
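
How the `${{DAPR_CONVERSATION_OPENAI_MODEL}}` placeholder gets resolved is not spelled out in this table. Assuming the variable has to be visible to the Dapr sidecar, one way to surface it on Kubernetes is the `dapr.io/env` annotation; a hedged sketch, where the app name, image, and model value are placeholders and the annotation-based approach itself is an assumption:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
      annotations:
        dapr.io/enabled: "true"
        dapr.io/app-id: "myapp"
        # Inject the model override into the daprd sidecar's environment.
        dapr.io/env: "DAPR_CONVERSATION_OPENAI_MODEL=gpt-4o-mini"
    spec:
      containers:
      - name: myapp
        image: myregistry/myapp:latest  # placeholder image
```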

## Related links