diff --git a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
index 095c09d5caf..67fbffe85a4 100644
--- a/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
+++ b/python/packages/autogen-core/docs/src/user-guide/agentchat-user-guide/tutorial/models.ipynb
@@ -229,7 +229,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Ollama (Local)\n",
+    "## Ollama\n",
     "\n",
     "[Ollama](https://ollama.com/) is a local model server that can run models locally on your machine.\n",
     "\n",
diff --git a/python/packages/autogen-core/docs/src/user-guide/core-user-guide/components/model-clients.ipynb b/python/packages/autogen-core/docs/src/user-guide/core-user-guide/components/model-clients.ipynb
index 6447c6ccb0c..c35ac972c27 100644
--- a/python/packages/autogen-core/docs/src/user-guide/core-user-guide/components/model-clients.ipynb
+++ b/python/packages/autogen-core/docs/src/user-guide/core-user-guide/components/model-clients.ipynb
@@ -122,7 +122,7 @@
     "\n",
     "You can use the {py:class}`~autogen_ext.models.OpenAIChatCompletionClient` to interact with OpenAI-compatible APIs such as Ollama and Gemini (beta).\n",
     "\n",
-    "#### Ollama (local)\n",
+    "#### Ollama\n",
     "\n",
     "The below example shows how to use a local model running on [Ollama](https://ollama.com) server."
    ]