Commit

Merge branch 'main' into assistant_Agent_tools
ekzhu authored Dec 10, 2024
2 parents da621e7 + f5140ba commit f5ca240
Showing 47 changed files with 338 additions and 194 deletions.
@@ -5,7 +5,6 @@
from typing import Any, AsyncGenerator, Awaitable, Callable, Dict, List, Mapping, Sequence

from autogen_core import CancellationToken, FunctionCall
-from autogen_core.components.tools import FunctionTool, Tool
from autogen_core.models import (
AssistantMessage,
ChatCompletionClient,
@@ -15,6 +14,7 @@
SystemMessage,
UserMessage,
)
+from autogen_core.tools import FunctionTool, Tool
from typing_extensions import deprecated

from .. import EVENT_LOGGER_NAME
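The `FunctionTool` whose import path this commit moves (from `autogen_core.components.tools` to `autogen_core.tools`) wraps a plain Python function and derives a tool schema from its signature. A library-free sketch of that idea, with illustrative names rather than the actual `autogen_core.tools` internals:

```python
import inspect
from typing import Any, Callable, Dict


def derive_tool_schema(func: Callable[..., Any]) -> Dict[str, Any]:
    """Build a minimal tool schema from a function's signature and docstring."""
    params: Dict[str, Any] = {}
    for name, param in inspect.signature(func).parameters.items():
        annotation = param.annotation
        # Fall back to "any" when the parameter carries no type annotation.
        type_name = annotation.__name__ if annotation is not inspect.Parameter.empty else "any"
        params[name] = {"type": type_name}
    return {
        "name": func.__name__,
        "description": (func.__doc__ or "").strip(),
        "parameters": params,
    }


def web_search(query: str) -> str:
    """Find information on the web."""
    return f"results for {query!r}"


schema = derive_tool_schema(web_search)
print(schema)
```

The real `FunctionTool` additionally validates arguments and supports async callables; only the signature-to-schema idea is shown here.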
@@ -2,10 +2,10 @@
import warnings
from typing import Any, Awaitable, Callable, List

-from autogen_core.components.tools import Tool
from autogen_core.models import (
ChatCompletionClient,
)
+from autogen_core.tools import Tool

from .. import EVENT_LOGGER_NAME
from ._assistant_agent import AssistantAgent
@@ -1,7 +1,7 @@
import logging
from typing import Any, Dict

-from autogen_core.components.tools import FunctionTool, Tool
+from autogen_core.tools import FunctionTool, Tool
from pydantic import BaseModel, Field, model_validator

from .. import EVENT_LOGGER_NAME
@@ -15,7 +15,7 @@
ToolCallResultMessage,
)
from autogen_core import Image
-from autogen_core.components.tools import FunctionTool
+from autogen_core.tools import FunctionTool
from autogen_ext.models import OpenAIChatCompletionClient
from openai.resources.chat.completions import AsyncCompletions
from openai.types.chat.chat_completion import ChatCompletion, Choice
2 changes: 1 addition & 1 deletion python/packages/autogen-agentchat/tests/test_group_chat.py
@@ -33,7 +33,7 @@
from autogen_agentchat.teams._group_chat._swarm_group_chat import SwarmGroupChatManager
from autogen_agentchat.ui import Console
from autogen_core import AgentId, CancellationToken
-from autogen_core.components.tools import FunctionTool
+from autogen_core.tools import FunctionTool
from autogen_ext.code_executors.local import LocalCommandLineCodeExecutor
from autogen_ext.models import OpenAIChatCompletionClient, ReplayChatCompletionClient
from openai.resources.chat.completions import AsyncCompletions
4 changes: 2 additions & 2 deletions python/packages/autogen-core/docs/src/reference/index.md
@@ -29,8 +29,8 @@ python/autogen_core
python/autogen_core.code_executor
python/autogen_core.models
python/autogen_core.model_context
-python/autogen_core.components.tools
-python/autogen_core.components.tool_agent
+python/autogen_core.tools
+python/autogen_core.tool_agent
python/autogen_core.exceptions
python/autogen_core.logging
```
@@ -1,8 +1,8 @@
-autogen\_core.components.tool\_agent
+autogen\_core.tool\_agent
====================================


-.. automodule:: autogen_core.components.tool_agent
+.. automodule:: autogen_core.tool_agent
:members:
:undoc-members:
:show-inheritance:
@@ -1,8 +1,8 @@
-autogen\_core.components.tools
+autogen\_core.tools
==============================


-.. automodule:: autogen_core.components.tools
+.. automodule:: autogen_core.tools
:members:
:undoc-members:
:show-inheritance:
@@ -26,7 +26,7 @@
"from autogen_agentchat.conditions import TextMentionTermination\n",
"from autogen_agentchat.teams import RoundRobinGroupChat\n",
"from autogen_agentchat.ui import Console\n",
-"from autogen_core.components.tools import FunctionTool\n",
+"from autogen_core.tools import FunctionTool\n",
"from autogen_ext.models import OpenAIChatCompletionClient"
]
},
@@ -26,7 +26,7 @@
"from autogen_agentchat.conditions import TextMentionTermination\n",
"from autogen_agentchat.teams import RoundRobinGroupChat\n",
"from autogen_agentchat.ui import Console\n",
-"from autogen_core.components.tools import FunctionTool\n",
+"from autogen_core.tools import FunctionTool\n",
"from autogen_ext.models import OpenAIChatCompletionClient"
]
},
@@ -21,7 +21,7 @@
"## Assistant Agent\n",
"\n",
"{py:class}`~autogen_agentchat.agents.AssistantAgent` is a built-in agent that\n",
-"uses a language model with ability to use tools."
+"uses a language model and has the ability to use tools."
]
},
{
@@ -59,8 +59,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"We can call the {py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages` \n",
-"method to get the agent to respond to a message."
+"\n",
+"## Getting Responses\n",
+"\n",
+"We can use the {py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages` method to get the agent response to a given message.\n"
]
},
{
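The request/response flow this cell documents (call `on_messages` with a list of messages, receive a final response) can be sketched with a toy agent. The class names below are illustrative stand-ins, not the AgentChat types; the real `AssistantAgent.on_messages` also takes a `CancellationToken`:

```python
import asyncio
from dataclasses import dataclass
from typing import List


@dataclass
class TextMessage:
    content: str
    source: str


@dataclass
class Response:
    chat_message: TextMessage


class ToyAssistantAgent:
    """Minimal stand-in for an agent exposing an async on_messages method."""

    def __init__(self, name: str) -> None:
        self.name = name

    async def on_messages(self, messages: List[TextMessage]) -> Response:
        # Echo a reply to the last message; a real agent would call a model client.
        last = messages[-1]
        return Response(
            chat_message=TextMessage(content=f"Reply to: {last.content}", source=self.name)
        )


async def main() -> Response:
    agent = ToyAssistantAgent("assistant")
    return await agent.on_messages(
        [TextMessage(content="Find information on AutoGen", source="user")]
    )


response = asyncio.run(main())
print(response.chat_message.content)  # Reply to: Find information on AutoGen
```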
@@ -134,7 +136,7 @@
"source": [
"The User Proxy agent is ideally used for on-demand human-in-the-loop interactions for scenarios such as Just In Time approvals, human feedback, alerts, etc. For slower user interactions, consider terminating the session using a termination condition and start another one from run or run_stream with another message.\n",
"\n",
-"### Stream Messages\n",
+"## Streaming Messages\n",
"\n",
"We can also stream each message as it is generated by the agent by using the\n",
"{py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages_stream` method,\n",
@@ -172,7 +174,7 @@
"\n",
"\n",
"async def assistant_run_stream() -> None:\n",
-" # Option 1: read each message from the stream.\n",
+" # Option 1: read each message from the stream (as shown in the previous example).\n",
" # async for message in agent.on_messages_stream(\n",
" # [TextMessage(content=\"Find information on AutoGen\", source=\"user\")],\n",
" # cancellation_token=CancellationToken(),\n",
@@ -198,12 +200,12 @@
"source": [
"The {py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages_stream` method\n",
"returns an asynchronous generator that yields each inner message generated by the agent,\n",
-"and the last item is the final response message in the {py:attr}`~autogen_agentchat.base.Response.chat_message` attribute.\n",
+"with the final item being the response message in the {py:attr}`~autogen_agentchat.base.Response.chat_message` attribute.\n",
"\n",
-"From the messages, you can see the assistant agent used the `web_search` tool to\n",
-"search for information and responded using the search results.\n",
+"From the messages, you can observe that the assistant agent utilized the `web_search` tool to\n",
+"gather information and responded based on the search results.\n",
"\n",
-"### Understanding Tool Calling\n",
+"## Understanding Tool Calling\n",
"\n",
"Large Language Models (LLMs) are typically limited to generating text or code responses. However, many complex tasks benefit from the ability to use external tools that perform specific actions, such as fetching data from APIs or databases.\n",
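The tool-calling flow this cell describes (the model emits a function call, the runtime executes the named tool, and the result is fed back) can be sketched without any model. The registry and call format below are illustrative assumptions, loosely modeled on a name plus JSON-encoded arguments:

```python
import json
from typing import Any, Callable, Dict

# Registry of available tools, keyed by name.
TOOLS: Dict[str, Callable[..., Any]] = {}


def register_tool(func: Callable[..., Any]) -> Callable[..., Any]:
    TOOLS[func.__name__] = func
    return func


@register_tool
def web_search(query: str) -> str:
    """Pretend search tool; a real one would hit an API."""
    return f"AutoGen is a programming framework (match for {query!r})"


def execute_tool_call(call: Dict[str, Any]) -> str:
    """Run the tool named in a model-emitted function call and return its result."""
    func = TOOLS[call["name"]]
    arguments = json.loads(call["arguments"])
    return str(func(**arguments))


# A function call as a model might emit it: a tool name plus JSON-encoded arguments.
model_call = {"name": "web_search", "arguments": json.dumps({"query": "AutoGen"})}
result = execute_tool_call(model_call)
print(result)
```

In AgentChat this loop runs inside the agent: the tool result is handed back to the model, which then produces the user-facing response.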
"\n",
@@ -233,8 +235,7 @@
"source": [
"## Next Step\n",
"\n",
-"Now we have discussed how to use the {py:class}`~autogen_agentchat.agents.AssistantAgent`,\n",
-"we can move on to the next section to learn how to use the teams feature of AgentChat."
+"Having explored the usage of the {py:class}`~autogen_agentchat.agents.AssistantAgent`, we can now proceed to the next section to learn about the teams feature in AgentChat.\n"
]
},
{
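The stream-then-final-response contract of `on_messages_stream` described earlier (an async generator yields inner messages, and the last item carries the response) can be sketched with a plain async generator. The message types here are toy stand-ins, not the AgentChat classes:

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncGenerator, List, Union


@dataclass
class InnerMessage:
    content: str


@dataclass
class Response:
    chat_message: str


async def on_messages_stream() -> AsyncGenerator[Union[InnerMessage, Response], None]:
    """Yield intermediate messages, then a final Response as the last item."""
    yield InnerMessage("calling web_search tool")
    yield InnerMessage("tool returned results")
    yield Response(chat_message="AutoGen is a framework for building multi-agent applications.")


async def collect() -> List[Union[InnerMessage, Response]]:
    # Consume the whole stream; callers may instead react to each item as it arrives.
    return [item async for item in on_messages_stream()]


items = asyncio.run(collect())
assert isinstance(items[-1], Response)  # the final item carries the chat_message
print(items[-1].chat_message)
```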
