Commit beac3fb: Flatten the outline
gagb committed Dec 9, 2024 (1 parent: c081932)
Showing 1 changed file with 3 additions and 3 deletions.
@@ -60,7 +60,7 @@
     "metadata": {},
     "source": [
      "\n",
-     "### Getting Responses\n",
+     "## Getting Responses\n",
      "\n",
      "We can use the {py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages` method to get the agent response to a given message.\n"
     ]
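The request/response pattern behind `on_messages` can be sketched with a minimal stand-in. This mock is illustrative only, not AutoGen's implementation: the real `AssistantAgent.on_messages` also takes a cancellation token and calls a model client, but the shape (a list of new messages in, a single `Response` out) is the same.

```python
# Hypothetical stand-in for the on_messages request/response pattern.
# TextMessage, Response, and MockAssistantAgent are simplified mocks,
# not AutoGen's actual classes.
import asyncio
from dataclasses import dataclass


@dataclass
class TextMessage:
    content: str
    source: str


@dataclass
class Response:
    chat_message: TextMessage


class MockAssistantAgent:
    def __init__(self, name: str):
        self.name = name

    async def on_messages(self, messages: list[TextMessage]) -> Response:
        # A real agent would forward the messages to a model client;
        # here we simply echo the last message back.
        last = messages[-1].content
        return Response(TextMessage(content=f"You said: {last}", source=self.name))


agent = MockAssistantAgent("assistant")
response = asyncio.run(agent.on_messages([TextMessage("Hello!", "user")]))
print(response.chat_message.content)  # → You said: Hello!
```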
@@ -136,7 +136,7 @@
     "source": [
      "The User Proxy agent is best suited to on-demand, human-in-the-loop interactions such as just-in-time approvals, human feedback, and alerts. For slower user interactions, consider terminating the session with a termination condition and starting a new one from `run` or `run_stream` with another message.\n",
      "\n",
-     "### Streaming Messages\n",
+     "## Streaming Messages\n",
      "\n",
      "We can also stream each message as it is generated by the agent by using the\n",
      "{py:meth}`~autogen_agentchat.agents.AssistantAgent.on_messages_stream` method,\n",
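The streaming pattern the hunk describes can be sketched with a plain async generator: each message (here, each chunk) is yielded as soon as it is produced, so the caller can react before the full response is complete. This is an illustration of the pattern only, not AutoGen's `on_messages_stream` implementation.

```python
# Minimal sketch of streaming via an async generator; stream_chunks and
# collect are hypothetical names, not part of the AutoGen API.
import asyncio
from typing import AsyncGenerator


async def stream_chunks(text: str) -> AsyncGenerator[str, None]:
    for word in text.split():
        await asyncio.sleep(0)  # stand-in for per-chunk generation latency
        yield word  # caller sees each chunk immediately


async def collect() -> list[str]:
    # Consume the stream chunk by chunk, as a UI or logger would.
    return [chunk async for chunk in stream_chunks("streaming keeps users informed")]


print(asyncio.run(collect()))  # → ['streaming', 'keeps', 'users', 'informed']
```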
@@ -205,7 +205,7 @@
     "From the messages, you can observe that the assistant agent utilized the `web_search` tool to\n",
     "gather information and responded based on the search results.\n",
     "\n",
-     "### Understanding Tool Calling\n",
+     "## Understanding Tool Calling\n",
     "\n",
     "Large Language Models (LLMs) are typically limited to generating text or code responses. However, many complex tasks benefit from the ability to use external tools that perform specific actions, such as fetching data from APIs or databases.\n",
     "\n",
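The tool-calling loop described above can be sketched in a few lines: the model requests a tool by name with arguments, the framework executes the matching function, and the result is fed back as a new message. Everything here is a hypothetical stand-in; `web_search` is a mock, not the tool used in the tutorial, and `execute_tool_call` is not an AutoGen function.

```python
# Hedged sketch of one step of a tool-calling loop. web_search and
# execute_tool_call are illustrative stand-ins, not AutoGen APIs.
def web_search(query: str) -> str:
    """Stand-in tool; a real version would query a search API."""
    return f"results for '{query}'"


# Registry mapping tool names (as the model would emit them) to functions.
TOOLS = {"web_search": web_search}

def execute_tool_call(name: str, arguments: dict) -> str:
    # The framework dispatches the model's requested call to the matching
    # function and returns its result for the model's next turn.
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)


print(execute_tool_call("web_search", {"query": "AutoGen agents"}))
# → results for 'AutoGen agents'
```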
