
Missing extra_create_kwargs in AssistantAgent #5212

Open
gagb opened this issue Jan 27, 2025 · 1 comment · May be fixed by #5213

Comments

@gagb
Collaborator

gagb commented Jan 27, 2025

What happened?

The model client may need extra create kwargs (for example, a Semantic Kernel instance), but AssistantAgent in AgentChat provides no option to set them or pass them through to the client.

I discovered this while trying to use the SKChatCompletionAdapter with AssistantAgent + deepseek-r1.

(python) gaganbansal@MacBook-Pro ~ % python ~/Downloads/ollamask.py
Error processing publish message for assistant/d88e2d75-f480-4e3a-990d-692ca482e085
Traceback (most recent call last):
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-core/src/autogen_core/_single_threaded_agent_runtime.py", line 409, in _on_message
    return await agent.on_message(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-core/src/autogen_core/_base_agent.py", line 113, in on_message
    return await self.on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_sequential_routed_agent.py", line 48, in on_message_impl
    return await super().on_message_impl(message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 485, in on_message_impl
    return await h(self, message, ctx)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-core/src/autogen_core/_routed_agent.py", line 268, in wrapper
    return_value = await func(self, message, ctx)  # type: ignore
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_chat_agent_container.py", line 53, in handle_request
    async for msg in self._agent.on_messages_stream(self._message_buffer, ctx.cancellation_token):
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-agentchat/src/autogen_agentchat/agents/_assistant_agent.py", line 386, in on_messages_stream
    model_result = await self._model_client.create(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-ext/src/autogen_ext/models/semantic_kernel/_sk_chat_completion_adapter.py", line 342, in create
    kernel = self._get_kernel(extra_create_args)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/gaganbansal/workspace/autogen/python/packages/autogen-ext/src/autogen_ext/models/semantic_kernel/_sk_chat_completion_adapter.py", line 301, in _get_kernel
    raise ValueError("kernel must be provided either in constructor or extra_create_args")
ValueError: kernel must be provided either in constructor or extra_create_args
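The error message suggests the adapter resolves the kernel in two steps: a per-call value in extra_create_args first, falling back to one supplied at construction. Since AssistantAgent never forwards extra_create_args, the constructor path is the only one reachable from the agent. Below is a minimal, self-contained sketch of that lookup pattern; MockAdapter is hypothetical and not the real SKChatCompletionAdapter.

```python
# Sketch of the lookup implied by the error message
# ("kernel must be provided either in constructor or extra_create_args").
# MockAdapter is a hypothetical stand-in, not the real adapter class.
from typing import Any, Mapping, Optional


class MockAdapter:
    def __init__(self, kernel: Optional[Any] = None):
        self._kernel = kernel  # constructor-provided kernel, if any

    def _get_kernel(self, extra_create_args: Mapping[str, Any]) -> Any:
        # A per-call kernel wins; otherwise fall back to the constructor's.
        kernel = extra_create_args.get("kernel", self._kernel)
        if kernel is None:
            raise ValueError(
                "kernel must be provided either in constructor or extra_create_args"
            )
        return kernel


# With the agent unable to forward extra_create_args, supplying the
# kernel at construction time is the only path that avoids the error.
adapter = MockAdapter(kernel="my-kernel")
assert adapter._get_kernel({}) == "my-kernel"

# Without either source, the ValueError from the traceback is raised.
try:
    MockAdapter()._get_kernel({})
except ValueError as e:
    assert "kernel must be provided" in str(e)
```
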

@lspinheiro @ekzhu

What did you expect to happen?

There should be a way to pass the kernel, or extra create kwargs in general, from the agent to the client. I think the same applies to json_output and similar per-call options.
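One possible shape for such a feature, sketched with mocks rather than the real autogen_agentchat API (the names extra_create_kwargs, create, and on_messages are assumptions here): the agent stores a mapping at construction and splats it into every model client call.

```python
# Hypothetical sketch of the requested behavior: an agent that accepts
# extra_create_kwargs and forwards them on every model client call.
# These classes are illustrative mocks, not the real AutoGen types.
import asyncio
from typing import Any, Dict, List, Optional


class MockModelClient:
    async def create(self, messages: List[str], **extra_create_args: Any) -> Dict[str, Any]:
        # Echo back what was forwarded so the pattern can be inspected.
        return {"messages": messages, "extra": extra_create_args}


class MockAssistantAgent:
    def __init__(
        self,
        model_client: MockModelClient,
        extra_create_kwargs: Optional[Dict[str, Any]] = None,
    ):
        self._model_client = model_client
        self._extra_create_kwargs = extra_create_kwargs or {}

    async def on_messages(self, messages: List[str]) -> Dict[str, Any]:
        # Forward the stored kwargs (e.g. kernel=...) with each create call.
        return await self._model_client.create(messages, **self._extra_create_kwargs)


agent = MockAssistantAgent(MockModelClient(), extra_create_kwargs={"kernel": "k"})
result = asyncio.run(agent.on_messages(["hi"]))
assert result["extra"] == {"kernel": "k"}
```

With this shape, the repro below would pass extra_create_kwargs={"kernel": kernel} to AssistantAgent and the adapter's _get_kernel lookup would succeed.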

How can we reproduce it (as minimally and precisely as possible)?

import asyncio

from semantic_kernel import Kernel
from semantic_kernel.memory.null_memory import NullMemory
from semantic_kernel.connectors.ai.ollama import OllamaChatCompletion, OllamaChatPromptExecutionSettings

from autogen_agentchat.agents import AssistantAgent, CodeExecutorAgent
from autogen_agentchat.conditions import TextMentionTermination, MaxMessageTermination
from autogen_agentchat.ui import Console
from autogen_agentchat.teams import RoundRobinGroupChat

from autogen_ext.models.semantic_kernel import SKChatCompletionAdapter
from autogen_ext.code_executors.local import LocalCommandLineCodeExecutor


async def main():
    # Step 1: Initialize the Semantic Kernel
    kernel = Kernel(memory=NullMemory())

    # Step 2: Configure the Ollama client
    ollama_client = OllamaChatCompletion(
        service_id="ollama",
        host="http://localhost:11434",  # Local Ollama server
        ai_model_id="deepseek-r1",        # Replace with your desired Ollama model
    )
    model_client = SKChatCompletionAdapter(sk_client=ollama_client)

    assistant = AssistantAgent(
        name="assistant",
        system_message="You are a helpful assistant. Write all code in python. If the correct output is generated by code execution by user reply 'TERMINATE'. Code must be returned using ``` ```",
        model_client=model_client
    )

    code_executor = CodeExecutorAgent(
        name="code_executor",
        code_executor=LocalCommandLineCodeExecutor(work_dir="coding"),
    )

    # The termination condition is a combination of text termination and max message termination, either of which will cause the chat to terminate.
    termination = TextMentionTermination("TERMINATE") | MaxMessageTermination(10)

    # The group chat will alternate between the assistant and the code executor.
    group_chat = RoundRobinGroupChat([assistant, code_executor], termination_condition=termination)

    # `run_stream` returns an async generator to stream the intermediate messages.
    stream = group_chat.run_stream(task="Print hello world.")
    # `Console` is a simple UI to display the stream.
    await Console(stream)

asyncio.run(main())

AutoGen version

main

Which package was this bug in

AgentChat

Model used

deepseek-r1

Python version

No response

Operating system

No response

Any additional info you think would be helpful for fixing this bug

No response

@ekzhu
Collaborator

ekzhu commented Jan 27, 2025

@gagb I believe the SK client cannot be used by the agent at the moment. #5134 is a way to address this issue; see the discussion here: #4741
