Replies: 1 comment 1 reply
Response from ADK Answering Agent (experimental, answer may be inaccurate)

TLDR: The easiest way to ensure the model receives your modified question is to use a different variable name in your agent's instruction.

Hello! Thanks for the detailed question. Here's an explanation of what's happening and how you can achieve your goal.

1. How to ensure the model gets the modified question

Your current approach isn't working because of the order in which ADK resolves variables for prompt templating. When you provide a state key with the same name as a variable that is also populated from the incoming user message (here, `user_question`), the value from the user's message takes precedence over the value you wrote to session state, so your modification never reaches the instruction.

The most straightforward solution is to avoid this variable name collision.

Recommended Change:
Here is the updated code:

```python
from typing import Optional

from google.adk.agents import LlmAgent
from google.adk.agents.callback_context import CallbackContext
from google.genai import types

# In your callback, use a new key, e.g., 'modified_question'
def save_question_in_state(callback_context: CallbackContext) -> Optional[types.Content]:
    # Change question to ask for 'LOWEST' price.
    callback_context.state['modified_question'] = 'what was the LOWEST price of aapl in Jan 2025?'
    return None

# Update the instruction to refer to the new state variable
def get_model_instruction() -> str:
    return """blah blah blah {modified_question} blah blah blah"""

# Your agent definition remains the same
question_agent = LlmAgent(
    name="question_agent",
    model="gemini-2.5-flash-preview-09-2025",
    instruction=get_model_instruction(),
    description=get_agent_description(),
    before_agent_callback=save_question_in_state,
)
```

By using a new key (`modified_question`) that does not collide with anything derived from the incoming user message, the value you set in the callback is what gets interpolated into the instruction.

2. Documentation on Variable Interpolation

You're correct that this behavior stems from how ADK handles variable interpolation. The framework automatically makes values available for templating in your agent's instructions. While the documentation may not specify a rigid order of precedence, the behavior you've observed is intentional: variables from the immediate context (like the current user message) are prioritized over variables from the general session state. This ensures that agents are responsive to the most recent user input.

The documentation on Session State mentions that you can inject values from the state into an instruction using the `{key}` templating syntax. For more advanced scenarios, such as modifying the final request sent to the LLM, you could also use the `before_model_callback`.

Hopefully, this clarifies things!

[1] Session State - ADK Documentation (https://google.github.io/adk-docs/sessions/state/)
In a pure test scenario, I am trying to use the before_agent_callback to change a user question before it gets processed by the LLM. In the code below, before_agent_callback saves the changed question to the session state key 'user_question'. Then I try to include the updated question in the model instructions by using the variable {user_question}.
In the code below, a user asks for the highest price of AAPL in Jan 2025; the callback changes it to ask for the lowest price, then returns None so the operation proceeds. I would expect the LLM to answer the lowest-price question, but it doesn't. It answers the highest-price question.
At the very bottom of the code snippet, I print the session, and you can see that state['user_question'] contains the lowest-price question while the session events contain the user-supplied highest-price question.
I guess I have a few questions:
--------Begin Code ---------