Prompt-functions should be able to create other prompt functions. This would allow for some form of prompt-optimization and self-improvement.
Example:
```python
from magentic import prompt, PromptFunction

@prompt("Create a prompt-function that returns the sum of two numbers in natural language")
def make_language_plus() -> PromptFunction[[int, int], str]: ...

# Query the LLM to create the "language_plus" prompt-function
language_plus = make_language_plus()

# Query the LLM using the new prompt-function
output = language_plus(2, 3)
print(output)
# 'five'
```
"""Base class for an LLM prompt template that is directly callable to query the LLM."""
def__init__(
self,
name: str,
parameters: Sequence[inspect.Parameter],
return_type: type[R],
template: str,
functions: list[Callable[..., Any]] |None=None,
stop: list[str] |None=None,
max_retries: int=0,
model: ChatModel|None=None,
):
Due to its parameters, `PromptFunction` cannot inherit from `BaseModel` and be immediately serializable, so it will require some custom handling for (de)serialization.

- `inspect.Parameter` should be replaced by an equivalent pydantic model which is serializable.
- `return_type` and `functions`: For serialization, convert each Python object to its import path. For generation, the LLM can return the import path of the type/function as a string, and pydantic's `ImportString` can be used to convert this to a Python object. The prompt will have to list the available types/functions for the LLM. The retry logic might need updating to catch errors from this. https://docs.pydantic.dev/2.0/usage/types/string_types/#importstring
- `ChatModel`: Should be able to make all `ChatModel` subclasses inherit from `BaseModel`, as their init parameters are all simple types. And/or this could also be specifiable as a string, e.g. `"openai:gpt-4o"`. See #416
Related discussion: #312