Releases: jackmpcollins/magentic
v0.39.2
What's Changed
- Add tests for Gemini via openai package by @jackmpcollins in #382
Full Changelog: v0.39.1...v0.39.2
v0.39.1
What's Changed
- Add tests and docs for xAI / Grok via OpenaiChatModel by @jackmpcollins in #433
Full Changelog: v0.39.0...v0.39.1
v0.39.0
What's Changed
- Use TypeVar default to remove overloads by @jackmpcollins in #411
- Add missing Field import in docs by @jackmpcollins in #428
- feat: support for passing extra_headers to LitellmChatModel by @ashwin153 in #426
New Contributors
- @ashwin153 made their first contribution in #426
Full Changelog: v0.38.1...v0.39.0
v0.38.1
What's Changed
Full Changelog: v0.38.0...v0.38.1
v0.38.0
What's Changed
- Async streamed response to api message conversion by @ananis25 in #405
- Support AsyncParallelFunctionCall in message_to_X_message by @jackmpcollins in #406
Full Changelog: v0.37.1...v0.38.0
v0.37.1
What's Changed
Anthropic model message serialization now supports StreamedResponse in AssistantMessage. Thanks to @ananis25 🎉
Full Changelog: v0.37.0...v0.37.1
v0.37.0
What's Changed
The @prompt_chain decorator can now accept a sequence of Message objects as input, like @chatprompt.
```python
from magentic import prompt_chain, UserMessage


def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    return {"temperature": "72", "forecast": ["sunny", "windy"]}


@prompt_chain(
    template=[UserMessage("What's the weather like in {city}?")],
    functions=[get_current_weather],
)
def describe_weather(city: str) -> str: ...


describe_weather("Boston")
# 'The weather in Boston is currently 72°F with sunny and windy conditions.'
```
PRs
- Allow Messages as input to prompt_chain by @jackmpcollins in #403
Full Changelog: v0.36.0...v0.37.0
v0.36.0
What's Changed
Document the Chat class and make it importable from the top level.
docs: https://magentic.dev/chat/
```python
from magentic import Chat, OpenaiChatModel, UserMessage

# Create a new Chat instance
chat = Chat(
    messages=[UserMessage("Say hello")],
    model=OpenaiChatModel("gpt-4o"),
)

# Append a new user message
chat = chat.add_user_message("Actually, say goodbye!")
print(chat.messages)
# [UserMessage('Say hello'), UserMessage('Actually, say goodbye!')]

# Submit the chat to the LLM to get a response
chat = chat.submit()
print(chat.last_message.content)
# 'Hello! Just kidding—goodbye!'
```
PRs
- Use public import for ChatCompletionStreamState by @jackmpcollins in #398
- Make Chat class public and add docs by @jackmpcollins in #401
- Remove unused content None from openai messages by @jackmpcollins in #402
Full Changelog: v0.35.0...v0.36.0
v0.35.0
What's Changed
UserMessage now accepts image urls, image bytes, and document bytes directly using the ImageUrl, ImageBytes, and DocumentBytes types.
Example of the new UserMessage syntax with DocumentBytes:
```python
from pathlib import Path

from magentic import chatprompt, DocumentBytes, Placeholder, UserMessage
from magentic.chat_model.anthropic_chat_model import AnthropicChatModel


@chatprompt(
    UserMessage(
        [
            "Repeat the contents of this document.",
            Placeholder(DocumentBytes, "document_bytes"),
        ]
    ),
    model=AnthropicChatModel("claude-3-5-sonnet-20241022"),
)
def read_document(document_bytes: bytes) -> str: ...


document_bytes = Path("...").read_bytes()
read_document(document_bytes)
# 'This is a test PDF.'
```
PRs
- Accept Sequence[Message] instead of list for Chat by @alexchandel in #390
- Bump astral-sh/setup-uv from 4 to 5 by @dependabot in #393
- Support images directly in UserMessage by @jackmpcollins in #387
- Add DocumentBytes for submitting PDF documents by @jackmpcollins in #395
New Contributors
- @alexchandel made their first contribution in #390
Full Changelog: v0.34.1...v0.35.0
v0.34.1
What's Changed
- Consume LLM output stream via returned objects to allow caching by @jackmpcollins in #384
- Improve ruff format/lint rules by @jackmpcollins in #385
- Update overview and configuration docs by @jackmpcollins in #386
Full Changelog: v0.34.0...v0.34.1