
feat: add ChatInterface - wrapper around typical chat use-case #478

Open
mhordynski opened this issue Mar 31, 2025 · 0 comments

Feature description

The goal of this task is to create a minimal wrapper around the typical use case we develop: a chat interface. ChatInterface should accept standardized inputs and stream text and rich elements.

Input:

  • message: the current message from the user
  • history: previous messages in the conversation
  • context: anything extra coming from the API (for example, user info)

Output (yields):

  • str: Regular text responses streamed chunk by chunk
  • Reference: Source documents used to generate the answer
  • LiveUpdate: Status updates during processing (tool-call, searching, etc.)
  • Action: Suggested actions for the user to take (follow-up questions, write action, etc.)
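
For illustration, here is a minimal sketch of what these inputs, response types, and the ChatInterface contract could look like. The field names and the Message dataclass are assumptions made only for this sketch; the type names come from the lists above and the create_* helpers mirror those used in the example under "Additional context".

from abc import ABC, abstractmethod
from collections.abc import AsyncGenerator
from dataclasses import dataclass, field


@dataclass
class Message:
    """A single message from the conversation history (assumed shape)."""
    role: str  # e.g. "user", "assistant", "system"
    content: str


@dataclass
class Reference:
    """Source document used to generate the answer (assumed fields)."""
    title: str
    content: str
    url: str | None = None


@dataclass
class LiveUpdate:
    """Status update emitted during processing (tool call, searching, etc.)."""
    message: str


@dataclass
class Action:
    """Suggested action for the user to take (follow-up question, write action, etc.)."""
    label: str
    payload: dict = field(default_factory=dict)


# A streamed chunk is either plain text or one of the rich elements above.
ChatResponse = str | Reference | LiveUpdate | Action


class ChatInterface(ABC):
    """Contract for chat-like workloads: standardized inputs, streamed outputs."""

    # Convenience helpers matching the ones used in the example implementation.
    def create_text_response(self, text: str) -> str:
        return text

    def create_reference(self, reference: Reference) -> Reference:
        return reference

    def create_live_update(self, message: str) -> LiveUpdate:
        return LiveUpdate(message=message)

    @abstractmethod
    def chat(
        self,
        message: str,
        history: list[Message] | None = None,
        context: dict | None = None,
    ) -> AsyncGenerator[ChatResponse, None]:
        """Yield text chunks and rich elements for the given user message."""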

Motivation

While #330 will provide a generic approach to defining workflows, we still need a dead-simple way of orchestrating any chat-like workload without forcing users to write code in any specific way.

The idea behind ChatInterface is to provide lightweight guidelines on what input and output can be handled in such a scenario, while allowing the user to write any code using components already provided in ragbits.

Additional context

import asyncio
from collections.abc import AsyncGenerator

# Message, ChatResponse, and get_document_search are assumed to be provided by ragbits.


class SimpleChatImplementation(ChatInterface):
    """A simple example implementation of the ChatInterface that demonstrates different response types."""

    async def chat(
        self,
        message: str,
        history: list[Message] | None = None,
        context: dict | None = None,
    ) -> AsyncGenerator[ChatResponse, None]:
        # Let the client know that retrieval is in progress.
        yield self.create_live_update(
            message="Searching for relevant documents...",
        )

        # Retrieve source documents and stream them as references.
        references = await get_document_search().search(message)
        for reference in references:
            yield self.create_reference(reference)

        # Stream the text answer chunk by chunk.
        for word in "Hello my name is John Doe, I'm a software engineer and I'm looking for a new job. Please help me find a new job.".split():
            yield self.create_text_response(word + " ")
            await asyncio.sleep(0.05)
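
As a usage sketch (not part of the proposal), a consumer could drive the async generator and dispatch on the yielded element type; the handling below is illustrative and assumes the types sketched earlier:

import asyncio


async def run_chat(chat: ChatInterface) -> None:
    # Iterate over the streamed responses and handle each element type.
    async for element in chat.chat("What documents do we have about onboarding?"):
        if isinstance(element, str):
            print(element, end="", flush=True)  # text streamed chunk by chunk
        elif isinstance(element, LiveUpdate):
            print(f"\n[status] {element.message}")
        elif isinstance(element, Action):
            print(f"\n[action] {element.label}")
        else:
            print(f"\n[reference] {element}")  # source document


asyncio.run(run_chat(SimpleChatImplementation()))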
@mhordynski added the feature New feature or request label Mar 31, 2025
@mhordynski moved this to Backlog in ragbits Mar 31, 2025
@mhordynski self-assigned this Mar 31, 2025