
feat(client): allow passing async httpx client #1557

Open
hassiebp wants to merge 2 commits into main from
hassieb/lfe-8749-bug-custom-httpx_client-is-not-applied-to-clientasync_api-in

Conversation

@hassiebp
Contributor

@hassiebp hassiebp commented Mar 6, 2026

Disclaimer: Experimental PR review

Greptile Summary

This PR adds support for passing a custom async_httpx_client (httpx.AsyncClient) to the Langfuse constructor, mirroring the existing httpx_client (sync) parameter. The async client is stored on LangfuseResourceManager, passed to AsyncLangfuseAPI, and preserved through get_client() — enabling users who need custom transport settings (e.g. proxies, SSL, connection limits) on client.async_api to supply their own client.

Key changes:

  • langfuse/_client/client.py: New async_httpx_client parameter added to __init__ with docstring update.
  • langfuse/_client/resource_manager.py: Parameter threaded into _initialize_instance; internally creates httpx.AsyncClient if not provided; passes it to AsyncLangfuseAPI (previously AsyncLangfuseAPI created its own internal client).
  • langfuse/_client/get_client.py: async_httpx_client=instance.async_httpx_client added to _create_client_from_instance so get_client() preserves the client across singleton lookups.
  • Tests cover custom client propagation through both direct construction and get_client().

The implementation is clean, follows existing patterns, and is well-tested. Changes correctly thread the parameter through all relevant layers with proper end-to-end testing.
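The fallback described in the key changes (use the caller's async client, otherwise create one internally) can be sketched as follows. This is an illustrative sketch only: the class and attribute names mirror the PR description but are hypothetical stand-ins, and a plain object replaces `httpx.AsyncClient` so the sketch runs without dependencies.

```python
class StubAsyncClient:
    """Dependency-free stand-in for httpx.AsyncClient (hypothetical)."""

    def __init__(self, timeout=None):
        self.timeout = timeout


class ResourceManagerSketch:
    """Sketch of the client handling described in the PR summary."""

    def __init__(self, async_httpx_client=None, timeout=20.0):
        # Prefer the caller-supplied async client; otherwise create a default,
        # mirroring the branch described for LangfuseResourceManager.
        if async_httpx_client is not None:
            self.async_httpx_client = async_httpx_client
        else:
            self.async_httpx_client = StubAsyncClient(timeout=timeout)


# A custom client is kept as-is; omitting it yields an internal default.
custom = StubAsyncClient(timeout=5.0)
manager = ResourceManagerSketch(async_httpx_client=custom)
default_manager = ResourceManagerSketch()
```

In the real SDK the default branch would construct an `httpx.AsyncClient` and hand it to `AsyncLangfuseAPI`; the point of the sketch is only the "provided or created" decision.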

Confidence Score: 5/5

  • Safe to merge — the feature is well-scoped, follows existing patterns, is thoroughly tested, and introduces no functional regressions.
  • The new async_httpx_client parameter is correctly threaded through all layers (client → resource manager → async API), is preserved across get_client() calls, and is covered end-to-end by tests. Changes are minimal and follow established patterns; no functional issues or regressions identified.
  • No files require special attention

Sequence Diagram

sequenceDiagram
    participant User
    participant Langfuse
    participant ResourceManager as LangfuseResourceManager
    participant AsyncAPI as AsyncLangfuseAPI

    User->>Langfuse: Langfuse(async_httpx_client=custom_client)
    Langfuse->>ResourceManager: LangfuseResourceManager(..., async_httpx_client=custom_client)
    alt async_httpx_client provided
        ResourceManager->>ResourceManager: self.async_httpx_client = custom_client
    else async_httpx_client is None
        ResourceManager->>ResourceManager: self.async_httpx_client = httpx.AsyncClient(...)
    end
    ResourceManager->>AsyncAPI: AsyncLangfuseAPI(..., httpx_client=self.async_httpx_client)
    AsyncAPI-->>ResourceManager: async_api instance
    ResourceManager-->>Langfuse: resources ready
    Langfuse-->>User: client ready (client.async_api uses custom_client)

    User->>Langfuse: get_client()
    Langfuse->>ResourceManager: _create_client_from_instance(instance)
    ResourceManager-->>Langfuse: Langfuse(..., async_httpx_client=instance.async_httpx_client)
    Langfuse-->>User: client with preserved async_httpx_client
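The second half of the diagram, preserving the client across `get_client()`, amounts to copying the stored client into the newly constructed instance. A dependency-free sketch of that hand-off, with all names hypothetical stand-ins for the Langfuse internals:

```python
class InstanceSketch:
    """Stand-in for a registered resource-manager instance (hypothetical)."""

    def __init__(self, async_httpx_client):
        self.async_httpx_client = async_httpx_client


class ClientSketch:
    """Stand-in for the Langfuse client object (hypothetical)."""

    def __init__(self, async_httpx_client=None):
        self.async_httpx_client = async_httpx_client


def create_client_from_instance(instance):
    # Mirrors the described _create_client_from_instance change: forward the
    # stored async client so repeated lookups keep using the caller's client.
    return ClientSketch(async_httpx_client=instance.async_httpx_client)


custom = object()  # any caller-supplied client object
client = create_client_from_instance(InstanceSketch(async_httpx_client=custom))
```

Without this forwarding step, each `get_client()` call would rebuild the client with a fresh internal `httpx.AsyncClient`, silently dropping the caller's transport settings.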

Last reviewed commit: f47f6fb


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: f47f6fbde7

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

@hassiebp
Contributor Author

@codex review

@chatgpt-codex-connector

Codex Review: Didn't find any major issues. More of your lovely PRs please.


hassiebp force-pushed the hassieb/lfe-8749-bug-custom-httpx_client-is-not-applied-to-clientasync_api-in branch from c7ffa7e to f47f6fb on March 16, 2026 16:44
hassiebp enabled auto-merge (squash) on March 16, 2026 16:44

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 9f5b8c8800


Comment on lines +250 to 251
httpx_client=self.async_httpx_client,
timeout=timeout,


P2: Preserve custom async client timeouts

When a caller provides async_httpx_client, this still forwards timeout to AsyncLangfuseAPI, and AsyncLangfuseAPI prefers that explicit timeout over the client's own timeout configuration. Because Langfuse.__init__ resolves timeout to a concrete value by default, requests from client.async_api are forced to that value instead of honoring the custom httpx.AsyncClient settings. This makes custom async clients partially ineffective and can cause unexpected request timeouts (for example, long-running endpoints being cut off at the SDK default).
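The precedence problem the reviewer describes can be reduced to a small sketch (all names hypothetical; this is not the actual AsyncLangfuseAPI code): once an explicit timeout is always resolved to a concrete value and forwarded, it shadows whatever timeout the custom client was configured with.

```python
class StubClient:
    """Stand-in for httpx.AsyncClient carrying its own timeout config."""

    def __init__(self, timeout):
        self.timeout = timeout


def effective_timeout(client, explicit_timeout):
    # Mirrors the described behavior: an explicit timeout, when set,
    # wins over the client's own configuration.
    if explicit_timeout is not None:
        return explicit_timeout
    return client.timeout


# The caller configures a long-running client, but the SDK resolves a
# default timeout and forwards it, so the custom setting is ignored.
long_running = StubClient(timeout=300.0)
forced = effective_timeout(long_running, explicit_timeout=20.0)

# Only if the SDK forwarded None would the client's own timeout apply.
honored = effective_timeout(long_running, explicit_timeout=None)
```

A possible fix along these lines would be to skip forwarding the resolved default timeout whenever a custom async client is supplied, so the client's own configuration takes effect.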


