fix: store pubsubfrontend clone in rpcinner #1977
Conversation
Review comment on these diff lines:
let rpc_client = RpcClient::builder().layer(retry_layer).ws(ws).await?;
let provider = ProviderBuilder::new().disable_recommended_fillers().on_client(rpc_client);
nit: no need for this
Suggested change:
- let provider = ProviderBuilder::new().disable_recommended_fillers().on_client(rpc_client);
+ let provider = ProviderBuilder::new().on_client(rpc_client);
also i wish we had the ability to configure layers on the ProviderBuilder w/o having to drop to RpcClient, so you'd just do ProviderBuilder::new().layer(retry_layer).on_builtin(url).await?
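For context, a minimal self-contained sketch of the pattern under discussion: today, custom layers go on the RpcClient builder and the finished client is handed to the ProviderBuilder. The RetryBackoffLayer construction, its arguments, and the exact import paths below are assumptions for illustration; only the two builder calls themselves come from the diff above.

```rust
use alloy_provider::ProviderBuilder;
use alloy_rpc_client::RpcClient;
use alloy_transport::layers::RetryBackoffLayer; // assumed import path
use alloy_transport_ws::WsConnect;

async fn connect(url: &str) -> Result<(), Box<dyn std::error::Error>> {
    // Illustrative parameters: max retries, initial backoff (ms), compute units per second.
    let retry_layer = RetryBackoffLayer::new(10, 300, 100);
    let ws = WsConnect::new(url);

    // Layers are configured on the RpcClient builder ...
    let rpc_client = RpcClient::builder().layer(retry_layer).ws(ws).await?;

    // ... and the finished client is handed to the ProviderBuilder.
    let _provider = ProviderBuilder::new().on_client(rpc_client);
    Ok(())
}
```

The ProviderBuilder::new().layer(retry_layer).on_builtin(url) shape wished for above is not an existing API here; it is only the ergonomic form the comment is asking for.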
lgtm, the only thing would be that it creates a copy of the internal state instead of sharing it, so we might want to arc pubsubfrontend internally?
can you elaborate on this? unclear what state you mean
the PubSubFrontend struct itself is now both inside of the transport and in the new field
this is fine because this is just another copy of the sender half (alloy/crates/pubsub/src/frontend.rs, lines 18 to 22 in 7fb7a51), which also gets dropped when the transport drops
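To make the "just another copy of the sender half" point concrete, here is a rough sketch of the shape of the frontend handle. The field and type names are approximations, not the actual definition; see the linked frontend.rs lines for the real struct.

```rust
use tokio::sync::mpsc;

// Placeholder for the instruction type the pubsub backend consumes.
enum Instruction {
    Request,     // stand-in for a serialized JSON-RPC request
    Unsubscribe, // stand-in for unsubscribe bookkeeping
}

// Approximate shape of the frontend: essentially a handle around an
// unbounded channel sender. Cloning it clones only the sender, i.e.
// another handle to the same channel; no connection state is duplicated.
#[derive(Clone)]
struct FrontendSketch {
    tx: mpsc::UnboundedSender<Instruction>,
}
```

This is why the extra clone is cheap: it duplicates a sender handle, not connection state.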
closes #1972
this is a bit cursed.
This is described in the docs, but because of how alloy/crates/rpc-client/src/builder.rs (line 44 in bb77d2f) creates the transport service object when there's an additional layer, the downcast in alloy/crates/rpc-client/src/client.rs (lines 211 to 212 in bb77d2f) does not work.
The solution is a workaround that captures the PubSubFrontend before we collapse the service builder.
This can still be bypassed manually via alloy/crates/rpc-client/src/client.rs (line 163 in bb77d2f), but all the builtin transport fns should now capture the PubSubFrontend.
The PubSubFrontend is just a channel, so capturing an additional clone seems fine.
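To illustrate the failure mode and the workaround in isolation, here is a hedged sketch with simplified stand-in types (not the actual alloy internals): once the builder wraps the pubsub service in an extra layer, the erased value is no longer the frontend type itself, so a downcast to it returns None; cloning the frontend before the wrapping sidesteps the downcast entirely.

```rust
use std::any::Any;

// Simplified stand-ins: `Frontend` plays the role of the pubsub frontend,
// `Layered<S>` the role of any additional layer applied by the builder.
#[derive(Clone)]
struct Frontend;
struct Layered<S>(S);

// Mirrors the spirit of the downcast: it only succeeds when the erased
// service is literally a `Frontend`.
fn downcast_frontend(service: &dyn Any) -> Option<&Frontend> {
    service.downcast_ref::<Frontend>()
}

fn main() {
    let plain: Box<dyn Any> = Box::new(Frontend);
    let layered: Box<dyn Any> = Box::new(Layered(Frontend));

    assert!(downcast_frontend(plain.as_ref()).is_some());
    // With a layer in between, the concrete type is `Layered<Frontend>`,
    // so the downcast fails even though a frontend sits inside.
    assert!(downcast_frontend(layered.as_ref()).is_none());

    // Workaround in spirit: clone the frontend *before* it gets wrapped
    // and keep the clone next to the erased service, so pubsub access
    // no longer depends on downcasting.
    let frontend = Frontend;
    let pubsub_handle = frontend.clone();
    let _service: Box<dyn Any> = Box::new(Layered(frontend));
    let _ = pubsub_handle;
}
```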