Performance optimizations #2424
Conversation
📝 Walkthrough

This update introduces Node.js 24-focused performance optimizations for the `@whatwg-node/node-fetch` and `@whatwg-node/server` packages.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant NodeServer
    participant ServerAdapter
    participant RequestHandler

    Client->>NodeServer: Sends HTTP Request
    NodeServer->>ServerAdapter: Passes request, response, options (__useCustomAbortCtrl, __useSingleWriteHead)
    ServerAdapter->>RequestHandler: Normalizes request (may use custom abort controller)
    RequestHandler-->>ServerAdapter: Returns response
    ServerAdapter->>NodeServer: Sends response (may use single writeHead or setHeaders)
    NodeServer-->>Client: Returns HTTP Response
```
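For orientation, a minimal usage sketch of the flow shown above; it mirrors the `createServerAdapter(() => Response.json(...))` setup used in the benchmarks later in this PR and does not introduce any new API.

```ts
import { createServer } from 'node:http';
import { createServerAdapter } from '@whatwg-node/server';

// The adapter turns a fetch-style handler (Request -> Response) into a
// Node.js (req, res) listener; the optimizations in this PR live in that
// request/response bridging layer.
const adapter = createServerAdapter(() => Response.json({ hello: 'world' }));

createServer(adapter).listen(4000, () => {
  console.log('listening on http://localhost:4000');
});
```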
🚀 Snapshot Release

| Package | Version | Info |
| --- | --- | --- |
| `@whatwg-node/fetch` | `0.10.8-alpha-20250520114553-b3c1d944611af62fcdc5669e94e8ddc3db8bc593` | npm ↗︎ unpkg ↗︎ |
| `@whatwg-node/node-fetch` | `0.7.21-alpha-20250520114553-b3c1d944611af62fcdc5669e94e8ddc3db8bc593` | npm ↗︎ unpkg ↗︎ |
| `@whatwg-node/server` | `0.10.9-alpha-20250520114553-b3c1d944611af62fcdc5669e94e8ddc3db8bc593` | npm ↗︎ unpkg ↗︎ |
Actionable comments posted: 1
🧹 Nitpick comments (1)
packages/node-fetch/src/utils.ts (1)
`89-91`: **Consider potential edge cases with the custom `isArray` implementation.**

This new utility checks for array characteristics but might return false positives for array-like objects (NodeList, `arguments`, or any object exposing these properties). While likely more performant than `Array.isArray()`, ensure it's only used with inputs where these edge cases won't occur.

If you need strict array checking, consider:

```diff
 export function isArray<T>(value: any): value is T[] {
-  return value?.length && value?.map && value?.slice && value?.splice;
+  return value?.length !== undefined && typeof value?.map === 'function' &&
+    typeof value?.slice === 'function' && typeof value?.splice === 'function';
 }
```

Or, to eliminate all edge cases:

```diff
 export function isArray<T>(value: any): value is T[] {
-  return value?.length && value?.map && value?.slice && value?.splice;
+  return Array.isArray(value);
 }
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (13)

- `.changeset/curvy-seas-smoke.md` (1 hunks)
- `.changeset/stupid-candies-fetch.md` (0 hunks)
- `.github/workflows/benchmarks.yml` (0 hunks)
- `packages/node-fetch/src/Body.ts` (5 hunks)
- `packages/node-fetch/src/Headers.ts` (8 hunks)
- `packages/node-fetch/src/Request.ts` (1 hunks)
- `packages/node-fetch/src/Response.ts` (2 hunks)
- `packages/node-fetch/src/fetchCurl.ts` (3 hunks)
- `packages/node-fetch/src/fetchNodeHttp.ts` (3 hunks)
- `packages/node-fetch/src/utils.ts` (1 hunks)
- `packages/server/src/createServerAdapter.ts` (2 hunks)
- `packages/server/src/plugins/useCors.ts` (2 hunks)
- `packages/server/src/utils.ts` (2 hunks)
💤 Files with no reviewable changes (2)
- .github/workflows/benchmarks.yml
- .changeset/stupid-candies-fetch.md
🧰 Additional context used
🧬 Code Graph Analysis (6)
- `packages/server/src/plugins/useCors.ts` (1)
  - `packages/server/src/utils.ts` (1): `isArray` (645-647)
- `packages/node-fetch/src/fetchCurl.ts` (1)
  - `packages/node-fetch/src/Request.ts` (1): `signal` (113-116)
- `packages/node-fetch/src/utils.ts` (1)
  - `packages/server/src/utils.ts` (1): `isArray` (645-647)
- `packages/node-fetch/src/Response.ts` (2)
  - `packages/node-fetch/src/Headers.ts` (1): `isHeadersLike` (7-9)
  - `packages/node-fetch/src/utils.ts` (1): `isArray` (89-91)
- `packages/node-fetch/src/Headers.ts` (1)
  - `packages/node-fetch/src/utils.ts` (1): `isArray` (89-91)
- `packages/node-fetch/src/Body.ts` (2)
  - `packages/node-fetch/src/Blob.ts` (3): `PonyfillBlob` (72-321), `PonyfillBlob` (323-326), `text` (193-217)
  - `packages/promise-helpers/src/index.ts` (1): `fakePromise` (47-94)
🔇 Additional comments (20)
packages/server/src/plugins/useCors.ts (1)

- `2-2`: **Import change aligns with the optimization pattern.** Replacing `Array.isArray()` with the custom `isArray()` utility is consistent with the optimization approach used elsewhere in the codebase. Also applies to: 50-50

packages/node-fetch/src/fetchCurl.ts (2)

- `38-45`: **Well-optimized abort signal handling.** This approach avoids unnecessary creation of AbortController objects by prioritizing `_signal` when available. The code properly handles the null case and only falls back to the getter property, which might create a new signal, as a last resort (see the sketch after these comments).
- `48-48`: **Signal reference updates for consistency.** All references to `fetchRequest.signal` are properly updated to use the new local `signal` variable, ensuring consistent signal handling throughout the function. Also applies to: 107-107, 114-114, 136-136
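A rough sketch of the signal-selection logic described in the `38-45` comment above; the `_signal` backing field and lazy `signal` getter follow the review text, and the function name is illustrative rather than copied from the source.

```ts
interface LazySignalRequest {
  // null means "explicitly no signal"; undefined means "not created yet".
  _signal?: AbortSignal | null;
  // Getter that lazily creates an AbortController/AbortSignal on first access.
  readonly signal: AbortSignal;
}

// Prefer an already-materialized signal and only touch the lazy getter
// (which may allocate a controller) as a last resort.
function resolveSignal(fetchRequest: LazySignalRequest): AbortSignal | undefined {
  if (fetchRequest._signal === null) {
    return undefined; // the caller opted out; don't force controller creation
  }
  return fetchRequest._signal ?? fetchRequest.signal;
}
```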
.changeset/curvy-seas-smoke.md (1)
- `1-14`: **Clear and comprehensive changeset documentation.** The changeset clearly documents all the performance optimizations being implemented, providing excellent context for reviewers and future maintainers.
packages/node-fetch/src/Request.ts (1)

- `112-116`: **Great performance optimization with lazy signal initialization.** This change improves performance by lazily initializing the abort signal only when needed, rather than eagerly creating it in the constructor. This avoids the expensive creation of `AbortController` and `AbortSignal` objects when they're not actually used.

```diff
- signal: AbortSignal = new AbortController().signal; // or similar eager initialization
+ get signal(): AbortSignal {
+   this._signal ||= new AbortController().signal;
+   return this._signal;
+ }
```

packages/server/src/createServerAdapter.ts (2)
- `4-4`: **New import for promise utilities.** Adding `unfakePromise` and `MaybePromise` to support the refactored promise chain below.
- `299-313`: **Improved async control flow with promise chaining.** This refactoring improves code readability by replacing nested callback-style `handleMaybePromise` calls with a cleaner promise chain. The new approach maintains the same logic flow while making the sequence of operations more explicit.

packages/node-fetch/src/fetchNodeHttp.ts (2)
- `43-51`: **Centralized abort signal management.** Good improvement to centralize the logic for selecting which abort signal to use. This supports the lazy initialization pattern introduced in `Request.ts` and makes the code more maintainable by reducing duplication.
- `60-61`: **Consistent signal usage across all call sites.** These changes correctly update all call sites to use the centralized `signal` variable instead of directly accessing `fetchRequest.signal`, ensuring consistent behavior across the entire fetch implementation. Also applies to: 67-68, 123-124, 138-139, 145-146
packages/node-fetch/src/Headers.ts (1)

- `35-35`: **Performance optimization for Headers manipulation.** This optimization prevents unnecessary map creation by directly manipulating the original `headersInit` data structure when setting header values. The code handles the different types of header initialization objects (arrays, Headers-like objects, and plain objects) appropriately. This aligns with the PR objective of preventing the creation of a new map for `Headers` by attempting to reuse the initialization object, and will improve performance in scenarios where a single `Headers` object is reused. Also applies to: 76-76, 155-169, 189-189, 218-218, 243-243
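To make the reuse idea concrete, here is a simplified sketch (illustrative names, not the `PonyfillHeaders` internals): writes go straight into the init object that was passed in, so no `Map` has to be allocated for the common case where a single plain object backs the headers.

```ts
type PlainHeadersInit = Record<string, string | string[]>;

// Simplified model of "reuse the init object instead of building a Map".
// Case normalization and the array / Headers-like init branches are omitted.
class HeadersSketch {
  constructor(private headersInit: PlainHeadersInit = {}) {}

  set(name: string, value: string): void {
    // Mutate the caller-provided object directly instead of copying it.
    this.headersInit[name.toLowerCase()] = value;
  }

  get(name: string): string | null {
    const value = this.headersInit[name.toLowerCase()];
    if (value == null) {
      return null;
    }
    return Array.isArray(value) ? value.join(', ') : value;
  }
}
```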
packages/node-fetch/src/Response.ts (2)

- `4-4`: **Good addition of utility import.** The import of `isArray` from `./utils.js` follows the pattern of using utility functions for type checking, which is consistent with the codebase's approach.
- `68-91`: **Excellent header handling optimization in the `json` method.** This refactoring improves performance by handling different header formats more efficiently. The key optimizations are:
  - Making the `init` parameter optional rather than creating an empty object by default
  - Handling multiple header formats (objects, arrays, Headers-like objects) with type-specific logic
  - Only modifying headers when the content-type isn't already set

  This change aligns with the PR objectives of reducing unnecessary object creation and improving efficiency in handling headers (a sketch of this branching follows below).
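A sketch of that branching, assuming `init.headers` may arrive as a plain object, an entries array, or a Headers-like object; the function name and structure are illustrative, not the actual `Response.json` source.

```ts
function ensureJsonContentType(init: ResponseInit = {}): ResponseInit {
  const headers = init.headers;
  if (headers == null) {
    // No headers given: supply only the content-type, no Headers instance needed.
    return { ...init, headers: { 'content-type': 'application/json' } };
  }
  if (Array.isArray(headers)) {
    // Entries array: append only if the caller didn't already set a content-type.
    if (!headers.some(([name]) => name.toLowerCase() === 'content-type')) {
      headers.push(['content-type', 'application/json']);
    }
  } else if (typeof (headers as Headers).has === 'function') {
    // Headers-like object.
    const headersObj = headers as Headers;
    if (!headersObj.has('content-type')) {
      headersObj.set('content-type', 'application/json');
    }
  } else {
    // Plain record.
    const record = headers as Record<string, string>;
    if (!Object.keys(record).some(name => name.toLowerCase() === 'content-type')) {
      record['content-type'] = 'application/json';
    }
  }
  return init;
}
```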
packages/server/src/utils.ts (3)

- `318-357`: **Excellent header optimization strategy.** This implementation significantly improves performance by:
  - Directly using the `headersInit` object when possible
  - Utilizing the `setHeaders` method (available since Node.js 18) for efficient header forwarding
  - Implementing special handling for `set-cookie` headers, which often require array support

  These optimizations align perfectly with the PR objectives of preventing the creation of new maps for `Headers` and utilizing `setHeaders` when available (see the sketch after this comment).
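A condensed sketch of that three-way strategy (assumed shapes only: `ServerResponse.setHeaders` landed around Node.js 18.15 and `Headers.getSetCookie` around 18.14, so the sketch feature-detects both rather than assuming them):

```ts
import type { ServerResponse } from 'node:http';

function forwardResponseHeaders(
  serverResponse: ServerResponse,
  status: number,
  headers: Headers | Record<string, string | string[]>,
): void {
  if (!(headers instanceof Headers)) {
    // Fast path: the original init object can be handed to writeHead as-is.
    serverResponse.writeHead(status, headers);
    return;
  }
  if (typeof serverResponse.setHeaders === 'function') {
    // Node.js 18+ can consume a whole Headers object in one call.
    serverResponse.setHeaders(headers);
  } else {
    for (const [name, value] of headers) {
      if (name === 'set-cookie' && typeof headers.getSetCookie === 'function') {
        // set-cookie values must stay separate; never join them into one header.
        serverResponse.setHeader(name, headers.getSetCookie());
      } else {
        serverResponse.setHeader(name, value);
      }
    }
  }
  serverResponse.writeHead(status);
}
```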
- `359-366`: **Great optimization for string body handling.** This addition avoids unnecessary buffer creation for string bodies by writing them directly to the response stream. This is an excellent performance improvement, especially for API responses that frequently return JSON or text content.
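The string fast path itself is tiny; conceptually it is just the difference below (a sketch, the real code additionally routes through helpers such as `handleMaybePromise`):

```ts
import type { ServerResponse } from 'node:http';

// Before (conceptually): every body went through a Buffer / stream pipeline,
//   serverResponse.end(Buffer.from(bodyText));
// After: strings are handed to Node directly and encoded during the write.
function endWithStringBody(serverResponse: ServerResponse, bodyText: string): void {
  serverResponse.end(bodyText);
}
```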
- `645-647`: **Good utility function implementation.** The `isArray` utility function uses feature detection rather than `Array.isArray` to check for array-like objects. This is consistent with the optimization approach in the PR and allows for more flexible handling of array-like structures.

packages/node-fetch/src/Body.ts (5)
- `48-48`: **Good refactoring of signal handling.** Changing the `signal` property from the public `signal` to the private `_signal` aligns with best practices for encapsulation. Making it optional in the interface allows for more flexible usage while maintaining compatibility. Also applies to: 55-55, 61-61
- `202-208`: **Excellent optimization for string body handling in the `blob` method.** This enhancement efficiently handles string body types by:
  - Setting the text directly without unnecessary conversions
  - Creating a PonyfillBlob with the correct type and size
  - Avoiding the overhead of buffer creation until actually needed

  This change aligns with the PR objectives of improving performance by avoiding unnecessary object creation.
- `359-369`: **Improved buffer handling for string bodies.** This optimization avoids redundant processing when text content is already available, either directly or via the bodyType. The implementation efficiently creates buffers only when needed, reducing memory usage and improving performance.
- `468-478`: **Excellent optimization for string body processing.** The changes to `processBodyInit` improve performance by:
  - Computing the content length directly using `Buffer.byteLength`
  - Creating buffers on demand in the `bodyFactory` function rather than eagerly
  - Using `Readable.from` with a buffer created only when needed

  This deferred buffer creation strategy is a key performance optimization, especially for string bodies, which are common in web applications (see the sketch after this comment).
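In isolation, the deferred-buffer idea for string bodies looks roughly like this (illustrative shape, not the exact `processBodyInit` signature):

```ts
import { Readable } from 'node:stream';

function processStringBodyInit(bodyInit: string) {
  return {
    // Cheap to compute up front, so Content-Length can be set without allocating a Buffer.
    contentLength: Buffer.byteLength(bodyInit),
    contentType: 'text/plain;charset=UTF-8',
    // The Buffer (and the Readable wrapping it) is only created when the body
    // is actually consumed.
    bodyFactory: () => Readable.from(Buffer.from(bodyInit)),
  };
}
```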
- `275-276`: **Signal handling properly updated in form data processing.** The reference to `_signal` is correctly updated in the form data parsing logic, maintaining the functionality while using the new private property.
Actionable comments posted: 0
🧹 Nitpick comments (3)
packages/server/src/utils.ts (1)
- `650-652`: **Consider using additional checks for arrays.** The `isArray` utility function checks multiple array properties, but it might have false positives for array-like objects that aren't actual arrays. Consider adding more specific checks or using the native `Array.isArray` when possible:

```diff
 export function isArray<T>(value: any): value is T[] {
-  return value?.length != null && value?.map && value?.slice && value?.splice;
+  return value?.length != null && typeof value?.map === 'function' &&
+    typeof value?.slice === 'function' && typeof value?.splice === 'function';
 }
```

packages/server/test/server.bench.ts (2)
- `26-49`: **Effective comparative benchmarking setup.** This benchmark effectively compares the performance of the native Node.js HTTP server and the whatwg-node/server adapter by:
  - Setting up equivalent servers with the same functionality
  - Using dynamic port allocation for both servers (good practice)
  - Making equivalent requests to both servers
  - Using consistent benchmark configuration

  This will provide valuable performance data to validate the optimizations mentioned in the PR description.

  Consider adding cleanup code to close the servers after the benchmarks complete:

```diff
 describe('Simple JSON Response', () => {
   const simpleNodeServer = createServer((_req, res) => {
     res.writeHead(200, { 'Content-Type': 'application/json' });
     res.end(JSON.stringify({ hello: 'world' }));
   });
   const simpleNodePort = (simpleNodeServer.listen(0).address() as AddressInfo).port;
+
+  // Clean up server after tests
+  afterAll(() => {
+    simpleNodeServer.close();
+  });

   bench(
     'node:http',
     () => fetch(`http://localhost:${simpleNodePort}`).then(res => res.json()),
     benchConfig,
   );

   const whatwgNodeServer = createServer(createServerAdapter(() => Response.json({ hello: 'world' })));
   const whatwgNodePort = (whatwgNodeServer.listen(0).address() as AddressInfo).port;
+
+  // Clean up server after tests
+  afterAll(() => {
+    whatwgNodeServer.close();
+  });

   bench(
     '@whatwg-node/server',
     () => fetch(`http://localhost:${whatwgNodePort}`).then(res => res.json()),
     benchConfig,
   );
 });
```
- `38-42`: **Consider refactoring the server setup for better readability.** The whatwg-node server setup combines server creation, listening, and port extraction in a single complex expression. Refactor to match the style of the native HTTP server setup:

```diff
- const whatwgNodeServer = (
-   createServer(createServerAdapter(() => Response.json({ hello: 'world' })))
-     .listen(0)
-     .address() as AddressInfo
- ).port;
+ const whatwgNodeServer = createServer(createServerAdapter(() => Response.json({ hello: 'world' })));
+ const whatwgNodePort = (whatwgNodeServer.listen(0).address() as AddressInfo).port;
```

Then update the fetch URL to use `whatwgNodePort` instead of `whatwgNodeServer`.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
⛔ Files ignored due to path filters (1)

- `yarn.lock` is excluded by `!**/yarn.lock`, `!**/*.lock`

📒 Files selected for processing (8)

- `package.json` (2 hunks)
- `packages/server/src/createServerAdapter.ts` (4 hunks)
- `packages/server/src/utils.ts` (3 hunks)
- `packages/server/test/server.bench.ts` (1 hunks)
- `tsconfig.build.json` (1 hunks)
- `tsconfig.json` (1 hunks)
- `vitest.config.ts` (1 hunks)
- `vitest.projects.ts` (1 hunks)
✅ Files skipped from review due to trivial changes (3)
- tsconfig.json
- tsconfig.build.json
- vitest.projects.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/server/src/createServerAdapter.ts
🧰 Additional context used
🧬 Code Graph Analysis (1)
- `packages/server/src/utils.ts` (3)
  - `packages/node-fetch/src/utils.ts` (2): `isArray` (94-96), `safeWrite` (85-92)
  - `packages/promise-helpers/src/index.ts` (1): `handleMaybePromise` (29-42)
  - `packages/node-fetch/src/AbortError.ts` (1): `reason` (13-15)
🪛 GitHub Check: lint
vitest.config.ts
[failure] 1-1:
'vite-tsconfig-paths' should be listed in the project's dependencies, not devDependencies
⏰ Context from checks skipped due to timeout of 90000ms (19)
- GitHub Check: unit / node 24
- GitHub Check: unit / node 23
- GitHub Check: unit / deno
- GitHub Check: unit / node 20
- GitHub Check: unit / bun
- GitHub Check: unit / node 18
- GitHub Check: e2e / cloudflare-modules
- GitHub Check: e2e / cloudflare-workers
- GitHub Check: server (undici)
- GitHub Check: server (ponyfill)
- GitHub Check: prettier
- GitHub Check: server (native)
- GitHub Check: alpha / snapshot
- GitHub Check: e2e / azure-function
- GitHub Check: lint
- GitHub Check: node-fetch (consumeBody)
- GitHub Check: esm
- GitHub Check: node-fetch (noConsumeBody)
- GitHub Check: e2e / aws-lambda
🔇 Additional comments (8)
packages/server/src/utils.ts (4)
- `118-121`: **Performance optimization for AbortController creation.** This change conditionally creates either a native AbortController or a custom implementation, based on the Request implementation. This is a good optimization that avoids creating unnecessary AbortController/AbortSignal objects when they're not needed.
- `321-362`: **Efficient header handling optimization.** This optimization intelligently reuses the existing headers object when possible instead of creating a new map. The implementation properly handles the special cases:
  - Direct use of `writeHead` with the raw `headersInit` object when safe
  - Using the `setHeaders` method when available in Node.js 18+
  - Falling back to manual header iteration with special handling for `set-cookie` headers

  This aligns with the PR objective to prevent creating a new map for `Headers` objects, improving performance for header-heavy applications.
- `364-371`: **Optimized string body handling.** This new branch efficiently handles string response bodies without creating unnecessary Buffer objects, which is a good performance optimization for text responses. The implementation leverages the `handleMaybePromise` helper for proper asynchronous flow control.
- `654-691`: **Well-designed custom AbortController implementation.** The `CustomAbortControllerSignal` class is well implemented, extending `EventTarget` and properly implementing both the `AbortSignal` and `AbortController` interfaces. The class correctly:
  - Tracks abort state and reason
  - Implements the `throwIfAborted` method
  - Provides a self-referential `signal` getter
  - Properly manages event listeners for the `onabort` handler

  This implementation supports the conditional AbortController creation strategy and allows for performance optimizations (see the sketch after this comment).
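For reference, a stripped-down sketch of the combined controller/signal pattern the review describes; the real class in `packages/server/src/utils.ts` will differ in details, and the cast in the `signal` getter only keeps the sketch short.

```ts
class CombinedAbortControllerSignal extends EventTarget implements AbortController {
  aborted = false;
  reason: unknown = undefined;
  private _onabort: ((ev: Event) => void) | null = null;

  // AbortController side: the signal is the object itself.
  get signal(): AbortSignal {
    return this as unknown as AbortSignal;
  }

  abort(reason?: unknown): void {
    if (this.aborted) {
      return;
    }
    this.aborted = true;
    this.reason = reason ?? new Error('This operation was aborted');
    this.dispatchEvent(new Event('abort'));
  }

  // AbortSignal side.
  throwIfAborted(): void {
    if (this.aborted) {
      throw this.reason;
    }
  }

  get onabort(): ((ev: Event) => void) | null {
    return this._onabort;
  }

  set onabort(handler: ((ev: Event) => void) | null) {
    if (this._onabort) {
      this.removeEventListener('abort', this._onabort);
    }
    this._onabort = handler;
    if (handler) {
      this.addEventListener('abort', handler);
    }
  }
}
```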
vitest.config.ts (1)

- `1-6`: ✅ Verification successful. **Good implementation of the Vitest configuration.** The configuration is well structured and correctly imports the necessary dependencies to enable TypeScript path resolution for Vitest tests. This ensures that TypeScript path aliases defined in tsconfig.json work correctly during testing.

🏁 Script executed:

```bash
#!/bin/bash
# Check if vite-tsconfig-paths is listed in dependencies or devDependencies in package.json
grep -E '"vite-tsconfig-paths":' package.json
```

Length of output: 83

**Vitest configuration verified.**

- Confirmed that "vite-tsconfig-paths" is declared in package.json (version 5.1.4).
- No further changes needed; the configuration is correct.
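The six lines under review presumably look roughly like the standard wiring below (a sketch inferred from the description, not a copy of the file):

```ts
import tsconfigPaths from 'vite-tsconfig-paths';
import { defineConfig } from 'vitest/config';

export default defineConfig({
  // Resolve tsconfig.json path aliases while running Vitest.
  plugins: [tsconfigPaths()],
});
```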
🧰 Tools
🪛 GitHub Check: lint
[failure] 1-1: 'vite-tsconfig-paths' should be listed in the project's dependencies, not devDependencies

package.json (2)
- `11-11`: **Good addition of a bench script.** Adding a benchmark script to run performance tests is a good practice for monitoring and improving code performance, which aligns well with the PR's performance optimization objectives.
- `73-75`: **Dependencies added properly to support benchmarking.** The added dependencies (updated TypeScript, vite-tsconfig-paths, and vitest) properly support the benchmarking infrastructure being introduced.
packages/server/test/server.bench.ts (1)

- `6-25`: **Well-structured benchmark configuration with environment awareness.** The benchmark configuration is well designed, with:
  - CI environment detection
  - Configurable parameters via environment variables
  - Reasonable defaults for both CI and local environments
  - A helper function to parse numeric environment variables

  This approach ensures the benchmarks will run appropriately in both local development and CI environments (a sketch of this kind of setup follows below).
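As a concrete picture of that setup, a sketch of an environment-aware bench configuration; the helper name and the environment variable names are assumptions, not the test file's actual identifiers.

```ts
function parseNumberFromEnv(name: string, defaultValue: number): number {
  const raw = process.env[name];
  const parsed = raw ? Number(raw) : NaN;
  return Number.isFinite(parsed) ? parsed : defaultValue;
}

const isCI = !!process.env.CI;

// Shorter runs locally, longer runs in CI; passed as the third argument to bench().
const benchConfig = {
  time: parseNumberFromEnv('BENCH_TIME', isCI ? 30_000 : 3_000),
  warmupTime: parseNumberFromEnv('BENCH_WARMUP_TIME', isCI ? 5_000 : 1_000),
};
```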
Actionable comments posted: 0
🧹 Nitpick comments (1)
packages/server/test/reproductions.spec.ts (1)
- `97-113`: **Improved error handling for JSON parsing failures.** The changes introduce proper error handling for the JSON parsing process in the test server adapter. This is a good improvement because it:
  - Prevents unhandled promise rejections when malformed JSON is encountered
  - Provides structured error responses that include detailed error information
  - Uses cleaner promise chain formatting for better readability

  Consider masking stack traces in error responses even in tests, as it's good practice to avoid exposing detailed implementation information:

```diff
 .catch(error =>
   Response.json({
     error: {
       name: error.name,
       message: error.message,
-      stack: error.stack,
+      // Removed stack for security best practices
     },
   }),
 ),
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (2)

- `packages/server/src/utils.ts` (3 hunks)
- `packages/server/test/reproductions.spec.ts` (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/server/src/utils.ts
⏰ Context from checks skipped due to timeout of 90000ms (13)
- GitHub Check: e2e / azure-function
- GitHub Check: e2e / aws-lambda
- GitHub Check: unit / deno
- GitHub Check: unit / node 24
- GitHub Check: unit / node 23
- GitHub Check: unit / node 20
- GitHub Check: unit / bun
- GitHub Check: unit / node 18
- GitHub Check: server (undici)
- GitHub Check: node-fetch (consumeBody)
- GitHub Check: server (ponyfill)
- GitHub Check: node-fetch (noConsumeBody)
- GitHub Check: server (native)
Force-pushed from `a77e698` to `1ce5f83`.
Summary of the optimizations in this PR:

- Do not create `AbortController` and `AbortSignal` with `new Request` if they are not needed, because creating them is expensive
- Avoid creating a new map for `Headers`, and try to re-use the init object for `Headers` for performance, together with a single-line `writeHead`
- Do not create a `Buffer` for `string` bodies, for performance
- Use `setHeaders`, which accepts `Headers` since Node 18, when `Headers` need to be forwarded to the Node response