
Conversation

whoiskatrin (Contributor) commented Nov 27, 2025

The idea is to let users type more than one message at a time. The PR gives users two options for handling rapid messages: sequential processing and batch processing.

changeset-bot (bot) commented Nov 27, 2025

⚠️ No Changeset found

Latest commit: d7939b7

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


pkg-pr-new (bot) commented Nov 27, 2025


npm i https://pkg.pr.new/cloudflare/agents@685

commit: d7939b7

claude (bot) commented Nov 27, 2025

Claude Code Review

Summary

Solid implementation of chat message batching and typing indicators for AIChatAgent. The core logic is sound with good test coverage. Found one potential race condition that needs attention.

Issues

1. Race condition in batch mode processing (ai-chat-agent.ts:762-764)

When processing completes and finds a new pending request, there's a window where the timer could fire while the new request is being scheduled:

```ts
} finally {
  this._isProcessingChat = false;  // Processing lock released
  // ← Race window: timer could fire here before _scheduleProcessing() resets it
  if (this._pendingChatRequest) {
    this._scheduleProcessing();    // This cancels old timer and creates new one
  }
}
```

If a timer from before processing started fires during this window, _processPendingRequest() could be called twice concurrently since _isProcessingChat is false but the new timer hasn't been created yet.

Fix: Reschedule while still holding the lock, then release it:

```ts
} finally {
  // Check for new request before releasing lock
  if (this._pendingChatRequest) {
    this._scheduleProcessing();
  }
  this._isProcessingChat = false;
}
```
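
As a defensive complement, the timer callback itself can re-check the lock before doing any work, so a stray firing inside the race window becomes a no-op. A minimal sketch, assuming the private fields above and a hypothetical _handleChatRequest() entry point for the actual processing:

```ts
// Sketch only: _handleChatRequest is a hypothetical name standing in for the
// real processing step; the fields mirror those referenced in the review.
private async _processPendingRequest(): Promise<void> {
  // Bail out if another run is active or nothing is pending.
  if (this._isProcessingChat || !this._pendingChatRequest) return;
  this._isProcessingChat = true;
  const request = this._pendingChatRequest;
  this._pendingChatRequest = null;
  try {
    await this._handleChatRequest(request);
  } finally {
    // Reschedule anything queued while we were busy *before* releasing the
    // lock, mirroring the reordered finally block above.
    if (this._pendingChatRequest) {
      this._scheduleProcessing();
    }
    this._isProcessingChat = false;
  }
}
```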

2. Missing test coverage

Tests don't verify that sequential mode works correctly; all tests use batch mode. Add at least one test with chatProcessingMode = "sequential" to ensure both paths work.
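
Something along these lines would cover it. A sketch using vitest; createTestChatAgent, sendChatMessage, and processedRequests are hypothetical stand-ins for whatever helpers the existing suite uses:

```ts
import { describe, it, expect } from "vitest";

// Hypothetical harness helpers standing in for the suite's existing utilities
// that spin up an AIChatAgent instance and post user messages to it.
declare function createTestChatAgent(opts: {
  chatProcessingMode: "sequential" | "batch";
}): Promise<{ processedRequests: { text: string }[] }>;
declare function sendChatMessage(agent: unknown, text: string): Promise<void>;

describe("chatProcessingMode = 'sequential'", () => {
  it("processes each rapid message as its own request, in order", async () => {
    const agent = await createTestChatAgent({ chatProcessingMode: "sequential" });

    await sendChatMessage(agent, "first");
    await sendChatMessage(agent, "second");

    // Sequential mode should not coalesce the two messages into one batch.
    expect(agent.processedRequests.map((r) => r.text)).toEqual(["first", "second"]);
  });
});
```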

Minor observations

  • Good: Comprehensive test suite covers edge cases (cancellation, clearing, multiple connections, typing indicators)
  • Good: Proper cleanup on clear/cancel operations (ai-chat-agent.ts:356-395)
  • Good: Comment explaining DO eviction edge case (ai-chat-agent.ts:346-349)
  • The typing throttle (500 ms) in the React hook is reasonable, but consider making it configurable (see the sketch after this list)
  • Example client code properly uses onInputChange (examples/resumable-stream-chat/src/client.tsx:72)
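
A sketch of what a configurable throttle could look like from the caller's side; typingThrottleMs is a hypothetical option name, and the import path and agent prop type are assumptions:

```tsx
// Sketch only: `typingThrottleMs` is not part of the current API; it
// illustrates how the 500ms throttle could be exposed to callers.
import { useAgentChat } from "agents/ai-react";

export function ChatInput({ agent }: { agent: any }) {
  const { inputProps } = useAgentChat({
    agent,
    // Hypothetical knob: how often a typing indicator is sent while typing.
    typingThrottleMs: 500
  } as any);
  return <input {...inputProps} placeholder="Type a message" />;
}
```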

The race condition is the only blocking issue - fix the lock ordering and this is good to merge.

agents-git-bot (bot) pushed a commit to cloudflare/cloudflare-docs that referenced this pull request on Nov 27, 2025:
Documents new features from cloudflare/agents#685:
- Chat processing modes (sequential and batch) for AIChatAgent
- Typing indicator support via onInputChange and inputProps
- Configuration options: chatProcessingMode, chatIdleTimeout, chatTypingTimeout
- Updated useAgentChat API reference with new return values
- Added comprehensive examples for both processing modes

Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
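
For reference, a sketch of how the documented configuration options would look on an AIChatAgent subclass. Property names come from the commit summary above; the import path, units, example values, and the onChatMessage stub signature are assumptions:

```ts
import { AIChatAgent } from "agents/ai-chat-agent";

type Env = Record<string, unknown>;

export class BatchedChatAgent extends AIChatAgent<Env> {
  chatProcessingMode = "batch" as const; // coalesce rapid messages; "sequential" handles them one by one
  chatIdleTimeout = 2_000;   // ms of input silence before a batch is processed (unit assumed)
  chatTypingTimeout = 5_000; // ms a typing indicator keeps the batch window open (unit assumed)

  // Minimal stub; real subclasses stream a model response here.
  async onChatMessage() {
    return undefined;
  }
}
```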
agents-git-bot (bot) pushed a commit to cloudflare/cloudflare-docs that referenced this pull request on Nov 27, 2025:
This update documents new features from PR #685 (cloudflare/agents):
- Chat processing modes (sequential vs batch)
- Typing indicators for smart message batching
- New AIChatAgent properties: chatProcessingMode, chatIdleTimeout, chatTypingTimeout
- New useAgentChat returns: onInputChange, inputProps, sendTypingIndicator

The documentation includes:
- API reference updates for AIChatAgent class properties
- Examples showing sequential and batch processing modes
- React examples demonstrating typing indicator integration
- Use cases and best practices for each mode

These features enable better handling of conversational chat patterns where
users send multiple rapid messages.

Related PR: cloudflare/agents#685

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
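
And a sketch of the manual typing-indicator path via the new useAgentChat return. sendTypingIndicator's no-argument signature and the import path are assumptions based on the commit summary:

```tsx
import { useAgentChat } from "agents/ai-react";

// Sketch only: calls sendTypingIndicator on every keystroke instead of
// spreading inputProps; the hook is expected to throttle delivery
// (~500 ms per the review above).
export function Composer({ agent }: { agent: any }) {
  const { sendTypingIndicator } = useAgentChat({ agent });
  return (
    <textarea
      onChange={() => sendTypingIndicator()}
      placeholder="Type your message"
    />
  );
}
```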
whoiskatrin marked this pull request as ready for review on November 27, 2025, 18:14.