
Subissue: Improve support for async command providers #167

Open
dlqqq opened this issue Feb 14, 2025 · 0 comments
dlqqq commented Feb 14, 2025

Problem

  • Handle the case when getChatCommands() and handleChatCommand() are "truly async" and take a long time to resolve. This is especially important for @file:, since that provider needs to talk to the ContentsManager.

Proposed Solution

  • Use Promise.all() to start async getChatCommands() tasks in parallel.
  • Show chat command suggestions as soon as they resolve, instead of waiting for every provider to finish before opening the chat commands menu.
  • Handle loading states in the UI.
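A minimal sketch of the proposal above. The interface and function names here (ChatCommand, CommandProvider, onResolve) are illustrative placeholders, not the actual framework API: each provider's getChatCommands() is started immediately, each result is surfaced to the menu as soon as its promise settles, and Promise.all() is used only to detect when every provider has finished (e.g. to clear a loading indicator).

```typescript
// Hypothetical shapes, standing in for the real chat commands framework types.
interface ChatCommand {
  name: string;
}

interface CommandProvider {
  getChatCommands(input: string): Promise<ChatCommand[]>;
}

// Start all providers concurrently; report each provider's suggestions as
// soon as they resolve instead of waiting for the slowest provider.
async function gatherCommands(
  providers: CommandProvider[],
  input: string,
  onResolve: (commands: ChatCommand[]) => void,
): Promise<void> {
  const tasks = providers.map(async (provider) => {
    try {
      const commands = await provider.getChatCommands(input);
      onResolve(commands); // update the menu incrementally
    } catch (err) {
      console.error("Command provider failed:", err);
      onResolve([]); // a failed provider just contributes no suggestions
    }
  });
  await Promise.all(tasks); // all providers settled; clear the loading state
}
```

With this shape, a fast provider's suggestions appear immediately even while a slow provider (such as one awaiting the ContentsManager) is still pending, and the UI's loading state only ends once the Promise.all() resolves.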

Additional context

Initial implementation of chat commands framework: #161.

@dlqqq dlqqq added the enhancement New feature or request label Feb 14, 2025
@dlqqq dlqqq changed the title Improve support for async command providers Subissue: Improve support for async command providers Feb 14, 2025
@dlqqq dlqqq moved this to Todo in Jupyter AI Feb 17, 2025
@dlqqq dlqqq added this to Jupyter AI Feb 17, 2025