
make ai sdk native #698


Open · wants to merge 12 commits into main

Conversation

@sameelarif sameelarif commented Apr 23, 2025

why

  • users shouldn't have to copy/paste our aisdk examples

what changed

  • made aisdk a stagehand native client

test plan

  • evals


changeset-bot bot commented Apr 23, 2025

🦋 Changeset detected

Latest commit: 1848be5

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

  Name: @browserbasehq/stagehand
  Type: Patch

@sameelarif sameelarif requested a review from kamath April 23, 2025 18:33
@miguelg719 (Collaborator) commented Apr 23, 2025

I started looking into this yesterday. If we're going with aisdk, we should probably change our internal inference logic to adapt to generateText, generateObject, and maybe streamText (useful for operator), meaning we get rid of createChatCompletion and upgrade.
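As a sketch of what that adapter surface could look like (the `InferenceClient` interface and stub provider below are hypothetical stand-ins, not Stagehand's actual internals), the idea is that callers of createChatCompletion would instead target AI-SDK-shaped entry points:

```typescript
// Hypothetical sketch: a thin inference interface mirroring the AI SDK's
// generateText/generateObject call shapes. A real implementation would
// delegate to the ai package with a provider model instance.
interface InferenceClient {
  generateText(opts: { prompt: string }): Promise<{ text: string }>;
  generateObject<T>(opts: {
    prompt: string;
    parse: (raw: string) => T;
  }): Promise<{ object: T }>;
}

// Stub provider standing in for an AI-SDK-backed client, used here only
// to show the call sites; it echoes the prompt instead of calling a model.
const stubClient: InferenceClient = {
  async generateText({ prompt }) {
    return { text: `echo:${prompt}` };
  },
  async generateObject({ prompt, parse }) {
    return { object: parse(prompt) };
  },
};

async function demo(): Promise<string> {
  const { text } = await stubClient.generateText({ prompt: "extract the title" });
  return text;
}
```

With a surface like this, streamText could later be added as a third method for the operator use case without touching the call sites again.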

@miguelg719 miguelg719 marked this pull request as ready for review April 24, 2025 21:26
@greptile-apps bot left a comment

PR Summary

This PR integrates the AI SDK as a first-class LLM client in Stagehand, moving from an external implementation to a native one with enhanced logging and caching capabilities.

  • Added lib/llm/aisdk.ts implementing AISdkClient with support for multiple AI providers (OpenAI, Google, Anthropic, Groq, Cerebras)
  • Modified model naming convention to use format aisdk/provider/model (e.g. aisdk/openai/gpt-4o) for simplified integration
  • Moved AI SDK dependencies from devDependencies to optionalDependencies in package.json, allowing selective provider installation
  • Added support for provider-specific API keys in environment variables through lib/index.ts
  • Enhanced types/model.ts to support flexible model strings while maintaining type safety
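As a sketch of the aisdk/provider/model naming convention described above (the helper name is hypothetical, not the PR's actual parsing code), the format can be split like this:

```typescript
// Hypothetical helper illustrating the aisdk/provider/model convention;
// not the PR's implementation.
function parseModelName(
  name: string
): { provider: string; model: string } | null {
  // Expect "aisdk/<provider>/<model>"; the model part may itself contain "/".
  const match = /^aisdk\/([^/]+)\/(.+)$/.exec(name);
  return match ? { provider: match[1], model: match[2] } : null;
}
```

Here `parseModelName("aisdk/openai/gpt-4o")` yields `{ provider: "openai", model: "gpt-4o" }`, while a bare name like `"gpt-4o"` returns null and would fall through to the existing enum-based handling.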

10 file(s) reviewed, 8 comment(s)

@@ -34,14 +34,15 @@ export const AvailableModelSchema = z.enum([
   "gemini-2.5-pro-preview-03-25",
 ]);

-export type AvailableModel = z.infer<typeof AvailableModelSchema>;
+export type AvailableModel = z.infer<typeof AvailableModelSchema> | string;

style: Adding string to AvailableModel type weakens type safety. Consider using a branded type or maintaining an explicit list of supported models to prevent runtime errors from invalid model names.
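One hedged alternative along the lines of that suggestion, using a template-literal type instead of a bare `string` (provider names are assumptions taken from the PR summary, and `KnownModel` stands in for `z.infer<typeof AvailableModelSchema>`):

```typescript
// Hypothetical sketch: narrow the union so only "aisdk/<known provider>/<model>"
// strings are accepted alongside the existing enum, rather than any string.
type AisdkProvider = "openai" | "google" | "anthropic" | "groq" | "cerebras";
type AisdkModel = `aisdk/${AisdkProvider}/${string}`;
type KnownModel = "gpt-4o" | "gemini-2.5-pro-preview-03-25";
type AvailableModel = KnownModel | AisdkModel;

const ok: AvailableModel = "aisdk/openai/gpt-4o"; // compiles
// const bad: AvailableModel = "aisdk/foo/gpt-4o"; // rejected at compile time

// Runtime counterpart for model names arriving as plain strings:
function isAisdkModel(name: string): name is AisdkModel {
  const [prefix, provider] = name.split("/");
  return (
    prefix === "aisdk" &&
    ["openai", "google", "anthropic", "groq", "cerebras"].includes(provider)
  );
}
```

This keeps invalid provider segments out at compile time where the string is a literal, and `isAisdkModel` covers the dynamic case.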
