Make export_llm as a separate binary #8432

Open
@kimishpatel

Description

🚀 The feature, motivation and pitch

Today users invoke `python -m examples.models.llama.export_llama ...` to export a model. It would be nice to have a set of binary utilities installed as part of `pip install executorch` that can be used for model export and lowering.
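One common way to ship such a utility with a pip package is a `console_scripts` entry point, so that `pip install executorch` drops an `export_llm` executable onto the user's PATH. A minimal sketch of what the entry point could look like (the module path `executorch.export.cli`, the flags, and the function body are all hypothetical, not the actual package layout):

```python
# Hypothetical sketch of a console-script entry point for export_llm.
# Module path, flags, and defaults are illustrative assumptions only.
import argparse


def main(argv=None):
    parser = argparse.ArgumentParser(
        prog="export_llm",
        description="Export and lower an LLM to an ExecuTorch program.",
    )
    parser.add_argument("--model", required=True,
                        help="Model name or checkpoint path")
    parser.add_argument("--output", default="model.pte",
                        help="Output .pte file path")
    args = parser.parse_args(argv)
    # ... call into the export/lowering pipeline here ...
    return args


# Registered in pyproject.toml so pip installs the binary, e.g.:
# [project.scripts]
# export_llm = "executorch.export.cli:main"
```

With an entry point like this, users would run `export_llm --model <name>` directly instead of invoking a module path under `examples/`.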

Alternatives

Continue to use `python -m examples.models.llama.export_llama ...`

Additional context

This may have some overlap with the export wizard @byjlw is working on; however, I am asking for this one to focus more on generative AI use cases.

RFC (Optional)

No response

cc @mergennachin @iseeyuan @lucylq @helunwencser @tarun292 @jackzhxng

Labels: `module: examples` (Issues related to demos under examples/), `triaged` (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
