Extend model compatibility: Any of these options, please #513
Replies: 3 comments
-
Hey @kuriel-dev, thanks for submitting this issue.
-
I'm not trying to force anything — I just want to do my small part to contribute to this tool, which I've grown fond of and truly believe in. That's why I suggest a few alternatives. I understand that sometimes changes can mean extra work, but I believe that expanding compatibility could help this project grow in a positive way. There's a lot of effort and professionalism in this library, and like everyone else, I just want to see it thrive.
-
First of all, thanks for your input @kuriel-dev. It's very important for us that the community gives us feedback and feature requests, and we try to implement those as quickly as possible.
- MediaPipe Web compatibility
- Bundle size
- Converter tool
- PTE files
Thank you for your support 🫶🏻
-
Problem description
Any chance to implement any of these options?
1. Support the `.task` format from Google AI for Web, iOS and Android: https://mediapipe-studio.webapps.google.com/studio/demo/llm_inference. You can check this basic example for React Native here: https://github.com/cdiddy77/react-native-llm-mediapipe
2. A converter tool so the react-native-executorch community (dev users) can upload a `.safetensors` and transform it into a usable `.pte` model file, with tokenizers (or have react-native-executorch able to use `.safetensors` models directly).
3. More `.pte` models in your default `.pte` model imports, with tokenizers.
Proposed solution
Support the `.task` format from Google AI for Web, iOS and Android: https://mediapipe-studio.webapps.google.com/studio/demo/llm_inference. You can check a basic example for React Native here: https://github.com/cdiddy77/react-native-llm-mediapipe
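For context on the `.task`/MediaPipe route above: Google ships a Python converter in the `mediapipe` package that turns a checkpoint (including `.safetensors`) into the bundle the LLM Inference API loads. A rough sketch based on the public conversion guide; the model type, backend, and all paths below are placeholders:

```python
# Rough sketch of producing a MediaPipe LLM Inference bundle from a Hugging Face
# checkpoint, following the public conversion guide. Assumes `pip install mediapipe`
# and a model type the converter supports (Gemma 2B here); all paths are placeholders.
from mediapipe.tasks.python.genai import converter

config = converter.ConversionConfig(
    input_ckpt="gemma-2b-it/",            # folder with the .safetensors checkpoint
    ckpt_format="safetensors",
    model_type="GEMMA_2B",
    backend="gpu",                        # or "cpu"
    output_dir="build/intermediate/",
    combine_file_only=False,
    vocab_model_file="gemma-2b-it/tokenizer.model",
    output_tflite_file="build/gemma-2b-it-gpu-int4.bin",  # bundle loaded by the LLM Inference API
)
converter.convert_checkpoint(config)
```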
Alternative solutions
A converter tool so the react-native-executorch community (dev users) can upload a `.safetensors` and transform it into a usable `.pte` model file, with tokenizers. The default provided models are very limited and poorly trained.
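On the converter idea above: under the hood this would presumably be ExecuTorch's standard export path, i.e. load the `.safetensors` weights, capture the model with `torch.export`, lower it, and write a `.pte`. A minimal sketch with a toy module (the module, file names, and shapes are placeholders; real LLMs normally go through ExecuTorch's dedicated LLM export scripts, and the tokenizer has to be shipped alongside the `.pte`):

```python
# Minimal sketch of a .safetensors -> .pte conversion using ExecuTorch's standard
# export flow. The module, file names, and shapes are placeholders; it assumes the
# state-dict keys in the .safetensors file match the module being rebuilt.
import torch
from safetensors.torch import load_file
from torch.export import export
from executorch.exir import to_edge


class TinyModel(torch.nn.Module):
    """Stand-in architecture; a real converter would rebuild the uploaded model."""

    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(128, 128)

    def forward(self, x):
        return self.linear(x)


model = TinyModel().eval()
model.load_state_dict(load_file("weights.safetensors"))  # weights uploaded by the user

example_inputs = (torch.randn(1, 128),)
exported = export(model, example_inputs)          # torch.export graph capture
et_program = to_edge(exported).to_executorch()    # lower to the ExecuTorch runtime format

with open("model.pte", "wb") as f:                # file react-native-executorch would load
    f.write(et_program.buffer)
```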
Benefits to React Native ExecuTorch
Expansion: more models, a more popular tool, more compatibility.