Support React Native #118

Open: wants to merge 181 commits into main
Conversation

hans00 (Contributor) commented May 19, 2023

This makes the library support React Native.

I also made an example for it:
https://github.com/hans00/react-native-transformers-example

On Android, some library patching is needed.

TODO:

  • Verify that the models work correctly
  • Research more efficient image processing
  • Merge v3
  • Verify that everything works on v3


xenova (Collaborator) commented May 19, 2023

Woah nice! There are a lot of people who are interested in doing stuff like this! Looking forward to reviewing this when you're ready! cc @pcuenca

hans00 and others added 5 commits May 19, 2023 14:52

jhen0409 commented May 20, 2023

I'm investigating a performance issue with Whisper tiny.en, and it looks like the performance is not as expected in the example.

I quickly added a log to onnxruntime-react-native/lib/backend.ts; this is the result:

# Encode result
 LOG  ONNX session run finished 767 ms

# Decode result
 LOG  ONNX session run finished 4518 ms
# ... 4 runs ...
 LOG  ONNX session run finished 4856 ms
 LOG  Time: 40016
 LOG  Result: {"text": " (buzzing)"}
const t0 = performance.now();
// this.#inferenceSession === NativeModules.Onnxruntime
const results: Binding.ReturnType = await this.#inferenceSession.run(this.#key, input, outputNames, options);
const output = this.decodeReturnType(results);
console.log('ONNX session run finished', performance.now() - t0);

Looking at the logs from the native side (I added a log to OnnxruntimeModule.java), the time is significantly less than what is measured on the JS side:

# Encode result
09:47:59.203 ONNXRUNTIME_RN run() is finished, time: 273 ms

# Decode result
09:48:00.280 ONNXRUNTIME_RN run() is finished, time: 339 ms
# ... 4 runs ...
09:48:23.807 ONNXRUNTIME_RN run() is finished, time: 541 ms

I think this problem may come from blocking caused by the native module passing too much data over the bridge. Asking (or helping) onnxruntime to migrate to a JSI module may solve this.

EDIT: It seems the logger has some bugs that led me to think the problem was in the native bridge. I added a timeout await around the decodeReturnType call and found that this function is the actual bottleneck.
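The isolation technique described above can be sketched as a small timing harness. This is a hypothetical helper, not the actual onnxruntime-react-native code: `runNative` and `decode` stand in for `this.#inferenceSession.run` and `decodeReturnType`, and the event-loop yield between the two measurements keeps buffered logs and pending bridge callbacks from being attributed to the wrong phase.

```typescript
// Hypothetical timing harness: measure the native run and the JS-side decode
// separately, so a slow decode cannot be mistaken for bridge overhead.
async function timePhases<TRaw, TOut>(
  runNative: () => Promise<TRaw>,
  decode: (raw: TRaw) => TOut,
): Promise<{ runMs: number; decodeMs: number; output: TOut }> {
  const t0 = Date.now();
  const raw = await runNative();                 // native bridge + inference
  const t1 = Date.now();
  // Yield to the event loop so pending callbacks flush before timing decode.
  await new Promise((resolve) => setTimeout(resolve, 0));
  const t2 = Date.now();
  const output = decode(raw);                    // pure-JS result decoding
  const t3 = Date.now();
  return { runMs: t1 - t0, decodeMs: t3 - t2, output };
}
```

With a harness like this, a large `decodeMs` next to a small `runMs` points at the JS decode step rather than the native bridge.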

hans00 (Contributor, Author) commented Jan 20, 2025

@xenova this is ready for review.

hans00 and others added 14 commits February 11, 2025 14:17

jpcoder2 commented
@hans00, I've seen a lot of commits from you on this lately. Thanks for that!

Were you able to get this working? Do you know if it will be merged into the main project soon?

hans00 (Contributor, Author) commented Mar 25, 2025

@xenova ping

hans00 added 2 commits April 3, 2025 16:00

xenova (Collaborator) commented Apr 13, 2025

Hi @hans00! 👋 Thanks so much for your work here - it will be very impactful.

I'm finally able to commit more time to this, as I now have the ability to do mobile development in my environment. My main concern with the PR in its current state is the additional dependencies that it introduces. Ideally, the only new dependency that would be introduced is onnxruntime-react-native. For things like image processing, if sharp can't handle it, then I think we should let the user implement the image reading themselves (at least to start off with).

What do you think?
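"Let the user implement the image reading themselves" could look roughly like the sketch below. This is an illustrative shape only, assuming a hypothetical `RawImageLike` interface and `fromRawPixels` helper (not the actual transformers.js API): the caller decodes the image, for example with a native module, and hands raw RGBA pixels to the pipeline, so the library needs no JS image codec dependency.

```typescript
// Hypothetical caller-supplied image input: the library accepts decoded
// pixels instead of decoding image files itself.
interface RawImageLike {
  data: Uint8ClampedArray; // interleaved RGBA bytes, row-major
  width: number;
  height: number;
  channels: number;
}

function fromRawPixels(
  data: Uint8ClampedArray,
  width: number,
  height: number,
): RawImageLike {
  if (data.length !== width * height * 4) {
    throw new Error(
      `expected ${width * height * 4} RGBA bytes, got ${data.length}`,
    );
  }
  return { data, width, height, channels: 4 };
}
```

This keeps the dependency surface at exactly onnxruntime-react-native, at the cost of pushing decoding onto the application.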

hans00 (Contributor, Author) commented Apr 14, 2025

For things like image processing, if sharp can't handle it, then I think we should let the user implement the image reading themselves (at least to start off with).

What do you think?

OK, I have removed the JS codecs for React Native.
For better performance, our project uses an OffscreenCanvas polyfill.

For FS access, native-universal-fs serves merely as a generic wrapper around the FS modules of React Native or Expo, without depending on those modules directly.
This allows for type checking, which is preferable to aliasing fs to a specific FS module and thereby supporting only that module.
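The wrapper pattern described above can be sketched as follows. The names (`FSBackend`, `setFSBackend`, `readFile`) are illustrative, not the actual native-universal-fs API: the idea is a small typed interface with a pluggable backend, so the library never imports a specific React Native or Expo FS module itself.

```typescript
// Illustrative FS wrapper: the app registers whichever FS implementation it
// has (react-native-fs, expo-file-system, ...), and the library calls only
// this typed interface.
interface FSBackend {
  readFile(path: string): Promise<Uint8Array>;
  writeFile(path: string, data: Uint8Array): Promise<void>;
}

let backend: FSBackend | null = null;

function setFSBackend(impl: FSBackend): void {
  backend = impl;
}

async function readFile(path: string): Promise<Uint8Array> {
  if (!backend) {
    throw new Error('No FS backend registered; call setFSBackend first');
  }
  return backend.readFile(path);
}
```

Compared with aliasing `fs` to one concrete module at bundle time, this keeps the dependency optional while still giving the compiler a type to check calls against.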
