Support React Native #118

Open · wants to merge 185 commits into main

Conversation

@hans00 (Contributor) commented May 19, 2023

Make it support React Native.

I also made an example for it:
https://github.com/hans00/react-native-transformers-example

On Android, some library patches are needed.

TODO:

  • Check that models work correctly
  • Research more efficient image processing
  • Merge v3
  • Check that everything works correctly on v3

@xenova (Collaborator) commented May 19, 2023

Woah nice! There are a lot of people who are interested in doing stuff like this! Looking forward to reviewing this when you're ready! cc @pcuenca

@jhen0409 commented May 20, 2023

I'm investigating a performance issue with Whisper tiny.en, and it looks like the performance in the example is not as expected.

I quickly added a log to onnxruntime-react-native/lib/backend.ts; this is the result:

# Encode result
 LOG  ONNX session run finished 767 ms

# Decode result
 LOG  ONNX session run finished 4518 ms
# ... 4 runs ...
 LOG  ONNX session run finished 4856 ms
 LOG  Time: 40016
 LOG  Result: {"text": " (buzzing)"}
The added log in backend.ts:
const t0 = performance.now()
// this.#inferenceSession === NativeModules.Onnxruntime
const results: Binding.ReturnType = await this.#inferenceSession.run(this.#key, input, outputNames, options);
const output = this.decodeReturnType(results);
console.log('ONNX session run finished', performance.now() - t0)

I also looked at the logs from the native side (log added to OnnxruntimeModule.java); the time there is significantly less than the time taken by the JS part:

# Encode result
09:47:59.203 ONNXRUNTIME_RN run() is finished, time: 273 ms

# Decode result
09:48:00.280 ONNXRUNTIME_RN run() is finished, time: 339 ms
# ... 4 runs ...
09:48:23.807 ONNXRUNTIME_RN run() is finished, time: 541 ms

I think this problem may come from blocking caused by the native module passing too much data. Asking the onnxruntime team for help migrating to a JSI module may solve this problem.

EDIT: It seems the logger has some bugs that led me to think the problem was in the native bridge. I added a timeout await around the decodeReturnType call and found that the issue is in this function.
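
For reference, a minimal sketch of this kind of timing split (illustrative only; the helper and its placement are assumptions, following the backend.ts snippet above):

async function timed<T>(label: string, fn: () => Promise<T> | T): Promise<T> {
  // Time one step, then yield to the event loop so the next phase
  // is not attributed to the previous one.
  const start = performance.now();
  const result = await fn();
  console.log(`${label}: ${(performance.now() - start).toFixed(1)} ms`);
  await new Promise((resolve) => setTimeout(resolve, 0));
  return result;
}

// Usage inside run() in backend.ts, following the snippet above:
// const results = await timed('native run + bridge', () =>
//   this.#inferenceSession.run(this.#key, input, outputNames, options));
// const output = await timed('decodeReturnType', () => this.decodeReturnType(results));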

@jpcoder2

@hans00, I've seen a lot of commits from you on this lately. Thanks for that!

Were you able to get this working? Do you know if it will be merged into the main project soon?

@hans00 (Contributor Author) commented Mar 25, 2025

@xenova ping

@xenova (Collaborator) commented Apr 13, 2025

Hi @hans00! 👋 Thanks so much for your work here - it will be very impactful.

I'm finally able to commit more time to this, as I now have the ability to do mobile development in my environment. My main concern with the PR in its current state is the additional dependencies that it introduces. Ideally, the only new dependency that would be introduced is onnxruntime-react-native. For things like image processing, if sharp can't handle it, then I think we should let the user implement the image reading themselves (at least to start off with).

What do you think?
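
(For illustration, "let the user implement the image reading themselves" might look roughly like the sketch below. This is an assumption about the eventual API shape, not part of the PR: it relies only on the RawImage class the library already exports, and the decoder is a placeholder the app would provide.)

import { pipeline, RawImage } from '@xenova/transformers';

// App-provided decoder returning raw RGBA pixels for a local file
// (hypothetical shape; any native decoder that yields pixels + size works).
type RgbaDecoder = (path: string) => Promise<{
  data: Uint8ClampedArray;
  width: number;
  height: number;
}>;

async function classifyLocalImage(path: string, decode: RgbaDecoder) {
  const { data, width, height } = await decode(path);
  const image = new RawImage(data, width, height, 4); // 4 channels = RGBA

  const classifier = await pipeline('image-classification');
  return classifier(image);
}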

@hans00 (Contributor Author) commented Apr 14, 2025

For things like image processing, if sharp can't handle it, then I think we should let the user implement the image reading themselves (at least to start off with).

What do you think?

OK, I have removed the JS codecs for RN.
For better performance, our project uses an OffscreenCanvas polyfill.

For FS access, native-universal-fs serves merely as a generic wrapper for the FS modules of React Native or Expo, without directly depending on those modules themselves.
This allows for type checking, which is preferable to aliasing fs to a specific FS module and thereby only supporting that module.
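
(For context, the "generic wrapper" idea is roughly the pattern sketched below; this is not the actual native-universal-fs code, just an illustration of resolving whichever FS module the app has installed without a hard dependency on either.)

function loadFs() {
  // Prefer Expo's module if present, otherwise fall back to react-native-fs.
  try { return require('expo-file-system'); } catch {}
  try { return require('react-native-fs'); } catch {}
  throw new Error('No supported file-system module found');
}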

@simonwh commented May 27, 2025

This is absolutely brilliant work @hans00, big props for your relentlessness over almost 2 years 🙇‍♂️ I hope this will get the attention it deserves from @xenova and the team and get merged into main soon!

@TowhidKashem

hey @hans00, sorry to bother you, but I've been trying without success to get your changes working for the past day and a half and thought I'd ask... I installed @fugood/transformers along with all the polyfills and such and followed your example repo.

I manually placed a model in my app's application folder:

https://huggingface.co/distilbert/distilbert-base-uncased-finetuned-sst-2-english/tree/main

The contents are in a folder called distilbert-base-uncased-finetuned-sst-2-english and match the repo above exactly.

When I try to use it like:

import { env, pipeline } from '@fugood/transformers';
import { cacheDirectory, documentDirectory } from 'expo-file-system';

env.localModelPath = documentDirectory;
env.cacheDir = cacheDirectory;
env.allowRemoteModels = false;

export const sentimentAnalysis = async (text: string) => {
  const classifier = await pipeline(
    'sentiment-analysis',
    'distilbert-base-uncased-finetuned-sst-2-english',
    {
      // progress_callback: (progress) => console.info('progress', progress)
    }
  );
  const result = await classifier(text); // [{'label': 'POSITIVE', 'score': 0.999817686}]
  return result;
};

export const sentimentAnalysisAsString = async (text: string) =>
  sentimentAnalysis(text).then((result) => JSON.stringify(result, null, 2));

It gives me these 3 warnings:

WARN  Unable to load from local path "file:///Users/redacted/Library/Developer/CoreSimulator/Devices/B1A4461B-D329-482B-BAE5-991F910EFCCD/data/Containers/Data/Application/78CFCE04-ED0E-448D-BA36-402E2C0C1613/Documents/distilbert-base-uncased-finetuned-sst-2-english/tokenizer.json": "TypeError: native_universal_fs__WEBPACK_IMPORTED_MODULE_2__.exists is not a function (it is undefined)"
WARN  Unable to load from local path "file:///Users/redacted/Library/Developer/CoreSimulator/Devices/B1A4461B-D329-482B-BAE5-991F910EFCCD/data/Containers/Data/Application/78CFCE04-ED0E-448D-BA36-402E2C0C1613/Documents/distilbert-base-uncased-finetuned-sst-2-english/tokenizer_config.json": "TypeError: native_universal_fs__WEBPACK_IMPORTED_MODULE_2__.exists is not a function (it is undefined)"
WARN  Unable to load from local path "file:///Users/redacted/Library/Developer/CoreSimulator/Devices/B1A4461B-D329-482B-BAE5-991F910EFCCD/data/Containers/Data/Application/78CFCE04-ED0E-448D-BA36-402E2C0C1613/Documents/distilbert-base-uncased-finetuned-sst-2-english/config.json": "TypeError: native_universal_fs__WEBPACK_IMPORTED_MODULE_2__.exists is not a function (it is undefined)"

It seems it might come from the native-universal-fs lib you're using. I tried replacing those instances with expo-file-system directly and using patch-package to update @fugood/transformers, but the errors remain.

Any idea what could be the fix?

@axe-me commented Jul 4, 2025

@TowhidKashem I ran into the same issue. I found it was because Metro reads the exports field of the package and then loads the browser build. You can use my fork for now: https://www.npmjs.com/package/@axe-dev/transformers (I can't guarantee that I won't add my own changes in there), or you can figure out some way to load the build that bundles the universal fs package.
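
(One possible workaround, sketched under the assumption of an Expo-style metro.config.js and not verified against this PR: either opt out of "exports" resolution so Metro falls back to the main/react-native fields, or keep it but prefer the react-native condition.)

metro.config.js:

const { getDefaultConfig } = require('expo/metro-config');

const config = getDefaultConfig(__dirname);

// Option A: disable package "exports" resolution entirely.
config.resolver.unstable_enablePackageExports = false;

// Option B: keep "exports" but prefer the react-native condition.
// config.resolver.unstable_conditionNames = ['react-native', 'require', 'default'];

module.exports = config;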

@VikingLichens

hey @hans00, sorry to bother you, but I've been trying without success to get your changes working for the past day and a half […] Any idea what could be the fix?

I had the same issue in my recent trials...

@TowhidKashem

hey @hans00, sorry to bother you, but I've been trying without success to get your changes working for the past day and a half […] Any idea what could be the fix?

I had the same issue in my recent trials...

I was able to get ONNX models working using just onnxruntime-react-native. That lib also gave the same error, which I was able to suppress using a stub:

onnx-web.stub.ts

export default {};
export const InferenceSession = {};
export const Tensor = {};
export const env = {};

babel.config.js:

module.exports = {
  // ...other config (presets, etc.)...
  plugins: [
    [
      'module-resolver',
      {
        root: ['./'],
        alias: {
          'onnxruntime-web': './onnx-web.stub.ts',
        },
      },
    ],
  ],
};
This gets rid of the webpack error and lets you use ONNX models, but the experience isn't nearly as easy as using transformers.js, so that would still be much preferred. Without transformers.js you have to figure out the tokenizer for every model yourself, which is a real pain.

@TowhidKashem

@TowhidKashem I ran into the same issue. […] You can use my fork for now: https://www.npmjs.com/package/@axe-dev/transformers

Thanks, is this the right repo:

https://github.com/axe-me/transformers.js/tree/main

and which branch should I be forking?

@axe-me commented Jul 5, 2025

@TowhidKashem same as the fugood one, the merge branch.
Or you can just install it: npm i @axe-dev/transformers

@hans00 (Contributor Author) commented Jul 5, 2025

It seems it might come from the native-universal-fs lib you're using. I tried replacing those instances with expo-file-system directly and using patch-package to update @fugood/transformers, but the errors remain.
Any idea what could be the fix?

I had the same issue in my recent trials...

This should be fixed now.
But it currently uses a webpack bundle to resolve the package alias, so it cannot be installed from source.
