Status: Open
Labels: bug (Something isn't working)
Description
System Info
transformers.js@main
Who can help?
It is mentioned that wav2vec2-bert is supported in transformers.js.
I tried converting wav2vec2-bert to ONNX by following this doc.
Information
- The official example scripts
- My own modified scripts

Tasks
- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
- My own task or dataset (give details below)
Reproduction (minimal, reproducible, runnable)
```shell
python -m scripts.convert --quantize --model_id Yehor/w2v-bert-uk-v2.1
```

```
(mypy) C:\Users\Nawaz-Server\Documents\ml\transformers.js>python -m scripts.convert --quantize --model_id Yehor/w2v-bert-uk-v2.1
config.json: 100%|██████████████████████████████████████████████████████████████████████████████████████████████████| 1.91k/1.91k [00:00<?, ?B/s]
C:\Users\Nawaz-Server\.conda\envs\mypy\Lib\site-packages\huggingface_hub\file_download.py:139: UserWarning: `huggingface_hub` cache-system uses symlinks by default to efficiently store duplicated files but your machine does not support them in C:\Users\Nawaz-Server\.cache\huggingface\hub\models--Yehor--w2v-bert-uk-v2.1. Caching files will still work but in a degraded version that might require more space on your disk. This warning can be disabled by setting the `HF_HUB_DISABLE_SYMLINKS_WARNING` environment variable. For more details, see https://huggingface.co/docs/huggingface_hub/how-to-cache#limitations.
To support symlinks on Windows, you either need to activate Developer Mode or to run Python as an administrator. In order to activate developer mode, see this article: https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development
  warnings.warn(message)
tokenizer_config.json: 100%|████████████████████████████████████████████████████████████████████████████████████████| 1.10k/1.10k [00:00<?, ?B/s]
vocab.json: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 441/441 [00:00<?, ?B/s]
added_tokens.json: 100%|██████████████████████████████████████████████████████████████████████████████████████| 30.0/30.0 [00:00<00:00, 29.9kB/s]
special_tokens_map.json: 100%|████████████████████████████████████████████████████████████████████████████████████████| 96.0/96.0 [00:00<?, ?B/s]
model.safetensors: 100%|████████████████████████████████████████████████████████████████████████████████████| 2.42G/2.42G [10:56<00:00, 3.69MB/s]
preprocessor_config.json: 100%|██████████████████████████████████████████████████████████████████████████████████| 275/275 [00:00<00:00, 275kB/s]
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\Nawaz-Server\Documents\ml\transformers.js\scripts\convert.py", line 462, in <module>
    main()
  File "C:\Users\Nawaz-Server\Documents\ml\transformers.js\scripts\convert.py", line 349, in main
    main_export(**export_kwargs)
  File "C:\Users\Nawaz-Server\.conda\envs\mypy\Lib\site-packages\optimum\exporters\onnx\__main__.py", line 373, in main_export
    onnx_export_from_model(
  File "C:\Users\Nawaz-Server\.conda\envs\mypy\Lib\site-packages\optimum\exporters\onnx\convert.py", line 1055, in onnx_export_from_model
    raise ValueError(
ValueError: Trying to export a wav2vec2-bert model, that is a custom or unsupported architecture, but no custom onnx configuration was passed as `custom_onnx_configs`. Please refer to https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model#custom-export-of-transformers-models for an example on how to export custom models. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the model type wav2vec2-bert to be supported natively in the ONNX export.
```
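The error message points to Optimum's guide on exporting custom architectures: the workaround is to pass a config object through the `custom_onnx_configs` argument of `main_export`. As a minimal sketch of what such a config needs to declare (the function and axis names below are illustrative, not Optimum's actual API), note that wav2vec2-bert consumes log-mel `input_features` rather than the raw `input_values` of plain wav2vec2, and that the batch and time axes must be marked dynamic:

```python
# Sketch: the dynamic-axes I/O spec a custom ONNX config for
# wav2vec2-bert would declare. With Optimum installed, this mapping
# would live in the `inputs`/`outputs` properties of an OnnxConfig
# subclass and be handed to main_export via custom_onnx_configs.
def wav2vec2_bert_io_spec():
    """Return (inputs, outputs) dynamic-axes mappings for ONNX export."""
    inputs = {
        # wav2vec2-bert takes mel-filterbank features, not raw waveforms;
        # batch and time dimensions vary per example, so both are dynamic.
        "input_features": {0: "batch_size", 1: "sequence_length"},
        "attention_mask": {0: "batch_size", 1: "sequence_length"},
    }
    outputs = {
        # The CTC head emits per-frame logits over the vocabulary.
        "logits": {0: "batch_size", 1: "sequence_length"},
    }
    return inputs, outputs

if __name__ == "__main__":
    inputs, outputs = wav2vec2_bert_io_spec()
    print(sorted(inputs))   # ['attention_mask', 'input_features']
    print(sorted(outputs))  # ['logits']
```

Until wav2vec2-bert is supported natively, wiring a mapping like this into a custom ONNX config (per the Optimum docs linked in the traceback) is the documented escape hatch.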
Expected behavior
ONNX conversion should succeed for wav2vec2-bert.