I used the official latest ORT 1.20 pre-built package per https://onnxruntime.ai/docs/execution-providers/QNN-ExecutionProvider.html#pre-built-packages-windows-only
It appears to use QNN_2.27.0.240926142112_100894.
After building and running, some of the executions produce error messages or crash.

// Error 1
REM TODO Check for mobilenetv2-12_quant_shape.onnx_ctx.onnx
Error in cpuinfo: Unknown chip model name 'snapdragon (tm) 8cx gen 3 @ 3.40 GHz'.
Please add new Windows on Arm SoC/chip support to arm/windows/init.c!
unknown Qualcomm CPU part 0x1 ignored
...
2024-11-08 13:02:36.8328541 [E:onnxruntime:, onnx_ctx_model_helper.cc:162 onnxruntime::qnn::LoadQnnCtxFromOnnxGraph] Failed to load from EpContext model. qnn_backend_manager.cc:636 onnxruntime::qnn::QnnBackendManager::LoadCachedQnnContextFromBuffer Failed to get context binary info.
Exception in run_ort_qnn_ep: Failed to load from EpContext model. qnn_backend_manager.cc:636 onnxruntime::qnn::QnnBackendManager::LoadCachedQnnContextFromBuffer Failed to get context binary info.

// Error 2
REM run mobilenetv2-12_net_qnn_ctx.onnx (generated from native QNN) with QNN HTP backend
s:\src\onnxruntime-inference-examples\c_cxx\QNN_EP\mobilenetv2_classification\build\Release>qnn_ep_sample.exe --qnn mobilenetv2-12_net_qnn_ctx.onnx kitten_input_nhwc.raw
Exception in run_ort_qnn_ep:

The other combinations of runs work fine.
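If it helps triage, the same failing combination (EPContext model + QNN HTP backend) can be attempted from Python rather than through qnn_ep_sample.exe. This is only a sketch, not the sample's code: it assumes the ARM64 Windows onnxruntime-qnn package is installed, QnnHtp.dll is on the DLL search path, and reuses the model and input file names from the run above.

```python
# Sketch: load the EPContext model with the QNN HTP backend via the
# ONNX Runtime Python API. Assumption: onnxruntime-qnn (ARM64 Windows)
# is installed and QnnHtp.dll is discoverable.
import numpy as np

# QNN EP provider options: select the HTP (NPU) backend library.
provider_options = {"backend_path": "QnnHtp.dll"}

try:
    import onnxruntime as ort

    session = ort.InferenceSession(
        "mobilenetv2-12_net_qnn_ctx.onnx",
        providers=[("QNNExecutionProvider", provider_options)],
    )
    # The sample feeds a raw float32 NHWC image (1 x 224 x 224 x 3).
    data = np.fromfile("kitten_input_nhwc.raw", dtype=np.float32)
    data = data.reshape(1, 224, 224, 3)
    input_name = session.get_inputs()[0].name
    outputs = session.run(None, {input_name: data})
    print(outputs[0].shape)
except Exception as exc:
    # On the failing setup this surfaces the same EpContext load error.
    print(f"run failed: {exc}")
```

A Python reproduction would also show whether the failure is specific to the C++ sample or comes from the QNN EP itself.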