Error in Invoking Universal Sentence Encoder SageMaker Endpoint #2374
Replies: 2 comments
-
Hi @singularity014, the ModelError you hit seems to be a TensorFlow-specific error, and I've found a potential solution for it: tensorflow/hub#463. Are you using a custom inference script? There are instructions on how to bring in custom modules towards the end of the Pre/Post-Processing section of the SageMaker TensorFlow Serving documentation.
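For illustration, a minimal sketch of what such a custom module could look like, assuming the SageMaker TensorFlow Serving container's convention of a code/inference.py with input_handler/output_handler hooks (a code/requirements.txt for extra pip dependencies can sit alongside it). The payload shape below is an assumption for illustration, not the poster's actual setup:

# code/inference.py -- hypothetical pre/post-processing handlers for the
# SageMaker TensorFlow Serving container (a sketch, not a verified fix).
import json

def input_handler(data, context):
    # Parse the incoming request body and wrap it in the JSON structure
    # that TensorFlow Serving's REST API expects.
    sentences = json.loads(data.read().decode('utf-8'))
    return json.dumps({'instances': sentences})

def output_handler(response, context):
    # Surface TensorFlow Serving errors; otherwise pass the response through.
    if response.status_code != 200:
        raise ValueError(response.content.decode('utf-8'))
    return response.content, context.accept_header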
-
Hi, I think it is because the sentencepiece and tensorflow-text dependencies are missing from the TensorFlow Hub container used by SageMaker.
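One way to see this locally, as a sketch: assuming tensorflow and tensorflow-text are installed in the notebook environment and the model archive has been extracted to a directory named universal-sentence-encoder-multilingual-large_3 (both assumptions), loading the SavedModel only works after importing tensorflow_text, which registers SentencepieceOp as a side effect; the serving container has no such registration, hence the "Op type not registered" error.

# Loading the multilingual USE SavedModel needs the custom ops that
# tensorflow_text registers on import; without that import,
# tf.saved_model.load fails with the same "SentencepieceOp" message.
import tensorflow as tf
import tensorflow_text  # noqa: F401  (imported only for its op registration)

model = tf.saved_model.load('universal-sentence-encoder-multilingual-large_3')
embeddings = model(['Winter is coming', 'Hold the door, Hodor'])
print(embeddings.shape)  # one embedding vector per input sentence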
-
Describe the bug
Hello,
I deployed a multilingual universal sentence encoder model using a SageMaker notebook instance.
The model was deployed successfully, but when I try to get predictions from the notebook instance, I get the following error:
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation: Received client error (404) from model with message "{ "error": "[Derived]{{function_node __inference_signature_wrapper_227768}} {{function_node __inference_signature_wrapper_227768}} Op type not registered 'SentencepieceOp' in binary running on model.aws.local. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.)
tf.contrib.resampler
should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.\n\t [[{{node StatefulPartitionedCall}}]]\n\t [[StatefulPartitionedCall]]" }". See https://eu-west-1.console.aws.amazon.com/cloudwatch/home?region=eu-west-1#logEventViewer:group=/aws/sagemaker/Endpoints/museA in account 647725471721 for more information.
To reproduce
from sagemaker import get_execution_role
import sagemaker
from sagemaker.tensorflow.serving import Model, Predictor

role = get_execution_role()

sagemaker_model = Model(
    model_data='s3://naister-platform-models/universal-sentence-encoder-multilingual-large_3.tar.gz',
    role=role,
    framework_version='2.0.0'
)

predictor = sagemaker_model.deploy(
    initial_instance_count=1,
    instance_type='ml.t2.medium',
    endpoint_name="museA"
)

input = ['Winter is coming', 'Hold the door, Hodor']
output = predictor.predict(input)
Expected behavior
Sentence Embeddings
System information
A description of your system. Please provide:
Any assistance with this problem would be appreciated.