**Describe the bug**

I'm passing a Keras Sequential model into `quantize_model()`, but it raises an error saying the model isn't a Sequential (or Functional) model.
**System information**

- Environment: Google Colab
- TensorFlow version (installed from source or binary): 2.17.0
- TensorFlow Model Optimization version (installed from source or binary): 0.8.0
- Python version: 3.10.12
**Describe the expected behavior**

`quantize_model()` should prepare the Keras Sequential model (built from scratch and imported via `load_model()`) for quantization-aware training.
**Describe the current behavior**

```
ValueError: `to_quantize` can only either be a keras Sequential or Functional model.
```
**Code to reproduce the issue**

```python
!pip install tensorflow_model_optimization

import pandas as pd
import numpy as np
import time
import tensorflow as tf
import os
import tempfile
import keras
import tensorflow_model_optimization as tfmot
from google.colab import drive
from tensorflow.keras.models import load_model

drive.mount('/content/drive')
%cd /content/drive/My Drive/CS528/HW3

# Load the previously trained Sequential model
model = load_model('/content/drive/My Drive/CS528/HW3/q1_model.keras')
quant_aware_model_tflite = '/content/drive/My Drive/CS528/HW3/s_mnist_quant_aware_training.tflite'

quantize_model = tfmot.quantization.keras.quantize_model
q_aware_model = quantize_model(model)  # raises the ValueError here
```
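In case it helps triage: I suspect this is a Keras 3 vs. legacy `tf.keras` mismatch, since TF 2.17 defaults to Keras 3 while tfmot still targets the legacy Keras 2 classes. A possible workaround sketch (an untested assumption on my part; it requires `pip install tf-keras` and the variable must be set before TensorFlow is first imported):

```python
# Possible workaround (assumption): force TensorFlow to use the legacy
# Keras 2 (`tf-keras`) implementation so tfmot's isinstance checks see the
# class they expect. Must run BEFORE TensorFlow is imported in the process.
import os

os.environ['TF_USE_LEGACY_KERAS'] = '1'

# After restarting the Colab runtime with this set, the original
# reproduction code would then run unchanged:
#   import tensorflow as tf
#   import tensorflow_model_optimization as tfmot
#   from tensorflow.keras.models import load_model
#   model = load_model('/content/drive/My Drive/CS528/HW3/q1_model.keras')
#   q_aware_model = tfmot.quantization.keras.quantize_model(model)
```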
**Additional context**

I've already run the same compound conditional (the Sequential/Functional check) on the imported model just before calling `quantize_model()`, and it passes as expected. Only when `quantize_model()` itself handles the model does the check decide the model isn't a `tf.keras.Sequential` object.
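To illustrate what I suspect is happening (no TensorFlow needed): two `Sequential` classes from different Keras packages fail each other's `isinstance` checks even though they share a name. The nested classes below are stand-ins I made up, not real Keras code:

```python
# Suspected cause, in miniature: the model is an instance of Keras 3's
# Sequential, while tfmot checks against the legacy tf.keras Sequential.
# Same class name, different class object, so isinstance() returns False.

class Keras3:            # stand-in for the `keras` (Keras 3) package
    class Sequential:
        pass

class LegacyTfKeras:     # stand-in for the legacy `tf.keras` package
    class Sequential:
        pass

model = Keras3.Sequential()

print(isinstance(model, Keras3.Sequential))         # True: my own check passes
print(isinstance(model, LegacyTfKeras.Sequential))  # False: the library's check fails
```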