Callback function is not being called for TrainableModels when using optimizers that don't inherit from SciPyOptimizer
Around two months ago, a commit named "Fix callback compatibility for trainable_model" introduced an if clause around the callback invocation in the _get_objective method, which checks whether the optimizer is a SciPyOptimizer instance before executing the callback function.
However, some of the optimizers in the repo inherit from Optimizer rather than SciPyOptimizer, so the if clause evaluates to false and the callback function is never executed.
ADAM, AQGD, GradientDescent, GSLS, SPSA and UMDA are not SciPyOptimizers, but the callback function should still be executed at each iteration for all of them.
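A minimal, self-contained sketch of the class-hierarchy problem (this is not the actual qiskit-machine-learning code; the class and function names below are stand-ins). An `isinstance(optimizer, SciPyOptimizer)` guard fires for SciPy-backed optimizers but silently skips the callback for optimizers that subclass `Optimizer` directly:

```python
class Optimizer:
    """Stand-in for the abstract Optimizer base class."""

class SciPyOptimizer(Optimizer):
    """Stand-in for the SciPy-backed optimizer wrapper."""

class COBYLA(SciPyOptimizer):
    """Stand-in for a SciPy-based optimizer."""

class SPSA(Optimizer):
    """Stand-in for SPSA, which inherits from Optimizer directly."""

def get_objective(optimizer, callback):
    """Mimics the guarded callback wiring described in the issue:
    the callback only runs for SciPyOptimizer instances."""
    calls = []

    def objective(weights):
        value = sum(w * w for w in weights)  # dummy loss
        if isinstance(optimizer, SciPyOptimizer):  # the problematic guard
            callback(weights, value)
            calls.append(value)
        return value

    return objective, calls

recorded = []
obj_scipy, calls_scipy = get_objective(COBYLA(), lambda w, v: recorded.append(v))
obj_spsa, calls_spsa = get_objective(SPSA(), lambda w, v: recorded.append(v))

obj_scipy([1.0, 2.0])  # callback fires: COBYLA is a SciPyOptimizer
obj_spsa([1.0, 2.0])   # callback silently skipped: SPSA is not
print(len(calls_scipy), len(calls_spsa))  # 1 0
```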
How can we reproduce the issue?
Train a VQC or any TrainableModel implementation
-- with any callback function
-- with any of the following optimizers: ADAM, AQGD, GradientDescent, GSLS, SPSA or UMDA
-- with any data, feature_map and ansatz
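A hedged reproduction sketch of the steps above, assuming qiskit-machine-learning is installed. Import paths and argument names follow the public API but may differ between versions, and the exact feature map, ansatz, and data are arbitrary choices:

```python
import numpy as np
from qiskit.circuit.library import ZZFeatureMap, RealAmplitudes
from qiskit_algorithms.optimizers import SPSA
from qiskit_machine_learning.algorithms import VQC

history = []

def callback(weights, obj_value):
    # Expected to run once per iteration; with the bug it never runs,
    # because SPSA does not inherit from SciPyOptimizer.
    history.append(obj_value)

vqc = VQC(
    feature_map=ZZFeatureMap(2),
    ansatz=RealAmplitudes(2, reps=1),
    optimizer=SPSA(maxiter=5),
    callback=callback,
)

X = np.array([[0.1, 0.2], [0.8, 0.9], [0.2, 0.1], [0.9, 0.8]])
y = np.array([0, 1, 0, 1])
vqc.fit(X, y)

print(len(history))  # 0 with the bug present; nonzero once fixed
```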