
Incorrect reordering of inputs inside ModelTransformer._get_layers during pattern matching #1179

Open
@virtualphoton

Description


ModelTransformer._match_layer_with_inputs calls self._get_layers(input_layer_names). The order of input_layer_names is significant, i.e. _get_layers must return the layers in the same order as the names appear in input_layer_names.
Current implementation is:

  def _get_layers(self, layer_names):
    return [
        layer for layer in self._config['layers']
        if layer['config']['name'] in layer_names
    ]
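To see why this is wrong, note that the list comprehension iterates over self._config['layers'] in declaration order and merely filters it, ignoring the order of layer_names. A minimal sketch with a mocked config (the layer names and the layers list below are hypothetical, mimicking the structure of self._config['layers']):

```python
# Hypothetical model config: layers in declaration order.
layers = [
    {'config': {'name': 'input_a'}},
    {'config': {'name': 'dense'}},
    {'config': {'name': 'input_c'}},
]

def get_layers(layer_names):
    # Current implementation: filters layers in declaration order,
    # regardless of the order of layer_names.
    return [l for l in layers if l['config']['name'] in layer_names]

# Requesting input_c first still yields input_a first,
# because the filter walks the config list, not layer_names.
result = get_layers(['input_c', 'input_a'])
print([l['config']['name'] for l in result])  # ['input_a', 'input_c']
```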

In other words, when the first input is declared later than the second one, the result has the wrong order. A minimal model to reproduce the bug:

import tf_keras as K
import tf_keras.layers as L

a = K.Input(10)
b = L.Dense(10)(a)  # declared before the second input
c = K.Input(20)     # second model input, declared after b
m = K.Model([a, c], L.concatenate([c, b], -1))

Then quantize_model(m) yields the wrong input order for the concatenation operation.

My suggestion would be to replace it with something like:

  def _get_layers(self, layer_names):
    name_to_layer = {layer['config']['name']: layer for layer in self._config['layers']}
    return [name_to_layer[name] for name in layer_names]

which preserves the order of layer_names.
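A quick check of the proposed fix against the same kind of mocked config (layer names here are hypothetical) confirms that the requested order is now preserved:

```python
# Hypothetical model config: layers in declaration order.
layers_config = [
    {'config': {'name': 'input_a'}},
    {'config': {'name': 'dense'}},
    {'config': {'name': 'input_c'}},
]

def get_layers_fixed(layer_names):
    # Proposed fix: build a name -> layer lookup once, then emit
    # layers in exactly the order given by layer_names.
    name_to_layer = {l['config']['name']: l for l in layers_config}
    return [name_to_layer[name] for name in layer_names]

ordered = get_layers_fixed(['input_c', 'input_a'])
print([l['config']['name'] for l in ordered])  # ['input_c', 'input_a']
```

As a side effect, the fix would also raise a KeyError on an unknown layer name instead of silently dropping it, which is arguably the safer behavior.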

This also appears to be the root cause of #1061.
