IncrementalClassifiers inside MultiHeadClassifiers are adapted on all experiences #1591

Closed
AlbinSou opened this issue Feb 10, 2024 · 2 comments · Fixed by #1600
Labels
bug Something isn't working

Comments

@AlbinSou
Collaborator

🐛 Describe the bug
MultiHeadClassifier contains a dictionary of IncrementalClassifiers. Both MultiHeadClassifier and IncrementalClassifier are DynamicModules, so they all get adapted through the following loop:

def avalanche_model_adaptation(model: nn.Module, experience: CLExperience):
    if isinstance(model, DistributedDataParallel):
        raise RuntimeError(
            "The model is wrapped in DistributedDataParallel. "
            "Please unwrap it before calling this method."
        )
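    # model.modules() walks the full module tree, so nested DynamicModules
    # (e.g. the IncrementalClassifiers held inside a MultiHeadClassifier)
    # are also visited and adapted here, whatever task they belong to.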
    for module in model.modules():
        if isinstance(module, DynamicModule):
            module.adaptation(experience)

However, each IncrementalClassifier inside the MultiHeadClassifier should only be adapted on experiences that contain the task it corresponds to, which is exactly how MultiHeadClassifier.adaptation already calls it.
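
For context, here is a minimal sketch of the per-task dispatch that MultiHeadClassifier.adaptation is expected to perform. This is not the actual Avalanche source; the classifiers dict, the task_labels attribute, and the import paths are assumptions made for illustration:

import torch.nn as nn
from avalanche.models import DynamicModule, IncrementalClassifier  # import paths assumed

class MultiHeadClassifierSketch(DynamicModule):
    def __init__(self, in_features: int):
        super().__init__()
        self.in_features = in_features
        self.classifiers = nn.ModuleDict()  # one IncrementalClassifier per task label

    def adaptation(self, experience):
        # Grow only the heads whose task label appears in this experience.
        for task_label in experience.task_labels:  # attribute name assumed
            key = str(task_label)
            if key not in self.classifiers:
                self.classifiers[key] = IncrementalClassifier(self.in_features)
            self.classifiers[key].adaptation(experience)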

One solution would be to add an "adaptable" option to IncrementalClassifier and turn it off when the classifiers are created inside a MultiHeadClassifier.
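
A rough sketch of that option, assuming a hypothetical adaptable flag and a hypothetical _grow helper (this is not the fix merged in #1600, just an illustration):

import torch.nn as nn
from avalanche.models import DynamicModule  # import path assumed

class AdaptableIncrementalClassifier(DynamicModule):  # simplified stand-in for the real class
    def __init__(self, in_features: int, initial_out_features: int = 2, adaptable: bool = True):
        super().__init__()
        self.adaptable = adaptable
        self.classifier = nn.Linear(in_features, initial_out_features)

    def adaptation(self, experience):
        # Reached through the generic avalanche_model_adaptation walk above;
        # do nothing when this head is owned by a MultiHeadClassifier.
        if self.adaptable:
            self._grow(experience)

    def _grow(self, experience):
        ...  # usual growth logic: enlarge self.classifier to cover the new classes

MultiHeadClassifier would then create its heads with adaptable=False and grow them itself (via _grow or an equivalent), only for the task present in the current experience.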

@AlbinSou AlbinSou added the bug Something isn't working label Feb 10, 2024
@AlbinSou AlbinSou self-assigned this Feb 10, 2024
@AntonioCarta
Collaborator

This is an issue indeed. I agree with your solution of disabling adaptation.

Another issue we have is that model adaptation must be called on each module separately. This is a frequent source of errors that I see in the GitHub discussions (calling model.adaptation instead of model_adaptation, or writing the for loop manually). Maybe the adaptation method should automatically call its children? I'm thinking of an API like this:

class DynamicModule:
    def adaptation(self, exp): ...            # called by users and Avalanche strategies; adapts children too
    def _module_adaptation(self, exp): ...    # adapts only this module
    def enable_adaptation(self, is_enabled): ...  # enable/disable adaptation

and maybe IncrementalClassifier could have a method grow_classifier(new_units) that the MultiHeadClassifier can call directly, avoiding the adaptation?
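
A possible reading of that API, sketched under the same assumptions (the recursion over direct children, the _adaptation_enabled flag, and the grow_classifier body are all illustrative, not the final design):

import torch
import torch.nn as nn

class DynamicModule(nn.Module):
    def __init__(self):
        super().__init__()
        self._adaptation_enabled = True

    def adaptation(self, exp):
        # Public entry point: adapt this module (unless disabled), then recurse
        # into direct DynamicModule children so users never write the for loop.
        if self._adaptation_enabled:
            self._module_adaptation(exp)
        for child in self.children():
            if isinstance(child, DynamicModule):
                child.adaptation(exp)

    def _module_adaptation(self, exp):
        pass  # per-module growth logic, overridden by subclasses

    def enable_adaptation(self, is_enabled: bool):
        self._adaptation_enabled = is_enabled

class IncrementalClassifier(DynamicModule):
    def __init__(self, in_features: int, initial_out_features: int = 2):
        super().__init__()
        self.classifier = nn.Linear(in_features, initial_out_features)

    def grow_classifier(self, new_units: int):
        # Enlarge the linear head by new_units outputs, keeping the old weights.
        old = self.classifier
        self.classifier = nn.Linear(old.in_features, old.out_features + new_units)
        with torch.no_grad():
            self.classifier.weight[: old.out_features] = old.weight
            self.classifier.bias[: old.out_features] = old.bias

With this, MultiHeadClassifier could call enable_adaptation(False) on each head it creates and grow the right head via grow_classifier, without the generic walk touching the other heads.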

@AlbinSou
Collaborator Author

Interesting. Yes, I agree that it would be less confusing and more usable "out of the box". I will try to do something like that.

@AlbinSou AlbinSou linked a pull request Feb 19, 2024 that will close this issue