This repository was archived by the owner on Aug 6, 2025. It is now read-only.

Unexpected Parameter Updates During Second-Stage Fine-Tuning #243

@BolinDeng98

Description


Bug

Thanks for this wonderful work. However, I noticed that during the second-stage fine-tuning of the decoder, if `torch.no_grad()` is not used to block gradient flow, the parameters of the encoder and entropy model also appear to be updated. Could you please confirm whether this is the intended behavior?
Looking forward to your response!
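For reference, one common way to ensure only the decoder is updated is to freeze the other modules with `requires_grad_(False)` and pass only the trainable parameters to the optimizer. Below is a minimal sketch with placeholder `nn.Linear` stand-ins for the encoder, entropy model, and decoder (the actual modules in this repository differ):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the real encoder, entropy model, and decoder.
encoder = nn.Linear(4, 4)
entropy_model = nn.Linear(4, 4)
decoder = nn.Linear(4, 4)

# Freeze everything except the decoder for second-stage fine-tuning.
for module in (encoder, entropy_model):
    for p in module.parameters():
        p.requires_grad_(False)

# Only pass parameters that still require gradients to the optimizer;
# otherwise momentum/weight-decay terms can still change frozen weights.
optimizer = torch.optim.SGD(
    [p for p in decoder.parameters() if p.requires_grad], lr=0.1
)

x = torch.randn(2, 4)
enc_before = encoder.weight.clone()
dec_before = decoder.weight.clone()

loss = decoder(entropy_model(encoder(x))).pow(2).mean()
optimizer.zero_grad()
loss.backward()
optimizer.step()

# The frozen encoder is untouched; the decoder has been updated.
assert torch.equal(encoder.weight, enc_before)
```

Unlike wrapping the forward pass in `torch.no_grad()` (which also prevents gradients from flowing back through the frozen part to anything trainable upstream), this approach simply stops the frozen parameters themselves from accumulating gradients or being stepped.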
