Commit cad24ce

cascade: remove dead weight init code (Comfy-Org#13026)
This weight init process is fully shadowed by the weight load and doesn't work in dynamic_vram, where the weight allocation is deferred.
1 parent 68d542c commit cad24ce
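A minimal sketch (not ComfyUI code) of why the removed init is dead weight: any values written by an `_basic_init`-style pass are overwritten as soon as the checkpoint's state_dict is loaded, so the Xavier initialization never survives to inference. The `nn.Linear` module and fake checkpoint below are illustrative assumptions.

```python
import torch
import torch.nn as nn

def _basic_init(module):
    # Same shape as the deleted helper: Xavier-init weights, zero biases.
    if isinstance(module, (nn.Linear, nn.Conv2d)):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.constant_(module.bias, 0)

model = nn.Linear(4, 4)
model.apply(_basic_init)  # the "dead" init pass

# Simulate loading a checkpoint: these tensors replace whatever init wrote.
checkpoint = {"weight": torch.ones(4, 4), "bias": torch.zeros(4)}
model.load_state_dict(checkpoint)

# The Xavier-initialized values are gone; only the loaded weights remain.
assert torch.equal(model.weight, torch.ones(4, 4))
```

Under deferred allocation (e.g. parameters created on the meta device and materialized only at load time), the init pass additionally has no real storage to write into, which is the second reason the commit message gives for removing it.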

1 file changed

Lines changed: 1 addition & 10 deletions

File tree

comfy/ldm/cascade/stage_a.py

@@ -136,16 +136,7 @@ def __init__(self, c, c_hidden):
             ops.Linear(c_hidden, c),
         )
 
-        self.gammas = nn.Parameter(torch.zeros(6), requires_grad=True)
-
-        # Init weights
-        def _basic_init(module):
-            if isinstance(module, nn.Linear) or isinstance(module, nn.Conv2d):
-                torch.nn.init.xavier_uniform_(module.weight)
-                if module.bias is not None:
-                    nn.init.constant_(module.bias, 0)
-
-        self.apply(_basic_init)
+        self.gammas = nn.Parameter(torch.zeros(6), requires_grad=False)
 
     def _norm(self, x, norm):
         return norm(x.permute(0, 2, 3, 1)).permute(0, 3, 1, 2)
