
Fix bug in slicing of latent encodings in src/sharp/models/encoders/spn_encoder.py#82

Open
Riccardo-Rota wants to merge 1 commit into apple:main from Riccardo-Rota:patch-1

Conversation

@Riccardo-Rota

Latent encodings were sliced as `x_latent0_encodings[: batch_size * x0_tile_size]`; this is changed to `x_latent0_encodings[: x0_tile_size]` because `x0_tile_size` already accounts for the batch size. For example, with `batch_size=10`, the original `x_latent0_encodings` has shape `[350, 1024, 24, 24]` and `x0_tile_size=250`. Multiplying again by `batch_size=10` slices up to index 2500, which exceeds the first dimension (350) and makes the slicing ineffective. The same happens for `x_latent1_encodings`.
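The issue can be reproduced in isolation. The sketch below is a minimal, hypothetical stand-in for the tensors in `spn_encoder.py` (trailing dimensions shrunk from `[1024, 24, 24]` to keep the array small); the names `batch_size`, `x0_tile_size`, and `x_latent0_encodings` are taken from the PR description:

```python
import numpy as np

batch_size = 10
# x0_tile_size already includes the batch dimension (e.g. 25 tiles * 10 = 250).
x0_tile_size = 250

# Stand-in for the latent encodings; first dimension matches the PR's
# example shape [350, 1024, 24, 24], trailing dims reduced for brevity.
x_latent0_encodings = np.zeros((350, 4, 4, 4))

# Before the fix: the bound is 10 * 250 = 2500, beyond the first axis (350),
# so the slice silently returns the whole tensor -- a no-op.
before = x_latent0_encodings[: batch_size * x0_tile_size]
assert before.shape[0] == 350  # slicing was ineffective

# After the fix: slice up to x0_tile_size, keeping only the intended 250 entries.
after = x_latent0_encodings[:x0_tile_size]
assert after.shape[0] == 250
```

NumPy (like PyTorch) clamps out-of-range slice bounds rather than raising, which is why the original over-multiplied bound went unnoticed.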

