
Long training time and batch size #6

@Amelienew

Description


Thank you for the impressive work!

However, when I try to reproduce the results, I run into an out-of-memory issue. I am training on a single RTX 4090 GPU.

I reduced the dataset size to work around this, but another problem came up: training takes a very long time to finish. I would like to increase the batch_size, but the code seems to be written only for batch_size=1, and changing it makes the model stop working properly. For now I am considering gradient accumulation as a workaround, as in the sketch below.
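Here is a minimal sketch of the gradient-accumulation pattern I have in mind: keep batch_size=1 per forward pass, but only step the optimizer every few iterations so gradients are averaged over a larger effective batch. The model, data, and loss below are dummy placeholders, not the repo's actual code.

```python
import torch
import torch.nn as nn

# Dummy stand-ins for the repo's actual model and data; only the
# accumulation pattern below is the point of this sketch.
model = nn.Linear(16, 1)
data = [(torch.randn(1, 16), torch.randn(1, 1)) for _ in range(32)]
loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

ACCUM_STEPS = 8  # effective batch size = 8 while each forward pass keeps batch_size = 1

model.train()
optimizer.zero_grad()
for step, (x, y) in enumerate(data):
    loss = loss_fn(model(x), y) / ACCUM_STEPS  # scale so accumulated gradients average
    loss.backward()                            # gradients sum into the .grad buffers
    if (step + 1) % ACCUM_STEPS == 0:
        optimizer.step()
        optimizer.zero_grad()
```

This does not speed up each forward pass, but it should give the statistical effect of a larger batch without touching the batch_size=1 assumptions in the model code.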

P.S. Does the SE(3)-Transformer here use the faster implementation contributed by NVIDIA?
