Parametrize offset models #265
CeliaBenquet wants to merge 7 commits into AdaptiveMotorControlLab:main from
Conversation
stes left a comment
Good initiative, thanks --- but please add tests that confirm that the previously hardcoded models are 1:1 the same as the new ones. You can do this by:
- copying the old model code into the test folder as a reference implementation
- instantiating the models with the new code and checking that all relevant properties match 1:1
If this is hard to achieve, the alternative is to just let old and new models co-exist :)
e.g. use "varoffset-10-model" or so for the new ones?
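A test along these lines could look like the following sketch. The class names, attributes, and offset arithmetic here are hypothetical stand-ins for the actual CEBRA model classes, not the library's real API:

```python
# Hypothetical reference implementation: the old, hardcoded model.
# In the real test this would be a verbatim copy of the removed class.
class ReferenceOffset10Model:
    def __init__(self):
        self.left_offset = 5
        self.right_offset = 5
        self.receptive_field = self.left_offset + self.right_offset


# Hypothetical parametrized replacement, mirroring the OffsetNModel idea:
# one class, with the receptive field passed in at construction time.
class OffsetNModel:
    def __init__(self, receptive_field):
        self.receptive_field = receptive_field
        self.left_offset = receptive_field // 2
        self.right_offset = receptive_field - self.left_offset


def test_parametrized_matches_reference():
    ref = ReferenceOffset10Model()
    new = OffsetNModel(receptive_field=10)
    # Check that all relevant properties match 1:1.
    for attr in ("left_offset", "right_offset", "receptive_field"):
        assert getattr(ref, attr) == getattr(new, attr), attr


test_parametrized_matches_reference()
```

The same loop would be repeated (e.g. via pytest parametrization) for each of the receptive field sizes that previously had its own class.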
@CeliaBenquet let's wrap this one up?
Pull request overview
This PR refactors multiple offset model classes into a single parametrized OffsetNModel class that dynamically generates models with different receptive field sizes. This reduces code duplication by consolidating 7 separate model class definitions (Offset5Model, Offset10Model, Offset15Model, Offset20Model, Offset36Model, Offset40Model, Offset50Model) into one parametrized class that handles both odd and even receptive field sizes.
Changes:
- Introduced an OffsetNModel class with a parametrize decorator to generate models for receptive fields of 5, 10, 15, 18, 20, 31, 36, 40, and 50 samples
- Removed 7 redundant offset model class definitions
- Updated documentation to reflect the new available model names
Reviewed changes
Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| cebra/models/model.py | Replaced multiple offset model classes with a single parametrized OffsetNModel class and removed deprecated implementations |
| docs/source/usage.rst | Updated the example output to show the new parametrized model names available |
c771677 to 66c8872
I got this error, related to my changes, but it results from an existing latent bug in the @parametrize wrapper. Solved by removing the in-place update() and constructing a "fresh" kwargs per instance instead.
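The latent bug described here is the classic shared-mutable-dict pitfall: if the wrapper calls update() in place on a kwargs dict shared across all parametrized entries, every instance ends up seeing whichever values were written last. A minimal, self-contained reproduction of the bug and the fix (not CEBRA's actual code):

```python
base_kwargs = {"num_units": 32}

# BUG: every factory closes over and mutates the same dict, so the
# last factory to run leaks its values into all earlier results.
factories_buggy = {}
for rf in (5, 10):
    def factory(rf=rf):
        base_kwargs.update({"receptive_field": rf})
        return base_kwargs
    factories_buggy[f"offset{rf}"] = factory

a = factories_buggy["offset5"]()
b = factories_buggy["offset10"]()
# a and b are the SAME dict object; a["receptive_field"] is now 10.

# FIX: construct a fresh kwargs dict per instance instead.
factories_fixed = {}
for rf in (5, 10):
    def factory(rf=rf):
        kwargs = dict(base_kwargs)  # fresh copy, shared dict untouched
        kwargs["receptive_field"] = rf
        return kwargs
    factories_fixed[f"offset{rf}"] = factory

c = factories_fixed["offset5"]()
d = factories_fixed["offset10"]()
# c["receptive_field"] == 5 and d["receptive_field"] == 10, independently.
```

Copying the shared defaults before applying per-model overrides keeps each registered model's construction arguments isolated.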