Reduce memory retention in caches and dispose paths #1705
Conversation
Pull request overview
This PR reduces memory retention in long-lived BrighterScript sessions by placing bounds on a frequently-used path normalization cache and by more aggressively releasing cached data during disposal of core objects (scopes and files).
Changes:
- Add a size limit to `Util.standardizePath`'s internal cache to prevent unbounded growth (clears the cache when the limit is reached).
- Ensure `Scope.dispose()` releases cached data and unlinks symbol table relationships.
- Ensure `BrsFile.dispose()` invalidates `_cachedLookups` so cached AST-derived collections don't remain retained after disposal, with accompanying tests.
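The "clear the cache when the limit is reached" strategy can be sketched as follows. This is a hedged illustration, not the actual `util.ts` code: the function name `standardizePathCached`, the cap value, and the simplified normalization are all assumptions.

```typescript
// Sketch of a size-capped memoization cache like the one described above.
// The real field names, cap, and normalization logic in util.ts may differ.
const MAX_CACHE_SIZE = 10_000; // assumed cap, not taken from the PR

const pathCache = new Map<string, string>();

function standardizePathCached(thePath: string): string {
    let result = pathCache.get(thePath);
    if (result === undefined) {
        // Clearing the whole map once the cap is reached keeps the hot
        // path to a single size check, at the cost of dropping all entries.
        if (pathCache.size >= MAX_CACHE_SIZE) {
            pathCache.clear();
        }
        result = thePath.replace(/\\+/g, '/'); // stand-in for the real normalization
        pathCache.set(thePath, result);
    }
    return result;
}
```

Clearing everything (rather than evicting one entry, LRU-style) trades a rare full-cache miss for zero per-hit bookkeeping on the hot path.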
Reviewed changes
Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| `src/util.ts` | Adds a bounded `standardizePath` cache with a helper method to enforce the limit. |
| `src/util.spec.ts` | Adds test coverage to verify cache bounding behavior. |
| `src/Scope.ts` | Clears the scope cache and unlinks symbol tables during `dispose()` to reduce retention. |
| `src/Scope.spec.ts` | Adds a test ensuring the scope cache is cleared on dispose. |
| `src/files/BrsFile.ts` | Invalidates `_cachedLookups` during file disposal to release cached lookup data. |
| `src/files/BrsFile.spec.ts` | Adds a test ensuring cached lookups are cleared on dispose. |
TwitchBronBron left a comment
Hmm. So this path caching was added to improve performance. I benchmarked it to verify that it was significantly faster to cache these lookups.
Adding a function call in the middle of this hot function might negate a significant amount of the benefit of caching in the first place.
Just curious, did you do any profiling to see how much memory this cache actually uses? Is it actually significant, or just a guess?
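The reviewer's question about actual memory use can be answered empirically. Below is one illustrative way to estimate how much heap a cache of N typical path entries retains; the function name, key shapes, and entry count are all assumptions, and results vary by runtime, string contents, and GC timing.

```typescript
import process from 'node:process';

// Rough empirical estimate of a path cache's heap footprint: measure
// heapUsed before and after filling a Map with N synthetic path entries.
function estimateCacheHeapBytes(entries: number): number {
    // Trigger a collection first if available (requires `node --expose-gc`).
    (globalThis as any).gc?.();
    const before = process.memoryUsage().heapUsed;

    const cache = new Map<string, string>();
    for (let i = 0; i < entries; i++) {
        const key = `pkg:/source/folder_${i % 50}/file_${i}.brs`;
        cache.set(key, key.toLowerCase());
    }

    const after = process.memoryUsage().heapUsed;
    // Read the cache afterwards so it stays referenced during measurement.
    return cache.size > 0 ? after - before : 0;
}
```

Running something like this against realistic project paths would show whether the cache's retained memory is large enough to justify the extra bookkeeping on the hot path.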
Summary
This PR reduces memory retention in a few long-lived caches and strengthens disposal cleanup paths.
The biggest functional change is adding a size cap to the standardized path cache so it cannot grow without bound during long-running language server or compiler sessions. It also clears cached data when
`BrsFile` and `Scope` instances are disposed, which helps release references earlier and lowers overall memory pressure.

Changes
- Add a size limit to the `standardizePath` cache and clear it when the limit is reached
- Clear `BrsFile` cached lookups during disposal
- Clear `Scope` cache state during disposal
- Unlink the `Scope` symbol table during disposal to reduce retained references

Why
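The dispose-time cleanup in the list above can be sketched like this. The class and property names (`DisposableScope`, `cache`, `symbolTable.parent`) are illustrative stand-ins, not the actual `Scope.ts`/`BrsFile.ts` members:

```typescript
// Illustrative sketch of dispose-time cleanup; real member names differ.
class SymbolTable {
    public parent: SymbolTable | undefined;
}

class DisposableScope {
    public cache = new Map<string, unknown>();
    public symbolTable = new SymbolTable();

    public dispose() {
        // Release cached data so it can be garbage-collected even if
        // something else still holds a reference to this scope.
        this.cache.clear();
        // Unlink the symbol table chain so parent and child scopes
        // do not keep each other alive through these references.
        this.symbolTable.parent = undefined;
    }
}
```

The point of unlinking (not just clearing) is that a single surviving reference to a disposed scope would otherwise keep its entire parent chain and cached collections reachable.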
Some of these caches can live for a long time in editor-driven workflows, especially in the language server. Without explicit cleanup or a cap, they can retain data longer than necessary and gradually increase memory usage over time.
This change keeps the existing behavior intact while putting better bounds around cache growth and ensuring disposable objects actually release their cached state.
Testing
- `standardizePath` cache eviction behavior
- `BrsFile.dispose()` clearing lookup caches
- `Scope.dispose()` clearing cached state