Reduce memory retention in caches and dispose paths #1705

Draft

markwpearce wants to merge 1 commit into v1 from memory_performance


Conversation

Collaborator

@markwpearce markwpearce commented May 11, 2026

Summary

This PR reduces memory retention in a few long-lived caches and strengthens disposal cleanup paths.

The biggest functional change is adding a size cap to the standardized path cache so it cannot grow without bound during long-running language server or compiler sessions. It also clears cached data when BrsFile and Scope instances are disposed, which helps release references earlier and lowers overall memory pressure.

Changes

  • add a limit to the standardizePath cache and clear it when the limit is reached
  • invalidate BrsFile cached lookups during disposal
  • clear Scope cache state during disposal
  • unlink the Scope symbol table during disposal to reduce retained references
  • add regression tests covering cache bounds and disposal cleanup
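
The cache cap described in the first bullet can be sketched roughly as follows. This is a minimal illustration under assumed names and values (MAX_CACHE_SIZE, the normalization logic), not the PR's actual code; the real standardizePath does more work, but the bounding strategy of clearing the whole cache when the limit is hit is the one described above.

```typescript
// Hypothetical sketch of a size-capped memoization cache for path
// normalization. MAX_CACHE_SIZE and the regex are illustrative assumptions.
const MAX_CACHE_SIZE = 10_000;
const pathCache = new Map<string, string>();

function standardizePath(thePath: string): string {
    const cached = pathCache.get(thePath);
    if (cached !== undefined) {
        return cached;
    }
    // simplified normalization: collapse runs of forward/back slashes
    const result = thePath.replace(/[\/\\]+/g, '/');
    // when the cap is hit, clear the whole cache rather than evicting
    // individual entries; this keeps the hot path cheap at the cost of a
    // periodic full rebuild
    if (pathCache.size >= MAX_CACHE_SIZE) {
        pathCache.clear();
    }
    pathCache.set(thePath, result);
    return result;
}
```

Clearing everything on overflow trades occasional recomputation for zero per-entry bookkeeping, which matters because this function sits on a hot path.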

Why

Some of these caches can live for a long time in editor-driven workflows, especially in the language server. Without explicit cleanup or a cap, they can retain data longer than necessary and gradually increase memory usage over time.

This change keeps the existing behavior intact while putting better bounds around cache growth and ensuring disposable objects actually release their cached state.

Testing

  • added test coverage for standardizePath cache eviction behavior
  • added test coverage for BrsFile.dispose() clearing lookup caches
  • added test coverage for Scope.dispose() clearing cached state
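
The second bullet's test shape might look something like the sketch below. The class here is a hypothetical stand-in (the real specs exercise the actual BrsFile API with the project's test framework), but it shows the assertion being made: cached lookups are gone after dispose().

```typescript
// Hypothetical stand-in for BrsFile's cached-lookup invalidation.
class BrsFileLike {
    private _cachedLookups: Map<string, unknown> | undefined = new Map();
    public get cachedLookups(): Map<string, unknown> {
        // lazily recreated on demand if it was invalidated
        this._cachedLookups ??= new Map();
        return this._cachedLookups;
    }
    public get hasCachedLookups(): boolean {
        return this._cachedLookups !== undefined;
    }
    public dispose() {
        // drop the reference so cached AST-derived collections can be
        // garbage-collected once nothing else points at them
        this._cachedLookups = undefined;
    }
}
```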

Contributor

Copilot AI left a comment


Pull request overview

This PR reduces memory retention in long-lived BrighterScript sessions by placing bounds on a frequently-used path normalization cache and by more aggressively releasing cached data during disposal of core objects (scopes and files).

Changes:

  • Add a size limit to Util.standardizePath’s internal cache to prevent unbounded growth (clears the cache when the limit is reached).
  • Ensure Scope.dispose() releases cached data and unlinks symbol table relationships.
  • Ensure BrsFile.dispose() invalidates _cachedLookups so cached AST-derived collections don’t remain retained after disposal, with accompanying tests.

Reviewed changes

Copilot reviewed 6 out of 6 changed files in this pull request and generated no comments.

Summary per file:

  • src/util.ts: adds a bounded standardizePath cache with a helper method to enforce the limit
  • src/util.spec.ts: adds test coverage to verify cache bounding behavior
  • src/Scope.ts: clears the scope cache and unlinks symbol tables during dispose() to reduce retention
  • src/Scope.spec.ts: adds a test ensuring the scope cache is cleared on dispose
  • src/files/BrsFile.ts: invalidates _cachedLookups during file disposal to release cached lookup data
  • src/files/BrsFile.spec.ts: adds a test ensuring cached lookups are cleared on dispose

Member

@TwitchBronBron TwitchBronBron left a comment


Hmm. So this path caching was added to improve performance. I benchmarked it to verify that it was significantly faster to cache these lookups.

Adding a function call in the middle of this hot function might negate a significant amount of the benefit of caching in the first place.

Just curious, did you do any profiling to see how much memory this cache actually uses? Is the retention actually significant, or just a guess?


3 participants