ENG-2264 Add in-memory oauth client caching #7204
base: main
Conversation
Greptile Overview
Greptile Summary
This PR implements in-memory caching for OAuth client details to reduce database queries during authentication. While the intent is good, the implementation has several critical issues that need to be addressed before merging.
Major Issues Found
- Critical: SQLAlchemy Detached Instance Error - Cached ClientDetail objects are mutable ORM instances that will become detached from their database sessions. When is_token_invalidated() tries to access the lazy-loaded client.user.password_reset_at relationship, it will fail with a DetachedInstanceError or return stale data
- Thread Safety - The global cache is not thread-safe. Concurrent reads/writes from multiple threads can cause race conditions
- Timezone Issues - Using naive datetime.now() instead of timezone-aware datetime.now(timezone.utc) can cause incorrect expiration calculations
- Multi-Worker Cache Inconsistency - Each worker process has its own in-memory cache, so invalidations won't propagate and different workers may serve stale data
Minor Issues
- Arbitrary eviction strategy instead of LRU
- No negative result caching (repeated lookups of non-existent clients hit DB)
- Missing test coverage for thread safety, detached instances, and timezone handling
- Tests lack autouse fixture for cache cleanup between test runs
Recommendations
- Eagerly load the user relationship before caching to prevent detached instance errors
- Add threading.Lock() for thread-safe cache operations
- Use timezone-aware datetimes throughout (a combined sketch of these fixes follows this list)
- Document multi-worker limitations or consider a Redis-based distributed cache
- Add comprehensive test coverage for edge cases
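As a rough, combined illustration of the first three recommendations (a minimal sketch only; the CacheEntry shape mirrors the PR's dataclass, but the class name, max size, and TTL default below are placeholders rather than the PR's actual code):

import threading
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Dict, Optional


@dataclass
class CacheEntry:
    client: object  # the PR stores ClientDetail instances here
    expires_at: datetime


class ThreadSafeTTLCache:
    """Sketch: TTL cache guarded by a lock, using timezone-aware datetimes."""

    def __init__(self, max_size: int = 100):
        self._cache: Dict[str, CacheEntry] = {}
        self._max_size = max_size
        self._lock = threading.Lock()  # serialize concurrent reads/writes

    def get(self, key: str) -> Optional[object]:
        with self._lock:
            entry = self._cache.get(key)
            if entry is None:
                return None
            # Timezone-aware "now" avoids naive/aware comparison errors
            if datetime.now(timezone.utc) >= entry.expires_at:
                self._cache.pop(key, None)
                return None
            return entry.client

    def set(self, key: str, client: object, ttl_seconds: int = 300) -> None:
        with self._lock:
            if len(self._cache) >= self._max_size and key not in self._cache:
                # Still evicts an arbitrary entry; see the LRU comment further down
                self._cache.pop(next(iter(self._cache)), None)
            self._cache[key] = CacheEntry(
                client=client,
                expires_at=datetime.now(timezone.utc) + timedelta(seconds=ttl_seconds),
            )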
Confidence Score: 1/5
- This PR has critical issues that will cause runtime errors in production
- Score of 1 reflects multiple critical bugs: SQLAlchemy detached instance errors that will break password reset invalidation checks, thread safety issues that can cause race conditions, and timezone-naive datetime usage that can cause incorrect cache expiration. These issues will manifest as runtime errors and security vulnerabilities in production
- Pay close attention to src/fides/api/oauth/client_cache.py (new caching implementation with thread safety and SQLAlchemy session issues) and src/fides/api/oauth/utils.py (will fail when accessing the cached client's lazy-loaded user relationship)
Important Files Changed
File Analysis
| Filename | Score | Overview |
|---|---|---|
| src/fides/api/oauth/client_cache.py | 1/5 | New in-memory OAuth client cache with critical issues: missing timezone awareness in datetime operations, no thread safety for concurrent access, caching of mutable SQLAlchemy ORM objects that can become detached, arbitrary eviction strategy, and multi-worker cache inconsistency concerns |
| src/fides/api/oauth/utils.py | 1/5 | Updated to use cached client lookup. Critical issue: accessing lazy-loaded client.user.password_reset_at relationship on cached clients will fail with DetachedInstanceError or return stale data |
| src/fides/config/security_settings.py | 5/5 | Added oauth_client_cache_ttl_seconds configuration field with 300 second (5 minute) default. Implementation looks correct |
| tests/lib/test_client_cache.py | 2/5 | New test file with basic cache functionality tests. Missing critical test coverage: thread safety, SQLAlchemy detached instances, multi-worker scenarios, and missing autouse fixture for cache cleanup between tests |
src/fides/api/oauth/client_cache.py
Outdated
        return None

    # Check if entry has expired
    now = datetime.now()
Use datetime.now(timezone.utc) for timezone-aware datetime. Using naive datetime.now() without timezone can cause issues in distributed systems or when comparing with timezone-aware datetimes from the database.
Suggested change:
- now = datetime.now()
+ now = datetime.now(timezone.utc)
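For context on why this matters, a standalone illustration (not code from this PR): Python refuses to compare naive and timezone-aware datetimes, and naive timestamps silently drift in meaning if another component works in UTC.

from datetime import datetime, timezone

naive = datetime.now()              # no tzinfo attached
aware = datetime.now(timezone.utc)  # explicit UTC

try:
    print(naive < aware)            # mixing the two raises
except TypeError as exc:
    print(f"Cannot compare naive and aware datetimes: {exc}")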
src/fides/api/oauth/client_cache.py
Outdated
        if len(self._cache) >= self._max_size and client_id not in self._cache:
            self._evict_one()

        expires_at = datetime.now() + timedelta(seconds=ttl_seconds)
Use datetime.now(timezone.utc) for timezone-aware datetime. Consistency with timezone-aware datetimes is important for accurate expiration calculations.
Suggested change:
- expires_at = datetime.now() + timedelta(seconds=ttl_seconds)
+ expires_at = datetime.now(timezone.utc) + timedelta(seconds=ttl_seconds)
class ClientCache:
    """
    TTL cache for ClientDetail objects.

    This cache stores ClientDetail objects keyed by client_id and automatically
    considers entries expired after the configured TTL. The cache has a maximum
    size limit to prevent unbounded memory growth.
    """

    def __init__(self, max_size: int = MAX_CACHE_SIZE):
        self._cache: Dict[str, CacheEntry] = {}
        self._max_size = max_size
Thread safety concern: The global _client_cache instance is not thread-safe. Dictionary operations in Python are not atomic, and concurrent reads/writes from multiple threads (common in FastAPI with uvicorn workers) can cause race conditions. Consider using threading.Lock() for critical sections or using a thread-safe data structure.
Example fix:

import threading

class ClientCache:
    def __init__(self, max_size: int = MAX_CACHE_SIZE):
        self._cache: Dict[str, CacheEntry] = {}
        self._max_size = max_size
        self._lock = threading.Lock()

    def get(self, client_id: str) -> Optional[ClientDetail]:
        with self._lock:
            ...  # existing logic

src/fides/api/oauth/client_cache.py

    client: ClientDetail
    expires_at: datetime
Cached ClientDetail objects are mutable SQLAlchemy ORM instances with lazy-loaded relationships. If the cached object's client.user.password_reset_at is accessed after being cached, it could trigger a lazy-load database query using a stale or closed database session, causing errors. The cached object may also become detached from its session.
Consider either:
- Eagerly loading required relationships before caching (a sketch of this option follows the list)
- Serializing/deserializing the client data instead of caching ORM objects
- Documenting that cached objects should not access relationships
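A minimal sketch of the eager-loading option, assuming ClientDetail.user is a mapped SQLAlchemy relationship as this review describes; the helper name and the explicit expunge step are illustrative, not the PR's code:

from typing import Optional

from sqlalchemy.orm import Session, selectinload

from fides.api.models.client import ClientDetail


def load_client_for_caching(db: Session, client_id: str) -> Optional[ClientDetail]:
    """Load a client with its user relationship populated before caching it."""
    client = (
        db.query(ClientDetail)
        .options(selectinload(ClientDetail.user))  # populate user eagerly
        .filter(ClientDetail.id == client_id)
        .first()
    )
    if client is not None:
        # Detach explicitly so the cached object no longer points at this session;
        # client.user.password_reset_at is already loaded and won't lazy-load later
        db.expunge(client)
    return client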
    def _evict_one(self) -> None:
        """Evict an arbitrary entry from the cache"""
        try:
            # Pop an arbitrary key
            self._cache.pop(next(iter(self._cache)))
        except (StopIteration, KeyError):
            # Cache is empty or was modified concurrently
            pass
Eviction strategy is arbitrary and unpredictable. Using next(iter(self._cache)) doesn't guarantee removal of the oldest or least-recently-used entry. In Python 3.7+, dicts maintain insertion order, but this doesn't account for access patterns. Consider implementing a proper LRU (Least Recently Used) eviction policy for better cache performance.
Note: If this suggestion doesn't match your team's coding style, reply to this and let me know. I'll remember it for next time!
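If LRU is adopted, a minimal sketch of what that eviction policy could look like, built on collections.OrderedDict; the class and parameter names are placeholders, not the PR's implementation, and TTL handling would still sit on top of it:

from collections import OrderedDict
from typing import Optional


class LRUCache:
    """Sketch: least-recently-used eviction on top of an OrderedDict."""

    def __init__(self, max_size: int = 100):
        self._cache: "OrderedDict[str, object]" = OrderedDict()
        self._max_size = max_size

    def get(self, key: str) -> Optional[object]:
        if key not in self._cache:
            return None
        self._cache.move_to_end(key)  # mark as most recently used
        return self._cache[key]

    def set(self, key: str, value: object) -> None:
        if key in self._cache:
            self._cache.move_to_end(key)
        self._cache[key] = value
        if len(self._cache) > self._max_size:
            # Evict the least recently used entry (front of the OrderedDict)
            self._cache.popitem(last=False)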
from __future__ import annotations

from dataclasses import dataclass
Missing timezone import from datetime module. The suggestion to use datetime.now(timezone.utc) requires this import.
Suggested change:
- from dataclasses import dataclass
+ from datetime import datetime, timedelta, timezone
# Global cache instance
_client_cache = ClientCache()
Multi-worker concern: In production deployments with multiple uvicorn workers or multiple server instances, each process will have its own separate in-memory cache. This means:
- Cache invalidation won't propagate across workers
- Different workers may serve requests with different cached data
- When client permissions/scopes change, some requests may use stale data for up to 5 minutes depending on which worker handles them
The PR description mentions "we will handle cache invalidations in a follow up PR", but this architectural limitation should be documented or addressed with a distributed cache (Redis) instead.
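If the distributed route is taken, a rough sketch of a Redis-backed lookup using the redis-py client; the key prefix, the serialization to a plain dict, and the TTL value are illustrative assumptions, not part of this PR:

import json
from typing import Optional

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

CLIENT_CACHE_PREFIX = "oauth_client_cache:"  # hypothetical key namespace


def cache_client_fields(client_id: str, fields: dict, ttl_seconds: int = 300) -> None:
    """Store a plain dict of client fields (not the ORM object) with a TTL."""
    r.setex(CLIENT_CACHE_PREFIX + client_id, ttl_seconds, json.dumps(fields))


def get_cached_client_fields(client_id: str) -> Optional[dict]:
    """Return cached fields if present; Redis expires the key automatically."""
    raw = r.get(CLIENT_CACHE_PREFIX + client_id)
    return json.loads(raw) if raw is not None else None


def invalidate_client(client_id: str) -> None:
    """A delete here is visible to every worker because the store is shared."""
    r.delete(CLIENT_CACHE_PREFIX + client_id)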
    if client is not None:
        _client_cache.set(client_id, client)
Only caches found clients. If a client lookup returns None (client doesn't exist), the cache doesn't remember this. Every request for a non-existent client will hit the database. Consider caching negative results (with a shorter TTL) to prevent potential DoS attacks where an attacker repeatedly requests invalid client IDs.
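A minimal sketch of negative caching with a sentinel value and a shorter TTL; the function, sentinel, and TTL constant are hypothetical, and the cache is assumed to accept a per-entry TTL:

from typing import Callable, Optional

_NOT_FOUND = object()  # sentinel marking "we already looked and found nothing"
NEGATIVE_TTL_SECONDS = 30  # deliberately shorter than the 300s positive TTL


def lookup_with_negative_caching(
    cache, db_lookup: Callable[[str], Optional[object]], client_id: str
) -> Optional[object]:
    """Cache both hits and misses so repeated bad client_ids skip the database."""
    cached = cache.get(client_id)
    if cached is _NOT_FOUND:
        return None  # known miss, no DB query
    if cached is not None:
        return cached

    client = db_lookup(client_id)
    if client is None:
        cache.set(client_id, _NOT_FOUND, ttl_seconds=NEGATIVE_TTL_SECONDS)
        return None
    cache.set(client_id, client)
    return client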
| """Tests for the OAuth client cache functionality.""" | ||
|
|
||
| from datetime import datetime, timedelta | ||
|
|
||
| import pytest | ||
|
|
||
| from fides.api.models.client import ClientDetail | ||
| from fides.api.oauth.client_cache import ( | ||
| CacheEntry, | ||
| ClientCache, | ||
| _client_cache, | ||
| clear_client_cache, | ||
| get_cached_client, | ||
| ) | ||
| from fides.common.api.scope_registry import SCOPE_REGISTRY | ||
|
|
||
|
|
||
| class TestClientCache: | ||
| """Tests for the ClientCache class.""" | ||
|
|
||
| @pytest.fixture(scope="function") | ||
| def cache_enabled(self, config): | ||
| """Fixture to enable client cache with 300 second TTL.""" | ||
| original_value = config.security.oauth_client_cache_ttl_seconds | ||
| config.security.oauth_client_cache_ttl_seconds = 300 | ||
| yield config | ||
| config.security.oauth_client_cache_ttl_seconds = original_value | ||
|
|
||
| @pytest.fixture(scope="function") | ||
| def cache_disabled(self, config): | ||
| """Fixture to disable client cache (TTL=0).""" | ||
| original_value = config.security.oauth_client_cache_ttl_seconds | ||
| config.security.oauth_client_cache_ttl_seconds = 0 | ||
| yield config | ||
| config.security.oauth_client_cache_ttl_seconds = original_value | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_set_and_get(self, oauth_client): | ||
| """Test basic cache set and get operations.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Cache should be empty initially | ||
| assert cache.get(oauth_client.id) is None | ||
|
|
||
| # Set the client in cache | ||
| cache.set(oauth_client.id, oauth_client) | ||
|
|
||
| # Should be able to retrieve it | ||
| cached = cache.get(oauth_client.id) | ||
| assert cached is not None | ||
| assert cached.id == oauth_client.id | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_expiration(self, oauth_client): | ||
| """Test that cache entries expire after TTL.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Create a cache entry that has already expired | ||
| expired_time = datetime.now() - timedelta(seconds=60) # Expired 1 minute ago | ||
| cache._cache[oauth_client.id] = CacheEntry( | ||
| client=oauth_client, expires_at=expired_time | ||
| ) | ||
|
|
||
| # Entry should be expired | ||
| cached = cache.get(oauth_client.id) | ||
| assert cached is None | ||
|
|
||
| # Entry should have been removed from cache | ||
| assert oauth_client.id not in cache._cache | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_not_expired(self, oauth_client): | ||
| """Test that cache entries within TTL are returned.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Create a cache entry that expires in the future | ||
| future_time = datetime.now() + timedelta(seconds=300) # Expires in 5 minutes | ||
| cache._cache[oauth_client.id] = CacheEntry( | ||
| client=oauth_client, expires_at=future_time | ||
| ) | ||
|
|
||
| # Entry should still be valid | ||
| cached = cache.get(oauth_client.id) | ||
| assert cached is not None | ||
| assert cached.id == oauth_client.id | ||
|
|
||
| @pytest.mark.usefixtures("cache_disabled") | ||
| def test_cache_disabled_when_ttl_zero(self, oauth_client): | ||
| """Test that cache operations are no-ops when TTL is 0.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Set should do nothing | ||
| cache.set(oauth_client.id, oauth_client) | ||
| assert cache.size() == 0 | ||
|
|
||
| # Get should return None | ||
| cached = cache.get(oauth_client.id) | ||
| assert cached is None | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_delete(self, oauth_client): | ||
| """Test deleting entries from cache.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Add entry to cache | ||
| cache.set(oauth_client.id, oauth_client) | ||
| assert cache.get(oauth_client.id) is not None | ||
|
|
||
| # Delete the entry | ||
| result = cache.delete(oauth_client.id) | ||
| assert result is True | ||
|
|
||
| # Entry should be gone | ||
| assert cache.get(oauth_client.id) is None | ||
|
|
||
| # Deleting non-existent entry returns False | ||
| assert cache.delete("non-existent-id") is False | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_clear(self, oauth_client): | ||
| """Test clearing all cache entries.""" | ||
| cache = ClientCache() | ||
|
|
||
| # Add multiple entries | ||
| cache.set(oauth_client.id, oauth_client) | ||
| cache.set("another-id", oauth_client) | ||
| assert cache.size() == 2 | ||
|
|
||
| # Clear the cache | ||
| count = cache.clear() | ||
| assert count == 2 | ||
| assert cache.size() == 0 | ||
|
|
||
| @pytest.mark.usefixtures("cache_enabled") | ||
| def test_cache_eviction_at_max_size(self, oauth_client): | ||
| """Test that an entry is evicted when cache is full.""" | ||
| cache = ClientCache(max_size=2) | ||
|
|
||
| # Add two entries to fill the cache | ||
| cache.set("client1", oauth_client) | ||
| cache.set("client2", oauth_client) | ||
| assert cache.size() == 2 | ||
|
|
||
| # Add third entry - should evict one of the existing entries | ||
| cache.set("client3", oauth_client) | ||
|
|
||
| # Cache should still be at max size | ||
| assert cache.size() == 2 | ||
| # New entry should be present | ||
| assert "client3" in cache._cache | ||
|
|
||
|
|
||
| class TestGetCachedClient: | ||
| """Tests for the get_cached_client function.""" | ||
|
|
||
| @pytest.fixture(scope="function") | ||
| def cache_enabled(self, config): | ||
| """Fixture to enable client cache with 300 second TTL.""" | ||
| assert config.test_mode | ||
| original_value = config.security.oauth_client_cache_ttl_seconds | ||
| config.security.oauth_client_cache_ttl_seconds = 300 | ||
| yield config | ||
| config.security.oauth_client_cache_ttl_seconds = original_value | ||
|
|
||
| @pytest.fixture(scope="function") | ||
| def cache_disabled(self, config): | ||
| """Fixture to disable client cache (TTL=0).""" | ||
| assert config.test_mode | ||
| original_value = config.security.oauth_client_cache_ttl_seconds | ||
| config.security.oauth_client_cache_ttl_seconds = 0 | ||
| yield config | ||
| config.security.oauth_client_cache_ttl_seconds = original_value | ||
|
|
||
| def test_cache_miss_queries_database(self, db, oauth_client, cache_enabled): | ||
| """Test that a cache miss results in a database query.""" | ||
| # Clear the cache first | ||
| clear_client_cache() | ||
|
|
||
| client = get_cached_client( | ||
| db=db, | ||
| client_id=oauth_client.id, | ||
| config=cache_enabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| assert client is not None | ||
| assert client.id == oauth_client.id | ||
|
|
||
| # Client should now be in cache | ||
| cached = _client_cache.get(oauth_client.id) | ||
| assert cached is not None | ||
|
|
||
| def test_cache_hit_returns_cached_client(self, db, oauth_client, cache_enabled): | ||
| """Test that a cache hit returns the cached client.""" | ||
| clear_client_cache() | ||
|
|
||
| # First call should query DB and cache | ||
| client1 = get_cached_client( | ||
| db=db, | ||
| client_id=oauth_client.id, | ||
| config=cache_enabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| # Second call should return cached client | ||
| client2 = get_cached_client( | ||
| db=db, | ||
| client_id=oauth_client.id, | ||
| config=cache_enabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| # Both should be the same object (cached) | ||
| assert client1 is client2 | ||
|
|
||
| def test_root_client_not_cached(self, db, cache_enabled): | ||
| """Test that root client is not cached (it's already in-memory).""" | ||
| clear_client_cache() | ||
|
|
||
| client = get_cached_client( | ||
| db=db, | ||
| client_id=cache_enabled.security.oauth_root_client_id, | ||
| config=cache_enabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| assert client is not None | ||
|
|
||
| # Root client should NOT be in cache | ||
| cached = _client_cache.get(cache_enabled.security.oauth_root_client_id) | ||
| assert cached is None | ||
|
|
||
| def test_nonexistent_client_returns_none(self, db, cache_enabled): | ||
| """Test that looking up a non-existent client returns None.""" | ||
| clear_client_cache() | ||
|
|
||
| client = get_cached_client( | ||
| db=db, | ||
| client_id="non-existent-client-id", | ||
| config=cache_enabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| assert client is None | ||
|
|
||
| def test_cache_disabled_always_queries_db(self, db, oauth_client, cache_disabled): | ||
| """Test that with cache disabled, every call queries the database.""" | ||
| clear_client_cache() | ||
|
|
||
| client = get_cached_client( | ||
| db=db, | ||
| client_id=oauth_client.id, | ||
| config=cache_disabled, | ||
| scopes=SCOPE_REGISTRY, | ||
| roles=[], | ||
| ) | ||
|
|
||
| assert client is not None | ||
|
|
||
| # Nothing should be cached | ||
| assert _client_cache.size() == 0 | ||
|
|
||
|
|
||
| class TestGlobalCacheFunctions: | ||
| """Tests for the global cache helper functions.""" | ||
|
|
||
| @pytest.fixture(scope="function") | ||
| def cache_enabled(self, config): | ||
| """Fixture to enable client cache with 300 second TTL.""" | ||
| assert config.test_mode | ||
| original_value = config.security.oauth_client_cache_ttl_seconds | ||
| config.security.oauth_client_cache_ttl_seconds = 300 | ||
| yield config | ||
| config.security.oauth_client_cache_ttl_seconds = original_value | ||
|
|
||
| def test_clear_client_cache(self, db, oauth_client, cache_enabled): | ||
| """Test the clear_client_cache helper function.""" | ||
| _client_cache.set(oauth_client.id, oauth_client) | ||
|
|
||
| assert _client_cache.size() > 0 | ||
|
|
||
| count = clear_client_cache() | ||
| assert count > 0 | ||
| assert _client_cache.size() == 0 |
Missing test coverage for critical scenarios:
- Thread safety: No tests for concurrent access to the cache from multiple threads (a sketch follows this list)
- SQLAlchemy detached instances: No tests verifying that cached clients with user relationships can be accessed after caching (will fail when is_token_invalidated tries to access client.user.password_reset_at)
- Multi-worker scenarios: No tests for cache invalidation across multiple processes
- Negative caching: No tests for repeated lookups of non-existent clients
- Timezone handling: No tests verifying timezone-aware datetime behavior
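A minimal sketch of a thread-safety test, reusing the ClientCache class and the oauth_client and cache_enabled fixtures from this file; the thread and iteration counts are arbitrary and the test is illustrative rather than part of the PR:

import threading

import pytest

from fides.api.oauth.client_cache import ClientCache


@pytest.mark.usefixtures("cache_enabled")
def test_concurrent_cache_access(oauth_client):
    """Hammer the cache from several threads and collect any unexpected errors."""
    cache = ClientCache()
    errors = []

    def worker(thread_index: int) -> None:
        try:
            for i in range(200):
                key = f"client-{thread_index}-{i % 10}"
                cache.set(key, oauth_client)
                cache.get(key)
                cache.delete(key)
        except Exception as exc:  # collect rather than raise inside the thread
            errors.append(exc)

    threads = [threading.Thread(target=worker, args=(n,)) for n in range(8)]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()

    assert not errors, f"Concurrent cache access raised: {errors}"

Note that the cache_enabled fixtures in this file are defined on the test classes, so a standalone test like this would need the fixture hoisted to module or conftest scope.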
Missing global cache cleanup fixture. The global _client_cache instance is shared across all tests. Without an autouse fixture that clears the cache before/after each test, tests may interfere with each other, leading to flaky test results. Add:
@pytest.fixture(scope="function", autouse=True)
def clear_cache():
"""Clear the global cache before each test."""
clear_client_cache()
yield
clear_client_cache()
Additional Comments (1)
This needs to be addressed by either:
Ticket ENG-2264
Description Of Changes
This PR adds an in-memory cache of OAuth client details to avoid an extra query to the DB on every request. The cache TTL can be configured via the security setting oauth_client_cache_ttl_seconds (and disabled by setting it to 0). Because we default it to 5 minutes, we're OK with having stale info for up to that period of time. We will handle cache invalidations in a follow-up PR.
Code Changes
Steps to Confirm
Pre-Merge Checklist
- CHANGELOG.md updated
- main
- downgrade() migration is correct and works