Merged (25 commits)
4f25080
feat: Restrict decorating async methods with sync engines of Redis, M…
gencurrent Feb 15, 2026
fc4f216
feat: Add compatibility unit tests against the async mode keyword; In…
gencurrent Feb 15, 2026
3d4013f
fix: Add conftest.py and clients.py files in core testing packages
gencurrent Feb 15, 2026
a7aa7c3
fix: Disable engine disposal
gencurrent Feb 15, 2026
98a6a90
fix: Remove imports from conftest.py files, simplify the test files
gencurrent Feb 15, 2026
f21bec7
fix: Increase tests coverage
gencurrent Feb 15, 2026
bc69d2b
fix: Increase tests coverage
gencurrent Feb 15, 2026
7f94242
fix: Increase tests coverage
gencurrent Feb 15, 2026
7d33808
fix: Increase tests coverage
gencurrent Feb 15, 2026
ad0d1ed
feat: Update README.rst
gencurrent Feb 15, 2026
3f79f3a
feat: Restrict decorating async methods with sync engines of Redis, M…
gencurrent Feb 15, 2026
1f28e99
feat: Add compatibility unit tests against the async mode keyword; In…
gencurrent Feb 15, 2026
40afc3a
fix: Add conftest.py and clients.py files in core testing packages
gencurrent Feb 15, 2026
aae3fd2
fix: Disable engine disposal
gencurrent Feb 15, 2026
27fd5a6
fix: Remove imports from conftest.py files, simplify the test files
gencurrent Feb 15, 2026
afb2997
fix: Increase tests coverage
gencurrent Feb 15, 2026
9a49b9e
fix: Increase tests coverage
gencurrent Feb 15, 2026
19496bb
fix: Increase tests coverage
gencurrent Feb 15, 2026
92bea00
fix: Increase tests coverage
gencurrent Feb 15, 2026
8632099
feat: Update README.rst
gencurrent Feb 15, 2026
fbbbe9e
Merge branch 'feat/restrict-incogrous-decorations' of github.com:genc…
gencurrent Feb 17, 2026
1109e15
fix: Remove root tests conftest.py file
gencurrent Feb 17, 2026
479b65b
fix: Extra tests of `_is_async_redis_client`
gencurrent Feb 17, 2026
0099f8d
fix: Use the actual async engine from fixtures in SQL tests
gencurrent Feb 17, 2026
f39d8fd
fix: Add type hinting allowing Callable[[], "AsyncEngine"] as sql_eng…
gencurrent Feb 17, 2026
69 changes: 65 additions & 4 deletions README.rst
@@ -328,9 +328,12 @@ By default, ``cachier`` does not cache ``None`` values. You can override this be
Cachier Cores
=============


Pickle Core
-----------

**Sync/Async Support:** Both sync and async functions are supported with no additional setup. Async operations are internally delegated to the sync implementation, so no async-specific configuration is needed.

The default core for Cachier is pickle based, meaning each function will store its cache in a separate pickle file in the ``~/.cachier`` directory. Naturally, this kind of cache is both machine-specific and user-specific.
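The "delegated to sync" idea can be sketched with a minimal, stdlib-only memoizer. This is not cachier's actual implementation — ``naive_cache`` and its single-dict store are purely illustrative — but it shows why an async function needs no extra setup: the async wrapper simply awaits the result once and then serves repeat calls from the same store the sync path uses.

```python
import asyncio
import functools


def naive_cache(func):
    """Illustrative sketch: async calls share the same sync store."""
    store = {}  # one plain dict backs both sync and async paths

    if asyncio.iscoroutinefunction(func):
        @functools.wraps(func)
        async def awrapper(*args):
            if args not in store:
                store[args] = await func(*args)  # compute once
            return store[args]  # later calls hit the sync store
        return awrapper

    @functools.wraps(func)
    def wrapper(*args):
        if args not in store:
            store[args] = func(*args)
        return store[args]
    return wrapper


calls = []


@naive_cache
async def double(x):
    calls.append(x)  # track how often the body actually runs
    await asyncio.sleep(0)
    return x * 2


async def main():
    first = await double(3)
    second = await double(3)  # cache hit; the body is not re-run
    return first, second


print(asyncio.run(main()), calls)  # (6, 6) [3]
```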

You can configure ``cachier`` to use another directory by providing the ``cache_dir`` parameter with the path to that directory:
@@ -369,6 +372,12 @@ You can get the fully qualified path to the directory of cache files used by ``c

MongoDB Core
------------

**Sync/Async Support:** Both sync and async functions are supported, but the ``mongetter`` callable type must match the decorated function:

- **Sync functions** require a sync ``mongetter`` (a regular callable returning a ``pymongo.Collection``).
- **Async functions** require an async ``mongetter`` (a coroutine callable returning an async collection, e.g. via ``motor`` or ``pymongo.asynchronous``). A sync/async mismatch in either direction raises ``TypeError``.
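Cachier tells the two kinds of ``mongetter`` apart with ``inspect.iscoroutinefunction`` at decoration time. A minimal sketch with hypothetical getters (no real MongoDB connection is made; the bodies are stubs):

```python
import inspect


def sync_mongetter():
    """Hypothetical: would return a pymongo.Collection."""
    raise NotImplementedError


async def async_mongetter():
    """Hypothetical: would return an async (e.g. motor) collection."""
    raise NotImplementedError


# The same check cachier performs when the decorator is applied:
print(inspect.iscoroutinefunction(sync_mongetter))   # False
print(inspect.iscoroutinefunction(async_mongetter))  # True
```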

You can set a MongoDB-based cache by assigning ``mongetter`` with a callable that returns a ``pymongo.Collection`` object with writing permissions:

**Usage Example (MongoDB sync):**
@@ -404,8 +413,6 @@ You can set a MongoDB-based cache by assigning ``mongetter`` with a callable tha
await asyncio.sleep(0.01)
return x * 2

**Note:** An async ``mongetter`` callable is supported only for async cached functions.

This allows you to have a cross-machine, albeit slower, cache. This functionality requires the installation of the ``pymongo`` Python package.

In certain cases the MongoDB backend might leave a deadlock behind, blocking all subsequent requests from being processed. If you encounter this issue, supply the ``wait_for_calc_timeout`` with a reasonable number of seconds; calls will then wait at most this number of seconds before triggering a recalculation.
@@ -418,6 +425,8 @@ In certain cases the MongoDB backend might leave a deadlock behind, blocking all
Memory Core
-----------

**Sync/Async Support:** Both sync and async functions are supported with no additional setup. Async operations are internally delegated to the sync implementation, so no async-specific configuration is needed.

You can set an in-memory cache by assigning the ``backend`` parameter with ``'memory'``:

.. code-block:: python
@@ -429,6 +438,11 @@ Note, however, that ``cachier``'s in-memory core is simple, and has no monitorin
SQLAlchemy (SQL) Core
---------------------

**Sync/Async Support:** Both sync and async functions are supported, but the ``sql_engine`` type must match the decorated function:

- **Sync functions** require a sync ``Engine`` (or a connection string / callable that resolves to one).
- **Async functions** require a SQLAlchemy ``AsyncEngine`` (e.g. created with ``create_async_engine``). Passing a sync engine to an async function raises ``TypeError``, and passing an async engine to a sync function also raises ``TypeError``.
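In practice the difference often comes down to the driver named in the connection URL: ``create_engine`` on a plain URL yields a sync ``Engine``, while ``create_async_engine`` on a URL naming an async driver yields an ``AsyncEngine``. The URLs below are illustrative assumptions (the ``aiosqlite`` driver is one common choice, not a cachier requirement):

```python
# Hypothetical connection URLs; the async variant names an async driver after "+".
SYNC_URL = "sqlite:///cachier_cache.db"             # create_engine(...)       -> Engine
ASYNC_URL = "sqlite+aiosqlite:///cachier_cache.db"  # create_async_engine(...) -> AsyncEngine

dialect_and_driver = ASYNC_URL.split("://", 1)[0]
print(dialect_and_driver)  # sqlite+aiosqlite
```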

**Note:** The SQL core requires SQLAlchemy to be installed. It is not installed by default with cachier. To use the SQL backend, run::

pip install SQLAlchemy
@@ -476,6 +490,11 @@ Cachier supports a generic SQL backend via SQLAlchemy, allowing you to use SQLit
Redis Core
----------

**Sync/Async Support:** Both sync and async functions are supported, but the ``redis_client`` callable type must match the decorated function:

- **Sync functions** require a sync ``redis.Redis`` client or a sync callable returning one.
- **Async functions** require an async callable returning a ``redis.asyncio.Redis`` client, or such a client instance directly. A sync/async mismatch in either direction raises ``TypeError``.
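When an async function is given a client instance rather than a callable, cachier duck-types it by checking that its core hash-command methods are coroutine functions. The stub below is hypothetical (not part of cachier or redis-py) and only illustrates how that duck-typed check behaves:

```python
import inspect


class FakeAsyncRedis:
    """Stub whose hash-command methods are all coroutine functions."""

    async def hgetall(self, *args): ...
    async def hset(self, *args): ...
    async def keys(self, *args): ...
    async def delete(self, *args): ...
    async def hget(self, *args): ...


def is_async_redis_client(client):
    """Mirror of cachier's duck-typed _is_async_redis_client check."""
    if client is None:
        return False
    names = ("hgetall", "hset", "keys", "delete", "hget")
    return all(
        inspect.iscoroutinefunction(getattr(client, name, None))
        for name in names
    )


print(is_async_redis_client(FakeAsyncRedis()))  # True
print(is_async_redis_client(object()))          # False
```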

**Note:** The Redis core requires the redis package to be installed. It is not installed by default with cachier. To use the Redis backend, run::

pip install redis
@@ -546,8 +565,6 @@ Cachier supports Redis-based caching for high-performance scenarios. Redis provi

asyncio.run(main())

**Note:** An async ``redis_client`` callable is supported only for async cached functions.

**Configuration Options:**

- ``sql_engine``: SQLAlchemy connection string, Engine, or callable returning an Engine.
@@ -572,6 +589,50 @@ Cachier supports Redis-based caching for high-performance scenarios. Redis provi
- For best performance, ensure your DB supports row-level locking


Core Sync/Async Compatibility
------------------------------

The table below summarises sync and async function support across all cachier cores.
Cores marked as *delegated* run async operations on top of the sync implementation
(no event loop or async driver is required). Cores marked as *native* use dedicated
async drivers and require the client or engine type to match the decorated function.

.. list-table::
   :header-rows: 1
   :widths: 15 12 12 50

   * - Core
     - Sync
     - Async
     - Constraint
   * - **Pickle**
     - Yes
     - Yes (delegated)
     - None. No special configuration needed for async functions.
   * - **Memory**
     - Yes
     - Yes (delegated)
     - None. No special configuration needed for async functions.
   * - **MongoDB**
     - Yes
     - Yes (native)
     - ``mongetter`` must be a sync callable for sync functions and an async
       callable for async functions. A mismatch in either direction raises
       ``TypeError``.
   * - **SQL**
     - Yes
     - Yes (native)
     - ``sql_engine`` must be a sync ``Engine`` (or connection string) for sync
       functions and a SQLAlchemy ``AsyncEngine`` for async functions. A type
       mismatch in either direction raises ``TypeError``.
   * - **Redis**
     - Yes
     - Yes (native)
     - ``redis_client`` must be a sync client or sync callable for sync
       functions and an async callable returning a ``redis.asyncio.Redis``
       client for async functions. A mismatch in either direction raises
       ``TypeError``.
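The mismatch checks for all three native cores follow one pattern. The validator below is an illustrative stdlib-only sketch, not cachier's API (cachier performs the equivalent checks inside its decorator):

```python
import inspect


def check_match(func, getter, backend="mongo"):
    """Raise TypeError when the sync/async kinds of func and getter disagree."""
    func_async = inspect.iscoroutinefunction(func)
    getter_async = inspect.iscoroutinefunction(getter)
    if func_async and not getter_async:
        raise TypeError(
            f"Async cached functions with {backend} backend require an async getter."
        )
    if getter_async and not func_async:
        raise TypeError("Async getter requires an async cached function.")


async def afunc():  # hypothetical async cached function
    ...


def sync_getter():  # hypothetical sync client getter
    ...


try:
    check_match(afunc, sync_getter)
except TypeError as exc:
    print(type(exc).__name__)  # TypeError
```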


Contributing
============

55 changes: 53 additions & 2 deletions src/cachier/core.py
@@ -156,6 +156,13 @@ def _pop_kwds_with_deprecation(kwds, name: str, default_value: bool):
return kwds.pop(name, default_value)


def _is_async_redis_client(client: Any) -> bool:
if client is None:
return False
method_names = ("hgetall", "hset", "keys", "delete", "hget")
return all(inspect.iscoroutinefunction(getattr(client, name, None)) for name in method_names)


def cachier(
hash_func: Optional[HashFunc] = None,
hash_params: Optional[HashFunc] = None,
@@ -300,6 +307,42 @@ def cachier(

def _cachier_decorator(func):
core.set_func(func)
is_coroutine = inspect.iscoroutinefunction(func)

if backend == "mongo":
if is_coroutine and not inspect.iscoroutinefunction(mongetter):
msg = "Async cached functions with Mongo backend require an async mongetter."
raise TypeError(msg)
if (not is_coroutine) and inspect.iscoroutinefunction(mongetter):
msg = "Async mongetter requires an async cached function."
raise TypeError(msg)

if backend == "redis":
if is_coroutine:
if callable(redis_client):
if not inspect.iscoroutinefunction(redis_client):
msg = "Async cached functions with Redis backend require an async redis_client callable."
raise TypeError(msg)
elif not _is_async_redis_client(redis_client):
msg = "Async cached functions with Redis backend require an async Redis client."
raise TypeError(msg)
else:
if callable(redis_client) and inspect.iscoroutinefunction(redis_client):
msg = "Async redis_client callable requires an async cached function."
raise TypeError(msg)
if _is_async_redis_client(redis_client):
msg = "Async Redis client requires an async cached function."
raise TypeError(msg)

if backend == "sql":
sql_core = core
assert isinstance(sql_core, _SQLCore) # noqa: S101
if is_coroutine and not sql_core.has_async_engine():
msg = "Async cached functions with SQL backend require an AsyncEngine sql_engine."
raise TypeError(msg)
if (not is_coroutine) and sql_core.has_async_engine():
msg = "Async SQL engines require an async cached function."
raise TypeError(msg)

last_cleanup = datetime.min
cleanup_lock = threading.Lock()
@@ -501,8 +544,6 @@ async def _call_async(*args, max_age: Optional[timedelta] = None, **kwds):
# argument.
# For async functions, we create an async wrapper that calls
# _call_async.
is_coroutine = inspect.iscoroutinefunction(func)

if is_coroutine:

@wraps(func)
@@ -522,6 +563,14 @@ def _clear_being_calculated():
"""Mark all entries in this cache as not being calculated."""
core.clear_being_calculated()

async def _aclear_cache():
"""Clear the cache asynchronously."""
await core.aclear_cache()

async def _aclear_being_calculated():
"""Mark all entries in this cache as not being calculated asynchronously."""
await core.aclear_being_calculated()

def _cache_dpath():
"""Return the path to the cache dir, if exists; None if not."""
return getattr(core, "cache_dir", None)
@@ -541,6 +590,8 @@ def _precache_value(*args, value_to_cache, **kwds):  # noqa: D417

func_wrapper.clear_cache = _clear_cache
func_wrapper.clear_being_calculated = _clear_being_calculated
func_wrapper.aclear_cache = _aclear_cache
func_wrapper.aclear_being_calculated = _aclear_being_calculated
func_wrapper.cache_dpath = _cache_dpath
func_wrapper.precache_value = _precache_value
return func_wrapper
52 changes: 11 additions & 41 deletions src/cachier/cores/mongo.py
@@ -13,7 +13,6 @@
import warnings # to warn if pymongo is missing
from contextlib import suppress
from datetime import datetime, timedelta
from inspect import isawaitable
from typing import Any, Optional, Tuple

from .._types import HashFunc, Mongetter
@@ -68,16 +67,7 @@ def _ensure_collection(self) -> Any:

with self.lock:
if self.mongo_collection is None:
coll = self.mongetter()
if isawaitable(coll):
# Avoid "coroutine was never awaited" warnings.
close = getattr(coll, "close", None)
if callable(close):
with suppress(Exception):
close()
msg = "async mongetter is only supported for async cached functions"
raise TypeError(msg)
self.mongo_collection = coll
self.mongo_collection = self.mongetter()

if not self._index_verified:
index_inf = self.mongo_collection.index_information()
@@ -96,23 +86,17 @@ async def _ensure_collection_async(self) -> Any:
if self.mongo_collection is not None and self._index_verified:
return self.mongo_collection

coll = self.mongetter()
if isawaitable(coll):
coll = await coll
coll = await self.mongetter()
self.mongo_collection = coll

if not self._index_verified:
index_inf = self.mongo_collection.index_information()
if isawaitable(index_inf):
index_inf = await index_inf
index_inf = await self.mongo_collection.index_information()
if _MongoCore._INDEX_NAME not in index_inf:
func1key1 = IndexModel(
keys=[("func", ASCENDING), ("key", ASCENDING)],
name=_MongoCore._INDEX_NAME,
)
res = self.mongo_collection.create_indexes([func1key1])
if isawaitable(res):
await res
await self.mongo_collection.create_indexes([func1key1])
self._index_verified = True

return self.mongo_collection
@@ -144,9 +128,7 @@ async def aget_entry(self, args, kwds) -> Tuple[str, Optional[CacheEntry]]:

async def aget_entry_by_key(self, key: str) -> Tuple[str, Optional[CacheEntry]]:
mongo_collection = await self._ensure_collection_async()
res = mongo_collection.find_one({"func": self._func_str, "key": key})
if isawaitable(res):
res = await res
res = await mongo_collection.find_one({"func": self._func_str, "key": key})
if not res:
return key, None
val = None
@@ -188,7 +170,7 @@ async def aset_entry(self, key: str, func_res: Any) -> bool:
return False
mongo_collection = await self._ensure_collection_async()
thebytes = pickle.dumps(func_res)
res = mongo_collection.update_one(
await mongo_collection.update_one(
filter={"func": self._func_str, "key": key},
update={
"$set": {
@@ -203,8 +185,6 @@ },
},
upsert=True,
)
if isawaitable(res):
await res
return True

def mark_entry_being_calculated(self, key: str) -> None:
Expand All @@ -217,13 +197,11 @@ def mark_entry_being_calculated(self, key: str) -> None:

async def amark_entry_being_calculated(self, key: str) -> None:
mongo_collection = await self._ensure_collection_async()
res = mongo_collection.update_one(
await mongo_collection.update_one(
filter={"func": self._func_str, "key": key},
update={"$set": {"processing": True}},
upsert=True,
)
if isawaitable(res):
await res

def mark_entry_not_calculated(self, key: str) -> None:
mongo_collection = self._ensure_collection()
@@ -240,13 +218,11 @@ def mark_entry_not_calculated(self, key: str) -> None:
async def amark_entry_not_calculated(self, key: str) -> None:
mongo_collection = await self._ensure_collection_async()
with suppress(OperationFailure):
res = mongo_collection.update_one(
await mongo_collection.update_one(
filter={"func": self._func_str, "key": key},
update={"$set": {"processing": False}},
upsert=False,
)
if isawaitable(res):
await res

def wait_on_entry_calc(self, key: str) -> Any:
time_spent = 0
@@ -266,9 +242,7 @@ def clear_cache(self) -> None:

async def aclear_cache(self) -> None:
mongo_collection = await self._ensure_collection_async()
res = mongo_collection.delete_many(filter={"func": self._func_str})
if isawaitable(res):
await res
await mongo_collection.delete_many(filter={"func": self._func_str})

def clear_being_calculated(self) -> None:
mongo_collection = self._ensure_collection()
@@ -279,12 +253,10 @@ def clear_being_calculated(self) -> None:

async def aclear_being_calculated(self) -> None:
mongo_collection = await self._ensure_collection_async()
res = mongo_collection.update_many(
await mongo_collection.update_many(
filter={"func": self._func_str, "processing": True},
update={"$set": {"processing": False}},
)
if isawaitable(res):
await res

def delete_stale_entries(self, stale_after: timedelta) -> None:
"""Delete stale entries from the MongoDB cache."""
@@ -296,6 +268,4 @@ async def adelete_stale_entries(self, stale_after: timedelta) -> None:
"""Delete stale entries from the MongoDB cache."""
mongo_collection = await self._ensure_collection_async()
threshold = datetime.now() - stale_after
res = mongo_collection.delete_many(filter={"func": self._func_str, "time": {"$lt": threshold}})
if isawaitable(res):
await res
await mongo_collection.delete_many(filter={"func": self._func_str, "time": {"$lt": threshold}})