fix(types): correct prompt_cache_retention value from in-memory to in_memory #1783

Open

s-zx wants to merge 1 commit into openai:master from s-zx:fix/1756-prompt-cache-retention-type
Conversation

@s-zx s-zx commented Mar 21, 2026

Summary

Fixes #1756

The OpenAI API documentation specifies `in_memory` (underscore) as the valid value for `prompt_cache_retention`, but the SDK type definitions were using `in-memory` (hyphen), causing a type mismatch for developers following the official docs.

Changes

Updated the `prompt_cache_retention` type union in all four locations across two files:

  • `src/resources/responses/responses.ts` (3 occurrences)
  • `src/resources/chat/completions/completions.ts` (1 occurrence)

Before:

prompt_cache_retention?: 'in-memory' | '24h' | null;

After:

prompt_cache_retention?: 'in_memory' | '24h' | null;

This brings the SDK types in line with the official API documentation, which lists `in_memory` as the correct value.
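To illustrate the effect of the fix, here is a minimal sketch of the corrected union. The `ChatCompletionParamsSketch` interface below is a hypothetical stand-in for the SDK's real request-params type, trimmed to the relevant field; only the `prompt_cache_retention` union itself comes from this PR.

```typescript
// Hypothetical minimal stand-in for the SDK's request-params type;
// only the union below reflects the actual change in this PR.
type PromptCacheRetention = 'in_memory' | '24h' | null;

interface ChatCompletionParamsSketch {
  model: string;
  prompt_cache_retention?: PromptCacheRetention;
}

// With the fix, the value documented in the official API docs type-checks.
// Before the fix, 'in_memory' was rejected and only 'in-memory' compiled.
const params: ChatCompletionParamsSketch = {
  model: 'gpt-4o',
  prompt_cache_retention: 'in_memory',
};

console.log(params.prompt_cache_retention); // → in_memory
```

With the old union, the assignment above would fail to compile for anyone copying `'in_memory'` from the API reference, which is exactly the mismatch reported in #1756.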

…_memory

The OpenAI API documentation specifies `in_memory` (underscore) as the valid
value for `prompt_cache_retention`, but the SDK was using `in-memory` (hyphen).

This commit updates all four type definitions in responses.ts and completions.ts
to use the correct `in_memory` value, matching the official API docs.
@s-zx s-zx requested a review from a team as a code owner March 21, 2026 22:49

Development

Successfully merging this pull request may close these issues.

Prompt Caching retention policy value mismatch