This repository was archived by the owner on Jun 5, 2025. It is now read-only.

[Task]: ensure that system prompts/custom prompts can be cached (prompt cache) #941

@kantord

Description

For a primer/generic info on prompt caching, you can check out this document: https://docs.google.com/document/d/1gEgkIyT1CRcTMAwYiU6ldeeosKKbWMX5HM8AdQx3xV0/edit?tab=t.0

This is especially relevant for Anthropic, where prompt caching is opt-in and requires marking messages with cache_control directives. This means that system prompts should be sent as separate messages with cache control enabled. It seems like we are already correctly sending them as separate messages:

```python
async def _construct_system_prompt(
    self,
    wrksp_custom_instr: str,
    req_sys_prompt: Optional[str],
    should_add_codegate_sys_prompt: bool,
) -> ChatCompletionSystemMessage:
    def _start_or_append(existing_prompt: str, new_prompt: str) -> str:
        if existing_prompt:
            return existing_prompt + "\n\nHere are additional instructions:\n\n" + new_prompt
        return new_prompt

    system_prompt = ""
    # Add codegate system prompt if secrets or bad packages are found at the beginning
    if should_add_codegate_sys_prompt:
        system_prompt = _start_or_append(system_prompt, self.codegate_system_prompt)
    # Add workspace system prompt if present
    if wrksp_custom_instr:
        system_prompt = _start_or_append(system_prompt, wrksp_custom_instr)
    # Add request system prompt if present
    if req_sys_prompt and "codegate" not in req_sys_prompt.lower():
        system_prompt = _start_or_append(system_prompt, req_sys_prompt)
    return system_prompt
```
at least as of today.
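To make that system prompt cacheable for Anthropic, the assembled string would need to be wrapped in a content block carrying a cache_control directive. The sketch below is illustrative, not the implementation: the helper name `make_cached_system_message` is hypothetical, while the content-block shape (`"cache_control": {"type": "ephemeral"}`) follows Anthropic's documented prompt-caching format.

```python
from typing import Any, Dict, List


def make_cached_system_message(system_prompt: str) -> Dict[str, Any]:
    """Hypothetical helper: wrap an assembled system prompt in a standalone
    system message whose content block is marked as a cache breakpoint via
    Anthropic's cache_control directive."""
    return {
        "role": "system",
        "content": [
            {
                "type": "text",
                "text": system_prompt,
                # "ephemeral" is the cache type Anthropic supports today
                "cache_control": {"type": "ephemeral"},
            }
        ],
    }


# Usage: send the cached system prompt as its own message, ahead of the
# user messages in the outgoing request.
messages: List[Dict[str, Any]] = [
    make_cached_system_message("You are CodeGate, a security-focused assistant."),
    {"role": "user", "content": "Review this dependency list."},
]
```

Keeping the system prompt in its own message (rather than merging it into the first user message) is what makes it possible to attach the directive to just the stable prefix, so the cache is not invalidated by per-request user content.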

