
feat(ai-integrations): custom prompts#12603

Draft
hamza221 wants to merge 1 commit into main from feat/edit-promptes

Conversation

Contributor

@hamza221 hamza221 commented Mar 18, 2026

fix #11285
Vibe coded

Signed-off-by: Hamza <hamzamahjoubi221@gmail.com>

Copilot AI left a comment


Pull request overview

Adds administrator-configurable prompt templates for the Mail app’s AI/LLM features, addressing #11285 by exposing default prompts, storing custom overrides, and wiring them into the existing AI task execution.

Changes:

  • Introduce default/custom prompt management in AiIntegrationsService and expose both via admin initial-state.
  • Add an admin UI section to edit/reset/save custom prompts, backed by a new settings API endpoint.
  • Update unit test expectations for additional initial-state keys.

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 5 comments.

Summary per file:

  • lib/Service/AiIntegrations/AiIntegrationsService.php: Centralizes default prompt templates, loads overrides from app config, and uses templates when building LLM tasks.
  • lib/Settings/AdminSettings.php: Provides llm_custom_prompts + llm_default_prompts via initial state for the admin settings UI.
  • src/components/settings/AdminSettings.vue: Adds UI to edit/reset prompts and persist them through the settings API.
  • src/service/SettingsService.js: Adds a client method to PUT custom prompts to the new endpoint.
  • lib/Controller/SettingsController.php: Adds a setLlmCustomPrompts() endpoint to persist per-prompt overrides.
  • appinfo/routes.php: Registers the new PUT /api/settings/llm-prompts route.
  • tests/Unit/Settings/AdminSettingsTest.php: Adjusts the initial-state call count and expected keys for the new prompt states.


Comment on lines +139 to +141
foreach ($prompts as $key => $value) {
	$this->aiIntegrationsService->setCustomPrompt($key, $value);
}
Comment on lines +132 to +143
/**
 * Update custom LLM prompts. Each key maps to a prompt config key,
 * and the value is the custom prompt string (empty string resets to default).
 *
 * @param array<string, string> $prompts
 */
public function setLlmCustomPrompts(array $prompts): JSONResponse {
	foreach ($prompts as $key => $value) {
		$this->aiIntegrationsService->setCustomPrompt($key, $value);
	}
	return new JSONResponse([]);
}
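Per the controller's docblock, the endpoint accepts a flat map of prompt keys to strings, with an empty string resetting a prompt to its default. A minimal client-side sketch of assembling that payload (the helper name and shape are illustrative, not part of this PR):

```javascript
// Build the payload for PUT /api/settings/llm-prompts.
// A prompt left equal to its default (or cleared) is sent as '' so the
// server resets it, matching the controller's documented contract.
function buildPromptsPayload(editedPrompts, defaultPrompts) {
	const payload = {}
	for (const [key, value] of Object.entries(editedPrompts)) {
		// Only a non-empty value that differs from the default is a real override
		payload[key] = (value && value !== defaultPrompts[key]) ? value : ''
	}
	return payload
}
```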
Comment on lines +150 to +160
/**
 * Get all custom prompts (only the ones that have been customized).
 *
 * @return array<string, string>
 */
public function getCustomPrompts(): array {
	$prompts = [];
	foreach (array_keys(self::PROMPT_DEFAULTS) as $key) {
		$prompts[$key] = $this->config->getAppValue(Application::APP_ID, $key, '');
	}
	return $prompts;
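getCustomPrompts() returns an entry for every known key, using '' for keys that were never customized. The effective-prompt resolution this implies (a non-empty custom value wins, otherwise the shipped default) can be sketched as follows, in JavaScript for brevity; the function name is hypothetical:

```javascript
// Resolve the effective prompt for a key: a non-empty custom override
// takes precedence, otherwise fall back to the built-in default template.
function resolvePrompt(key, customPrompts, defaultPrompts) {
	const custom = customPrompts[key]
	return (custom !== undefined && custom !== '') ? custom : defaultPrompts[key]
}
```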
Comment on lines +157 to +163
<div
	v-if="isLlmSummaryConfigured && isLlmEnabled"
	class="app-description">
	<h3>{{ t('mail', 'Custom LLM prompts') }}</h3>
	<article>
		<p>
			{{ t('mail', 'Customize the prompts used for AI-powered features. Use {body} as a placeholder for the email content and {language} for the user language code where applicable.', { body: '{body}', language: '{language}' }) }}
Comment on lines +105 to +113
/**
 * @param {object} prompts - Map of prompt keys to custom prompt strings
 * @return {Promise<void>}
 */
export async function setLlmCustomPrompts(prompts) {
	const url = generateUrl('/apps/mail/api/settings/llm-prompts')
	const resp = await axios.put(url, { prompts })
	return resp.data
}

Copilot AI left a comment


Pull request overview

Adds administrator-configurable prompt templates for the Mail app’s AI/LLM-powered features, addressing #11285 by exposing defaults + custom overrides in the admin settings UI and persisting overrides server-side.

Changes:

  • Provide default + custom prompt maps via initial state, and render/edit them in Admin Settings UI.
  • Add a new settings API endpoint and JS client call to persist custom prompt overrides.
  • Refactor AI integration service to use configurable prompt templates instead of hardcoded strings.

Reviewed changes

Copilot reviewed 7 out of 7 changed files in this pull request and generated 6 comments.

Summary per file:

  • tests/Unit/Settings/AdminSettingsTest.php: Updates initial-state expectations to include default/custom prompt state.
  • src/service/SettingsService.js: Adds a client helper to PUT custom prompt overrides to the server.
  • src/components/settings/AdminSettings.vue: Adds admin UI for viewing/editing/resetting/saving prompt templates.
  • lib/Settings/AdminSettings.php: Provides llm_custom_prompts + llm_default_prompts via initial state.
  • lib/Service/AiIntegrations/AiIntegrationsService.php: Introduces prompt keys/defaults and uses them in LLM tasks.
  • lib/Controller/SettingsController.php: Adds an endpoint to persist custom prompt overrides.
  • appinfo/routes.php: Registers the new PUT route for updating prompts.


Comment on lines +105 to +113
/**
 * @param {object} prompts - Map of prompt keys to custom prompt strings
 * @return {Promise<void>}
 */
export async function setLlmCustomPrompts(prompts) {
	const url = generateUrl('/apps/mail/api/settings/llm-prompts')
	const resp = await axios.put(url, { prompts })
	return resp.data
}
public const PROMPT_TRANSLATION = 'llm_prompt_translation';
public const PROMPT_EVENT_DATA = 'llm_prompt_event_data';

private const DEFAULT_PROMPT_SUMMARIZE = "You are tasked with formulating a helpful summary of a email message. \r\nThe summary should be in the language of this language code {language}. \r\nThe summary should be less than 160 characters. \r\nOutput *ONLY* the summary itself, leave out any introduction. \r\nHere is the ***E-MAIL*** for which you must generate a helpful summary: \r\n***START_OF_E-MAIL***\r\n{body}\r\n***END_OF_E-MAIL***\r\n";
Comment on lines +150 to +160
/**
 * Get all custom prompts (only the ones that have been customized).
 *
 * @return array<string, string>
 */
public function getCustomPrompts(): array {
	$prompts = [];
	foreach (array_keys(self::PROMPT_DEFAULTS) as $key) {
		$prompts[$key] = $this->config->getAppValue(Application::APP_ID, $key, '');
	}
	return $prompts;
Please, output *ONLY* a valid JSON string with the keys 'reply1' and 'reply2' for the reply suggestions. Leave out any other text besides the JSON! Be extremely succinct and write the replies from my point of view.
";
$prompt = str_replace('{body}', $messageBody, $this->getPrompt(self::PROMPT_SMART_REPLY));
$task = new TextProcessingTask(FreePromptTaskType::class, $prompt, 'mail,', $currentUserId);
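The service builds the final prompt by substituting {body} (and, where applicable, {language}) into the template via PHP's str_replace, which replaces every occurrence. The same substitution, sketched in JavaScript for illustration (the function name is hypothetical):

```javascript
// Substitute the {body} and {language} placeholders in a prompt template,
// mirroring the str_replace calls in AiIntegrationsService.
// replaceAll is used because str_replace replaces all occurrences.
function fillPromptTemplate(template, body, language) {
	return template
		.replaceAll('{body}', body)
		.replaceAll('{language}', language)
}
```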
Comment on lines +138 to +141
public function setLlmCustomPrompts(array $prompts): JSONResponse {
	foreach ($prompts as $key => $value) {
		$this->aiIntegrationsService->setCustomPrompt($key, $value);
	}
</article>
</div>
<div
v-if="isLlmSummaryConfigured && isLlmEnabled"
.prompt-fields {
	display: flex;
	flex-direction: column;
	gap: 16px;
Contributor


Suggested change
gap: 16px;
gap: calc(var(--default-grid-baseline) * 4);

Can you please tell Copilot to use the CSS variables?


Development

Successfully merging this pull request may close these issues.

Make AI integration prompts configurable

3 participants