Extract RubyLLM::Prompt from Agent's private prompt rendering#675

Open
kryzhovnik wants to merge 1 commit into crmne:main from kryzhovnik:extract-prompt-class
Conversation

@kryzhovnik
Contributor

Moves prompt root resolution, path construction, and ERB rendering into a standalone RubyLLM::Prompt class so prompts can be rendered without subclassing Agent.

Closes #667

What this does

Extracts RubyLLM::Prompt from Agent's private prompt_path_for and prompt_root methods. This lets users render prompts following the gem's own app/prompts/ convention without subclassing Agent:

RubyLLM::Prompt.render('friend', name: 'Andrey')

Agent's render_prompt now delegates to Prompt, preserving its existing class-name scoping and runtime context resolution.

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Required for new features

PRs for new features or enhancements without a prior approved issue will be closed.

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

AI-generated code

  • I used AI tools to help write this code
  • I have reviewed and understand all generated code (required if above is checked)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

Development

Successfully merging this pull request may close these issues.

[FEATURE] Extract RubyLLM::Prompt from Agent's existing prompt rendering internals
