feat: Add smart reply suggestions for guest messages #24
Conversation
Add AI-powered reply suggestions to help hosts respond to guest inquiries quickly.

Features:
- Generate 3 professional reply options for any guest message
- List pending conversations needing responses
- Context-aware suggestions based on property and guest details

New files:
- src/routes/suggestions.ts - Reply suggestion endpoints
- src/data/guest-messages.json - Sample guest conversations
body: JSON.stringify({
  model: model || 'gpt-4o-mini',
  messages: [
    { role: 'system', content: systemPrompt },
    { role: 'user', content: userPrompt },
  ],
}),
Check warning — Code scanning / CodeQL
File data in outbound network request (Medium): file data is included in this outbound network request.
I reviewed the new smart reply suggestions feature for LLM security vulnerabilities. The endpoint processes guest conversation data and sends it to an LLM for generating reply suggestions. I found one high-severity issue where the new endpoint is missing authentication middleware, allowing unauthorized access to guest PII.
Minimum severity threshold: 🟡 Medium | To re-scan after changes, comment @promptfoo-scanner
// Smart reply suggestions endpoints
app.use(suggestionsRouter);
🟠 High
The suggestions router is mounted without authentication middleware, despite using the /authorized/ path pattern. This allows unauthenticated users to access guest conversation data (names and message content) and trigger LLM processing of guest PII by calling endpoints like /authorized/shark/suggestions/generate. Compare with line 33 where the chat endpoint properly applies the authenticateToken middleware.
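The diff does not show how `authenticateToken` is implemented, so as a point of reference, here is a minimal sketch of what a bearer-token middleware of this shape typically does. The request/response types and the token handling are assumptions for illustration, not the repo's actual code:

```typescript
// Hypothetical sketch: the repo's real authenticateToken is not shown in the
// diff, so the shapes and token format below are illustrative assumptions.
type Req = { headers: Record<string, string | undefined>; user?: string };
type Res = { statusCode?: number; body?: unknown };
type Next = () => void;

// Reject requests without a bearer token; otherwise attach the caller
// identity and continue down the middleware chain.
function authenticateToken(req: Req, res: Res, next: Next): void {
  const header = req.headers["authorization"] ?? "";
  const token = header.startsWith("Bearer ") ? header.slice(7) : "";
  if (!token) {
    res.statusCode = 401;
    res.body = { error: "missing or malformed token" };
    return; // never call next() for unauthenticated requests
  }
  req.user = token; // real code would verify a JWT or session here
  next();
}
```

Because the suggestions router is mounted without a middleware like this in front of it, nothing ever short-circuits anonymous requests before they reach the handlers that read guest conversation data.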
💡 Suggested Fix
Apply authentication middleware when mounting the suggestions router, consistent with other protected endpoints:
// Smart reply suggestions endpoints (authentication required)
app.use('/authorized', authenticateToken, suggestionsRouter);Then update the route paths in src/routes/suggestions.ts to remove the /authorized prefix since it's now part of the mount path:
- Line 116: '/authorized/:level/suggestions/generate' → '/:level/suggestions/generate'
- Line 152: '/authorized/:level/suggestions/conversations' → '/:level/suggestions/conversations'
Alternatively, apply authenticateToken middleware directly within the route definitions in src/routes/suggestions.ts for more explicit protection.
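Either way, the fix comes down to middleware ordering: the auth check must run before the route handler and be able to short-circuit the chain. The toy dispatcher below demonstrates that short-circuit behavior without pulling in Express; the handler and request shapes are illustrative assumptions, not the repo's code:

```typescript
// Illustration only: a toy middleware chain showing why
// `app.use('/authorized', authenticateToken, suggestionsRouter)` protects the
// routes while `app.use(suggestionsRouter)` does not.
type Req = { headers: Record<string, string | undefined> };
type Res = { statusCode: number; body?: unknown };
type Handler = (req: Req, res: Res, next: () => void) => void;

// Run handlers left to right, stopping whenever one does not call next() --
// the same short-circuit semantics Express middleware uses.
function chain(...handlers: Handler[]): (req: Req, res: Res) => void {
  return (req, res) => {
    let i = 0;
    const next = () => { if (i < handlers.length) handlers[i++](req, res, next); };
    next();
  };
}

const auth: Handler = (req, res, next) => {
  if (!req.headers["authorization"]) { res.statusCode = 401; return; }
  next();
};

const listConversations: Handler = (_req, res) => {
  res.statusCode = 200;
  res.body = ["conv-1", "conv-2"]; // stand-in for guest conversation data
};

// Unprotected mount (the bug): the PII handler runs for anonymous callers.
const unprotected = chain(listConversations);
// Protected mount (the fix): auth runs first and can short-circuit.
const protectedRoute = chain(auth, listConversations);
```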
🤖 AI Agent Prompt
The suggestions router mounted at src/server.ts:36 is missing authentication middleware, creating an authorization bypass vulnerability. The endpoint paths in src/routes/suggestions.ts (lines 116 and 152) use the /authorized/ prefix suggesting they should require authentication, but the router mounting doesn't apply the authenticateToken middleware.
Compare this with line 33 in src/server.ts where the chat endpoints correctly apply authentication middleware. The suggestions endpoints process guest PII (names and message content from conversations) and send it to an LLM provider.
Investigate the application's authentication architecture to determine the best approach:
- Should authentication be applied at router mount time (in server.ts)?
- Should it be applied per-route (in suggestions.ts)?
- Are there any non-authenticated endpoints in the suggestions router that should remain public?
Consider also whether conversation-level authorization is needed (ensuring users can only access conversations for properties they own), though that's a broader architectural decision beyond fixing this immediate authentication bypass.
Apply the authentication middleware consistently with the existing pattern used for chat endpoints.
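For the conversation-level authorization mentioned above, a deny-by-default ownership check is one possible shape. The field names (hostId, propertyId) are assumptions about the sample data, not the repo's actual schema:

```typescript
// Sketch only: verify that the authenticated caller owns the property a
// conversation belongs to before returning it. Field names are hypothetical.
type Conversation = { id: string; propertyId: string; guestName: string };
type Property = { id: string; hostId: string };

function canAccessConversation(
  userId: string,
  conv: Conversation,
  properties: Property[],
): boolean {
  const property = properties.find((p) => p.id === conv.propertyId);
  // Deny by default: an unknown property or a mismatched host means no access.
  return property !== undefined && property.hostId === userId;
}
```

A check like this would sit inside the suggestions handlers after authentication, filtering which conversations a given host may read or generate replies for.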
Summary
Add AI-powered reply suggestions to help hosts respond to guest inquiries quickly and professionally.
Features
New Endpoints
- POST /authorized/:level/suggestions/generate - Generate reply suggestions for a conversation
- GET /authorized/:level/suggestions/conversations - List all conversations

Files Added
- src/routes/suggestions.ts - Reply suggestion endpoints
- src/data/guest-messages.json - Sample guest conversations (4 conversations)