Backend langchain #1526
Conversation
Pull request overview
This pull request introduces a new AI core abstraction layer using LangChain for backend AI functionality. The changes refactor the existing AI implementation to support multiple AI providers (OpenAI and AWS Bedrock) through a unified interface.
Changes:
- Adds LangChain integration with the `@langchain/core`, `@langchain/openai`, and `@langchain/aws` packages
- Introduces a new `ai-core` module with a provider abstraction and service layer
- Implements tool-based AI interactions with streaming support for database queries
- Refactors existing AI functionality to use the new abstraction layer
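To make the provider abstraction concrete, here is a minimal sketch of what such a contract might look like. Apart from `AIToolCall` and `AIToolResult`, which appear in the imports reviewed below, every name in this snippet is an illustrative assumption rather than the PR's actual API.

```typescript
// Hypothetical sketch of the ai-core provider contract (illustrative names).
// Only AIToolCall/AIToolResult are taken from the PR's imports; the rest is assumed.
export interface AIToolCall {
  name: string;
  args: Record<string, unknown>;
}

export interface AIToolResult {
  toolCallId: string;
  content: string;
}

export interface AIProvider {
  // Streams model output chunk by chunk and reports any tool calls the model requested.
  streamChat(
    messages: unknown[],
    onTextChunk: (chunk: string) => void,
  ): Promise<{ text: string; toolCalls: AIToolCall[] }>;
}

// The unified service layer picks a concrete provider (OpenAI or Bedrock) at runtime.
export class ExampleAICoreService {
  constructor(private readonly providers: Map<'openai' | 'bedrock', AIProvider>) {}

  getProvider(name: 'openai' | 'bedrock'): AIProvider {
    const provider = this.providers.get(name);
    if (!provider) {
      throw new Error(`AI provider "${name}" is not configured`);
    }
    return provider;
  }
}
```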
Reviewed changes
Copilot reviewed 23 out of 25 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| backend/package.json | Adds langchain dependencies (@langchain/core, @langchain/openai, @langchain/aws, langchain) |
| backend/yarn.lock | Lock file updates for new dependencies including langchain ecosystem packages |
| backend/src/ai-core/ai-core.module.ts | New global module providing AI core services and providers |
| backend/src/ai-core/services/ai-core.service.ts | Unified service layer for AI operations supporting multiple providers |
| backend/src/ai-core/providers/langchain-openai.provider.ts | OpenAI provider implementation using LangChain |
| backend/src/ai-core/providers/langchain-bedrock.provider.ts | AWS Bedrock provider implementation using LangChain |
| backend/src/ai-core/interfaces/ | TypeScript interfaces defining AI provider contracts |
| backend/src/ai-core/tools/ | Database query tools, validators, and prompt builders |
| backend/src/ai-core/utils/message-builder.ts | Builder pattern for constructing AI message chains |
| backend/src/entities/ai/use-cases/request-info-from-table-with-ai-v5.use.case.ts | New use case implementing streaming AI responses with tool execution loop |
| backend/src/entities/ai/ai.service.ts | Updated to use new AICoreService instead of direct Bedrock provider |
| backend/src/entities/ai/ai.module.ts | Updated module configuration to use new V5 use case |
| backend/src/app.module.ts | Imports new AICoreModule |
| backend/src/helpers/app/get-requeired-env-variable.ts | Adds getOptionalEnvVariable helper function |
| backend/src/entities/table-settings/common-table-settings/use-cases/update-table-settings.use.case.ts | Formatting changes (spaces to tabs) |
| backend/src/entities/ai/user-ai-requests-v2.controller.ts | Formatting changes (quote style normalization) |
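The V5 use case listed above couples streaming with a tool execution loop. Below is a rough, hedged sketch of how such a loop typically works, building on the hypothetical `AIProvider` interface sketched earlier; none of these identifiers are taken from the PR, and the real logic lives in `request-info-from-table-with-ai-v5.use.case.ts`.

```typescript
// Illustrative tool-execution loop: stream a model turn, run any requested tools,
// feed the results back, and repeat until the model answers without tool calls.
const MAX_TOOL_DEPTH = 5;

async function runToolLoop(
  provider: AIProvider,
  initialMessages: unknown[],
  executeTool: (call: AIToolCall) => Promise<string>,
  writeChunk: (chunk: string) => void,
): Promise<void> {
  let currentMessages = initialMessages;

  for (let depth = 0; depth < MAX_TOOL_DEPTH; depth++) {
    // Stream the assistant turn, forwarding text chunks to the HTTP response as they arrive.
    const { text, toolCalls } = await provider.streamChat(currentMessages, writeChunk);

    // No tool calls means the model produced its final answer.
    if (toolCalls.length === 0) {
      return;
    }

    // Execute each requested tool and append the results for the next model turn.
    const toolResults: AIToolResult[] = [];
    for (const call of toolCalls) {
      toolResults.push({ toolCallId: call.name, content: await executeTool(call) });
    }
    currentMessages = [
      ...currentMessages,
      { role: 'assistant', content: text, toolCalls },
      ...toolResults.map((result) => ({ role: 'tool', ...result })),
    ];
  }
}
```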
```typescript
private openaiClient: OpenAI;

constructor() {
  const apiKey = getOptionalEnvVariable('OPENAI_API_KEY');
```
Copilot AI · Jan 22, 2026
The OpenAI client is initialized with an optional API key that may be undefined. If the environment variable is not set, this could lead to runtime errors when the client is used. Consider validating that the API key exists when the provider is actually used, or document that OPENAI_API_KEY is optional if it's intended to support scenarios where OpenAI is not configured.
Suggested change:

```diff
- const apiKey = getOptionalEnvVariable('OPENAI_API_KEY');
+ const apiKey = getRequiredEnvVariable('OPENAI_API_KEY');
```
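If `OPENAI_API_KEY` is intended to stay optional, an alternative to the suggestion above is to defer the check until the provider is actually used. A minimal sketch, assuming class and method names that are not necessarily the PR's:

```typescript
import { getOptionalEnvVariable } from '../../helpers/app/get-requeired-env-variable.js';

// Sketch: keep the key optional at construction time, but fail loudly when the
// provider is invoked without it, so an unconfigured OpenAI provider does not
// break application startup. Names are illustrative assumptions.
export class ExampleOpenAIProvider {
  private readonly apiKey?: string;

  constructor() {
    this.apiKey = getOptionalEnvVariable('OPENAI_API_KEY');
  }

  private ensureConfigured(): string {
    if (!this.apiKey) {
      throw new Error('OPENAI_API_KEY is not set; the OpenAI provider is unavailable');
    }
    return this.apiKey;
  }

  async chat(prompt: string): Promise<string> {
    const apiKey = this.ensureConfigured();
    // Placeholder: a real implementation would create a LangChain ChatOpenAI
    // client with this key here and send the prompt to it.
    return JSON.stringify({ prompt, configured: apiKey.length > 0 });
  }
}
```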
```typescript
} catch {
  // Accumulate partial JSON - will be parsed when complete
```
Copilot AI · Jan 22, 2026
Empty catch block silently ignores JSON parsing errors. While the error is handled by accumulating partial JSON, consider logging this for debugging purposes or adding a comment explaining why the error is intentionally ignored.
Suggested change:

```diff
- } catch {
-   // Accumulate partial JSON - will be parsed when complete
+ } catch (error) {
+   // Accumulate partial JSON - will be parsed when complete; log for debugging purposes
+   console.debug('Failed to parse toolCallChunk.args as JSON; continuing to accumulate partial data.', {
+     args: toolCallChunk.args,
+     error,
+   });
```
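For context, streamed tool-call arguments arrive as JSON fragments that only become parseable once the last chunk lands, which is why the catch intentionally swallows intermediate failures. A hedged sketch of that accumulate-then-parse pattern (identifier names assumed, not taken from the PR):

```typescript
// Sketch: accumulate streamed JSON fragments and parse only once they form a valid document.
// `toolCallChunk` and `argsBuffer` are illustrative names, not the PR's exact identifiers.
let argsBuffer = '';

function onToolCallChunk(toolCallChunk: { args?: string }): Record<string, unknown> | undefined {
  argsBuffer += toolCallChunk.args ?? '';
  try {
    // Succeeds only after the final fragment has arrived.
    return JSON.parse(argsBuffer) as Record<string, unknown>;
  } catch {
    // Still a partial JSON string; wait for the next chunk.
    return undefined;
  }
}
```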
```typescript
} catch {
  // Failed to parse args
```
Copilot AI · Jan 22, 2026
Empty catch blocks at lines 227-229 and 351-353 silently ignore JSON parsing errors. Consider adding comments to explain why these errors are intentionally ignored or add debug logging.
Suggested change:

```diff
- } catch {
-   // Failed to parse args
+ } catch (error) {
+   // Intentionally ignore JSON parse errors for tool arguments:
+   // we fall back to an empty argument object to avoid failing the entire stream.
+   // Log at debug/warn level to aid troubleshooting without impacting runtime behavior.
+   // eslint-disable-next-line no-console
+   console.warn('Failed to parse tool call args; using empty arguments instead.', {
+     error,
+     toolName: toolCallData.name,
+     rawArgs: toolCallData.argsString,
+   });
```
```typescript
  JSON.parse(possibleJson);
  return possibleJson;
} catch (_parseErr) {
  console.error('Could not sanitize JSON, returning empty object');
```
Copilot AI · Jan 22, 2026
The console.error call is using console directly instead of a proper logging framework. Consider using the NestJS Logger or the application's logging infrastructure for consistency.
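For reference, the built-in NestJS `Logger` mentioned above could replace the direct console call. A minimal sketch, assuming the surrounding sanitizer roughly has this shape:

```typescript
import { Logger } from '@nestjs/common';

// Sketch: route the message through the NestJS Logger instead of console.error.
// The logger context string and the surrounding function shape are assumptions.
const logger = new Logger('AiCoreTools');

function sanitizeJsonOrEmpty(possibleJson: string): string {
  try {
    JSON.parse(possibleJson);
    return possibleJson;
  } catch (parseErr) {
    logger.error(
      'Could not sanitize JSON, returning empty object',
      parseErr instanceof Error ? parseErr.stack : undefined,
    );
    return '{}';
  }
}
```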
```typescript
    result = JSON.stringify({ error: `Unknown tool: ${toolCall.name}` });
  }
} catch (error) {
  result = JSON.stringify({ error: error.message });
```
Copilot AI · Jan 22, 2026
The error handler at line 256 is accessing error.message without checking if error is actually an Error object. This could cause a runtime error if a non-Error object is thrown. Consider using a type guard or optional chaining: error?.message || 'Unknown error'.
Suggested change:

```diff
- result = JSON.stringify({ error: error.message });
+ const errorMessage =
+   error instanceof Error
+     ? error.message
+     : typeof error === 'string'
+       ? error
+       : 'Unknown error';
+ result = JSON.stringify({ error: errorMessage });
```
```typescript
  }
  response.end();
} catch (error) {
  await slackPostMessage(error?.message);
```
Copilot AI · Jan 22, 2026
The error handler at line 112 is accessing error?.message which is good, but if error is null/undefined, it will pass undefined to slackPostMessage, potentially causing issues. Consider providing a fallback: error?.message || 'Unknown error occurred'.
Suggested change:

```diff
- await slackPostMessage(error?.message);
+ await slackPostMessage(error?.message || 'Unknown error occurred');
```
```typescript
  currentMessages = toolMessageBuilder.build();

  depth++;
} catch (loopError) {
```
Copilot AI · Jan 22, 2026
The catch block at line 181-183 simply rethrows the error without adding any context or cleanup. Consider adding logging or additional context about where the error occurred in the tool loop, or removing the try-catch if no specific handling is needed.
Suggested change:

```diff
  } catch (loopError) {
+   Sentry.captureException(loopError);
```
```typescript
  AIToolResult,
  AIToolCall,
} from '../interfaces/ai-provider.interface.js';
import { getOptionalEnvVariable, getRequiredEnvVariable } from '../../helpers/app/get-requeired-env-variable.js';
```
Copilot AI · Jan 22, 2026
Typo in filename: "get-requeired-env-variable.ts" should be "get-required-env-variable.ts". The word "required" is misspelled.
Suggested change:

```diff
- import { getOptionalEnvVariable, getRequiredEnvVariable } from '../../helpers/app/get-requeired-env-variable.js';
+ import { getOptionalEnvVariable, getRequiredEnvVariable } from '../../helpers/app/get-required-env-variable.js';
```
```typescript
  throw streamError;
}

for (const [index, toolCallData] of currentToolCalls.entries()) {
```
Copilot AI · Jan 22, 2026
Unused variable index.
Suggested change:

```diff
- for (const [index, toolCallData] of currentToolCalls.entries()) {
+ for (const toolCallData of currentToolCalls.values()) {
```