feat: add Astraflow (UCloud) chat model provider #6368

Open

ucloudnb666 wants to merge 3 commits into FlowiseAI:main from ucloudnb666:feat/astraflow-1778318768

Conversation

@ucloudnb666

Description

Adds support for Astraflow (by UCloud / 优刻得) as a Chat Model provider. Astraflow is an OpenAI-compatible AI model aggregation platform that provides access to 200+ models through a unified API.

Because Astraflow exposes an OpenAI-compatible Chat Completions API, the integration is implemented as a thin wrapper around ChatOpenAI (same approach already used by ChatCometAPI, ChatOpenRouter, Deepseek, etc.) — no new SDK is added.

Changes

Three small files added, no existing files modified:

  1. packages/components/credentials/AstraflowApi.credential.ts — new credential type astraflowApi with a single astraflowApiKey password input.
  2. packages/components/nodes/chatmodels/ChatAstraflow/ChatAstraflow.ts — new chat model node ChatAstraflow that wraps ChatOpenAI and points it at the Astraflow base URL.
  3. packages/components/nodes/chatmodels/ChatAstraflow/astraflow.svg — node icon.
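Since the node is described as a thin wrapper around ChatOpenAI, the essence of the change can be sketched as a plain config-building function. This is an illustrative sketch only: `buildAstraflowFields`, `AstraflowNodeInputs`, and `AstraflowFields` are hypothetical stand-ins for the fields object the real node assembles before passing it to LangChain's ChatOpenAI constructor, not the actual PR code.

```typescript
// Hypothetical sketch of how an OpenAI-compatible wrapper node assembles
// its ChatOpenAI-style fields. Names and shapes are illustrative, not the
// actual ChatAstraflow implementation.

interface AstraflowNodeInputs {
    modelName: string
    astraflowApiKey: string
    basePath: string
    temperature?: string // Flowise number inputs arrive as strings
    maxTokens?: string
    streaming?: boolean
    baseOptions?: string // optional JSON string of extra client options
}

interface AstraflowFields {
    modelName: string
    apiKey: string
    streaming: boolean
    temperature?: number
    maxTokens?: number
    configuration: { baseURL: string; [key: string]: unknown }
}

function buildAstraflowFields(inputs: AstraflowNodeInputs): AstraflowFields {
    const fields: AstraflowFields = {
        modelName: inputs.modelName,
        apiKey: inputs.astraflowApiKey,
        streaming: inputs.streaming ?? true,
        configuration: {
            baseURL: inputs.basePath, // point the OpenAI client at Astraflow
            ...(inputs.baseOptions ? JSON.parse(inputs.baseOptions) : {})
        }
    }
    // Only set optional numeric fields when the user supplied a value,
    // so an empty input never turns into NaN.
    if (inputs.temperature) fields.temperature = parseFloat(inputs.temperature)
    if (inputs.maxTokens) fields.maxTokens = parseInt(inputs.maxTokens, 10)
    return fields
}
```

The point of the pattern is that no Astraflow-specific SDK is needed: only the base URL and credential differ from a stock ChatOpenAI node.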

Endpoints

The node exposes a Base URL input so users can switch between the two Astraflow endpoints:

  • Global (default): https://api-us-ca.umodelverse.ai/v1 — corresponds to ASTRAFLOW_API_KEY
  • China: https://api.modelverse.cn/v1 — corresponds to ASTRAFLOW_CN_API_KEY
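The endpoint mapping above can be expressed as a small lookup. This is a hedged sketch: `pickAstraflowBaseURL` and the `AstraflowRegion` type are hypothetical helpers illustrating how a Base URL input might map a region choice to the two documented endpoints; the real node simply exposes the URLs as options.

```typescript
// Illustrative helper (not from the PR): maps a region choice to the two
// Astraflow endpoints listed in the description.

type AstraflowRegion = 'global' | 'china'

const ASTRAFLOW_ENDPOINTS: Record<AstraflowRegion, string> = {
    global: 'https://api-us-ca.umodelverse.ai/v1', // pairs with ASTRAFLOW_API_KEY
    china: 'https://api.modelverse.cn/v1' // pairs with ASTRAFLOW_CN_API_KEY
}

function pickAstraflowBaseURL(region: AstraflowRegion = 'global'): string {
    return ASTRAFLOW_ENDPOINTS[region]
}
```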

Notes

  • Follows the existing OpenAI-compatible provider pattern in the codebase (closest reference: ChatCometAPI).
  • No changes to tests, docs, or the monorepo's other packages — Flowise auto-discovers nodes/credentials from these directories at build time via gulpfile.ts.
  • Inputs (Temperature, Max Tokens, Top P, Frequency / Presence Penalty, Streaming, Base Options) mirror the existing ChatCometAPI node for UX consistency.


@gemini-code-assist (bot) left a comment


Code Review

This pull request introduces the Astraflow (UCloud) integration, including a new credential class and a Chat Model node that wraps the LangChain ChatOpenAI implementation. Feedback was provided to improve the robustness of the node by handling optional temperature values to avoid NaN results, updating the token parameter to maxCompletionTokens for better alignment with newer LangChain versions, and ensuring custom headers in baseOptions are correctly passed to the underlying client.

Comment on lines +173 to +176

    configuration: {
        baseURL: basePath,
        ...parsedBaseOptions
    }

Severity: high

The baseOptions should be passed as defaultHeaders within the configuration object to ensure custom headers are correctly handled by the underlying OpenAI client, maintaining consistency with the ChatOpenAI node implementation.

Suggested change:

    // before
    configuration: {
        baseURL: basePath,
        ...parsedBaseOptions
    }

    // after
    configuration: {
        baseURL: basePath,
        defaultHeaders: parsedBaseOptions
    }

Comment on lines +143 to +149

    const obj: ChatOpenAIFields = {
        temperature: parseFloat(temperature),
        modelName,
        openAIApiKey: astraflowApiKey,
        apiKey: astraflowApiKey,
        streaming: streaming ?? true
    }

Severity: medium

The temperature input is optional. If it is missing or empty, parseFloat(temperature) will result in NaN, which can cause issues during model initialization. It is safer to only assign it if a value is present.

Suggested change:

    // before
    const obj: ChatOpenAIFields = {
        temperature: parseFloat(temperature),
        modelName,
        openAIApiKey: astraflowApiKey,
        apiKey: astraflowApiKey,
        streaming: streaming ?? true
    }

    // after
    const obj: ChatOpenAIFields = {
        modelName,
        openAIApiKey: astraflowApiKey,
        apiKey: astraflowApiKey,
        streaming: streaming ?? true
    }
    if (temperature) obj.temperature = parseFloat(temperature)

Comment on the maxTokens line (diff context shown):

        streaming: streaming ?? true
    }

    if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10)

Severity: medium

For consistency with the main ChatOpenAI node and to align with newer versions of @langchain/openai, use maxCompletionTokens instead of maxTokens.

Suggested change:

    // before
    if (maxTokens) obj.maxTokens = parseInt(maxTokens, 10)

    // after
    if (maxTokens) obj.maxCompletionTokens = parseInt(maxTokens, 10)

