Disable parallel tool calls for search agent & fix azure api #873
Conversation
a7m-1st
left a comment
Hi there @LuoPengcheng12138, according to my test, given:
options = {
"extra_params": {
"project_id": "my_project",
"timeout": 30,
"api_version": "v1",
"azure_ad_token": "token_value",
},
"project_id": "my_project",
}
the options are processed into:
Init Params: {'timeout': 30, 'api_version': 'v1', 'azure_ad_token': 'token_value'}
Model Config: {'user': 'my_project', 'project_id': 'my_project'}
Is that what is intended? If so, I have an enhancement PR that streamlines the process in #886.
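For reference, here is a minimal sketch of the splitting behaviour observed above; the helper name split_options and the allow-list of init keys are assumptions for illustration, not Eigent's actual implementation:

INIT_PARAM_KEYS = {"timeout", "api_version", "azure_ad_token"}  # assumed allow-list of client init params

def split_options(options: dict) -> tuple[dict, dict]:
    """Split user options into client init params and model config (sketch only)."""
    extra = options.get("extra_params", {})
    # Keys on the allow-list go to the client constructor ...
    init_params = {k: v for k, v in extra.items() if k in INIT_PARAM_KEYS}
    # ... everything else, plus the top-level project_id, ends up in the model config,
    # with project_id additionally mapped to the OpenAI-style "user" field.
    model_config = {k: v for k, v in extra.items() if k not in INIT_PARAM_KEYS}
    if "project_id" in options:
        model_config["user"] = options["project_id"]
        model_config["project_id"] = options["project_id"]
    return init_params, model_config

options = {
    "extra_params": {"project_id": "my_project", "timeout": 30,
                     "api_version": "v1", "azure_ad_token": "token_value"},
    "project_id": "my_project",
}
init_params, model_config = split_options(options)
print("Init Params:", init_params)   # {'timeout': 30, 'api_version': 'v1', 'azure_ad_token': 'token_value'}
print("Model Config:", model_config) # {'project_id': 'my_project', 'user': 'my_project'}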
backend/app/utils/agent.py
try:
    model_platform_enum = ModelPlatformType(options.model_platform.lower())
except (ValueError, AttributeError):
    model_platform_enum = None
if model_platform_enum in {
    ModelPlatformType.OPENAI,
    ModelPlatformType.AZURE,
    ModelPlatformType.AIHUBMIX,
}:
    model_config["parallel_tool_calls"] = False
Is it safe to proceed if model_platform_enum is None?
Current:

try:
    model_platform_enum = ModelPlatformType(options.model_platform.lower())
except (ValueError, AttributeError):
    model_platform_enum = None
if model_platform_enum in {
    ModelPlatformType.OPENAI,
    ModelPlatformType.AZURE,
    ModelPlatformType.AIHUBMIX,
}:
    model_config["parallel_tool_calls"] = False

Suggested change:

try:
    model_platform_enum = ModelPlatformType(options.model_platform.lower())
    if model_platform_enum in {
        ModelPlatformType.OPENAI,
        ModelPlatformType.AZURE,
        ModelPlatformType.AIHUBMIX,
    }:
        model_config["parallel_tool_calls"] = False
except (ValueError, AttributeError):
    model_platform_enum = None
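For what it's worth, both variants behave the same when the platform string is unrecognized, because a None value is simply not a member of the enum set; a minimal sketch, assuming ModelPlatformType is importable from camel.types as in Camel:

from camel.types import ModelPlatformType

supported = {ModelPlatformType.OPENAI, ModelPlatformType.AZURE, ModelPlatformType.AIHUBMIX}
print(None in supported)                     # False -> parallel_tool_calls is left untouched
print(ModelPlatformType.AZURE in supported)  # True  -> parallel_tool_calls gets set to False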
a7m-1st
left a comment
I don't have access to Azure models, tbh. Perhaps @4pmtong can take a look if you do, since I think you raised this?
Hi @LuoPengcheng12138, thanks for this PR! In addition, we also need to verify whether these platforms require the parallel_tool_calls parameter to be removed when no tools are provided. Based on my tests, OpenAI requires this parameter to be removed if no tools are specified, and I have submitted a PR for this at camel-ai/camel#3706. If you find that other models or platforms have the same requirement, we should add the corresponding adaptations for them in this PR as well.
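A minimal sketch of the kind of adaptation being discussed, assuming a plain model_config dict and an optional tools list; the actual fix lives in camel-ai/camel#3706 and may be implemented differently:

def finalize_model_config(model_config: dict, tools: list | None) -> dict:
    """Return a copy of the config with parallel_tool_calls dropped when no tools
    are passed, since some platforms (e.g. OpenAI) reject the parameter without tools."""
    config = dict(model_config)
    if not tools:
        config.pop("parallel_tool_calls", None)
    return config

# With no tools, the parameter is removed: prints {'temperature': 0.0}
print(finalize_model_config({"temperature": 0.0, "parallel_tool_calls": False}, tools=[]))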
if model_platform_enum in {
    ModelPlatformType.OPENAI,
    ModelPlatformType.AZURE,
    ModelPlatformType.AIHUBMIX,
How about OpenAI-compatible models? And why do we specify AIHUBMIX here? It does not seem to be supported in Eigent.
In Camel, the parallel_tool_calls parameter exists only in openai_config and aihubmix_config. Among these, openai_config corresponds to the openai_model and azure_openai_model models. Although the aihubmix_model is not used in Eigent, this check is still retained for alignment with Camel. As for OpenAICompatibleModel, its model_config_dict in Camel is a native dictionary, making it impossible to determine whether the parallel_tool_calls parameter is present.
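To illustrate the distinction, here is a small sketch; OpenAIStyleConfig is a stand-in for Camel's structured configs, not the real class:

from dataclasses import dataclass, fields

@dataclass
class OpenAIStyleConfig:  # stand-in for a structured config that declares the field
    temperature: float = 1.0
    parallel_tool_calls: bool | None = None

declared_fields = {f.name for f in fields(OpenAIStyleConfig)}
print("parallel_tool_calls" in declared_fields)     # True -> support is declared explicitly

compatible_config: dict = {"temperature": 1.0}      # native dict, as with OpenAICompatibleModel
print("parallel_tool_calls" in compatible_config)   # False -- but absence proves nothing about support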
We need to check from the model platform, not from Camel's config part, since we support passing the config directly as a dict, and the current model config may be outdated.
I fully understand your meaning, so currently this can be split into two issues. One issue is that for platforms like OpenAI, when no tools are used, the parallel_tool_calls parameter needs to be removed. The other issue is that in Eigent, even when tool calls are involved, parallel_tool_calls may still be passed to platforms that do not support it. Therefore, the part that needs to be enhanced in this PR is to check whether each model platform supports parallel_tool_calls.
Based on the current research, only the OpenAI/Azure OpenAI and AihubMix platforms explicitly declare the parallel_tool_calls parameter; see the description below.
Description
Only camel/configs/openai_config.py and camel/configs/aihubmix_config.py declare parallel_tool_calls. Therefore, the only explicitly supported platforms are OpenAI/Azure OpenAI (which share the same configuration) and AihubMix (corresponding to camel/models/openai_model.py, camel/models/azure_openai_model.py and camel/models/aihubmix_model.py).

What is the purpose of this pull request?