[Bug]: model_supports_json is no longer supported on v3, breaking custom model setups #2263

@Voileexperiments

Description

Do you need to file an issue?

  • I have searched the existing issues and this bug is not already filed.
  • My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
  • I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.

Describe the bug

Most models do not support structured output. In that case the LLM caller (liteLLM) usually falls back to a tool call. However, the tool call leaves the response content empty; the LLM caller then tries to parse that empty content and fails.

In v2 there was a model_supports_json parameter that could be disabled to stop structured output from being requested in LLM calls, avoiding this issue. In v3 this setting is gone, so every model that doesn't support structured output fails to work.
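For reference, a sketch of how the v2 workaround looked in settings.yaml (model name and key placeholders are illustrative, not from this report):

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}
  type: openai_chat
  model: my-local-model        # placeholder: any model without structured-output support
  model_supports_json: false   # v2-only flag: skip JSON/structured-output mode in LLM calls
```

In v3 there is no equivalent flag, so the structured-output request cannot be disabled through configuration.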

Steps to reproduce

No response

Expected Behavior

No response

GraphRAG Config Used

# Paste your config here

Logs and screenshots

No response

Additional Information

  • GraphRAG Version: 3.0.5

Metadata

Labels: bug (Something isn't working), triage (Default label assignment; indicates a new issue needs review by a maintainer)
