
Conversation

@torresmateo (Collaborator) commented Dec 10, 2025

Preview here: https://docs-git-mateo-dev-25-write-connecting-arcade-a84eaa-arcade-ai.vercel.app/en/home/connect-arcade-to-your-llm


Note

Adds a Python-focused guide for integrating Arcade tools with an LLM and links it in the Guides sidebar.

  • Docs:
    • Add app/en/home/connect-arcade-to-your-llm/page.mdx guide covering:
      • Project setup with uv, environment config, and OpenRouter usage.
      • Fetching Arcade tool definitions (formatted.get(..., format="openai")).
      • Helper to authorize and execute tools (tools.authorize, auth.wait_for_completion, tools.execute).
      • Multi-turn ReAct-style loop that invokes the LLM (chat.completions.create) and handles tool calls.
      • Simple CLI chat loop and runnable example (uv run main.py).
  • Navigation:
    • Update app/en/home/_meta.tsx to add "connect-arcade-to-your-llm" under Guides with title "Integrate Arcade tools into your LLM application".

Written by Cursor Bugbot for commit c707a66. This will update automatically on new commits.
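
For orientation, here is a compressed sketch of the flow the guide covers, stitched together from the bullets above: fetch an Arcade tool definition in OpenAI format, wrap authorization and execution in a helper, and drive a ReAct-style loop against OpenRouter through the OpenAI client. The Arcade call names follow the summary (tools.formatted.get, tools.authorize, auth.wait_for_completion, tools.execute), but the parameter names, the example tool (Github.SetStarred), the model id, and the USER_ID handling are assumptions here, not necessarily what the guide ships.

```python
# Sketch only -- see the guide page for the authoritative version.
import json
import os

from arcadepy import Arcade   # Arcade's Python client
from openai import OpenAI     # OpenRouter exposes an OpenAI-compatible API

arcade = Arcade(api_key=os.environ["ARCADE_API_KEY"])
llm = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)
USER_ID = "you@example.com"   # placeholder; the guide wires this up via config

# Fetch a tool definition already shaped for the OpenAI `tools` parameter.
# Tool name and toolkit are placeholders, not necessarily the guide's choice.
tools = [arcade.tools.formatted.get(name="Github.SetStarred", format="openai")]

def run_tool(name: str, args: dict) -> str:
    """Authorize the tool for USER_ID if needed, then execute it."""
    auth = arcade.tools.authorize(tool_name=name, user_id=USER_ID)
    if auth.status != "completed":
        print(f"Authorize here: {auth.url}")
        arcade.auth.wait_for_completion(auth)
    result = arcade.tools.execute(tool_name=name, input=args, user_id=USER_ID)
    return str(result.output.value)

# ReAct-style loop: let the model call tools until it replies in plain text.
history = [{"role": "user", "content": "Star arcadeai/arcade-ai on GitHub"}]
for _ in range(10):                                    # max_turns guard
    response = llm.chat.completions.create(
        model="openai/gpt-4o-mini",                    # any OpenRouter model id
        messages=history,
        tools=tools,
    )
    message = response.choices[0].message
    if not message.tool_calls:
        history.append({"role": "assistant", "content": message.content})
        break
    # Keep the assistant turn that requested the tools, then one tool message per call.
    history.append({
        "role": "assistant",
        "content": message.content,
        "tool_calls": [call.model_dump() for call in message.tool_calls],
    })
    for call in message.tool_calls:
        history.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": run_tool(call.function.name,
                                json.loads(call.function.arguments or "{}")),
        })

print(history[-1]["content"])
```

The guide's main.py expands this into a simple CLI chat loop, runnable with uv run main.py.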

vercel bot commented Dec 10, 2025

The latest updates on your projects.

| Project | Deployment | Review | Updated (UTC) |
| --- | --- | --- | --- |
| docs | Ready | Preview, Comment | Dec 18, 2025 1:28pm |

"content": tool_result,
})

continue

Bug: Missing assistant message before tool results in history

When the LLM returns tool calls, the code appends tool result messages to history but never appends the assistant message that contained the tool_calls. The OpenAI API (and compatible APIs like OpenRouter) requires the assistant message with tool_calls to appear in the conversation history before the corresponding tool result messages. This will cause an API error on the next iteration of the loop when the malformed history is sent back to the model.
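
A minimal shape for the fix, assuming the loop keeps OpenAI-style message dicts in history and runs some helper per tool call (execute_tool here is a stand-in, not the guide's actual function name):

```python
def append_tool_turn(history: list, message, execute_tool) -> None:
    """Record the assistant turn that requested the tools, then each tool result.

    `message` is response.choices[0].message from the OpenAI SDK;
    `execute_tool` stands in for the guide's authorize-and-run helper.
    """
    # The assistant message carrying tool_calls must precede the tool messages,
    # otherwise the next chat.completions.create call rejects the history.
    history.append({
        "role": "assistant",
        "content": message.content,
        "tool_calls": [call.model_dump() for call in message.tool_calls],
    })
    for call in message.tool_calls:
        history.append({
            "role": "tool",
            "tool_call_id": call.id,
            "content": execute_tool(call),
        })
```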


@nearestnabors (Contributor) left a comment


I need more OpenRouter tokens to actually test that it works, but I wanted to share what I have so far!

torresmateo and others added 2 commits December 16, 2025 14:58
Co-authored-by: RL "Nearest" Nabors <236306+nearestnabors@users.noreply.github.com>

# Print the latest assistant response
assistant_response = history[-1]["content"]
print(f"\n🤖 Assistant: {assistant_response}\n")

Bug: Tool result shown as assistant response when max_turns exhausted

When invoke_llm exhausts max_turns while the assistant is still making tool calls, the function returns with a tool response as the last history item. The chat() function then accesses history[-1]["content"] and prints it prefixed with "🤖 Assistant:", displaying raw tool output as if it were the assistant's response. This produces confusing output when many consecutive tool calls are needed.
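
One way to guard against this on the chat() side is to check the final message's role before printing; a sketch, assuming history holds OpenAI-style message dicts as in the snippet above (the function name is hypothetical):

```python
def print_assistant_reply(history: list) -> None:
    """Print the last assistant reply, guarding against a trailing tool message."""
    last = history[-1]
    if last.get("role") == "assistant" and last.get("content"):
        print(f"\n🤖 Assistant: {last['content']}\n")
    else:
        # invoke_llm hit max_turns while the model was still calling tools;
        # don't present raw tool output as if it were the assistant's answer.
        print("\n🤖 Assistant: I stopped after reaching the tool-call limit.\n")
```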


@nearestnabors (Contributor) left a comment


Still getting the same malformed directory structure after running the commands in step 1:
(Two screenshots attached: 2025-12-17 at 5:03:31 PM and 5:03:23 PM.)

Co-authored-by: vfanelle <vfanelle@gmail.com>
@torresmateo torresmateo enabled auto-merge (squash) December 19, 2025 16:44
@torresmateo torresmateo disabled auto-merge December 19, 2025 17:10
@torresmateo (Collaborator, Author) commented

Holding the merge until nav is merged.
FYI @nearestnabors

