Welcome to the Generative UI Global Hackathon: Agentic Interfaces! This starter kit gives you a complete AI-powered application with durable conversation threads, an agent-driven canvas, real-world MCP integrations, and a deployable MCP App — wired up with CopilotKit, LangChain Deep Agents, Gemini, A2UI, Notion MCP (via mcp-use), Manufact, and Daytona.
GenUI.Global.Hackathon.StarterKit.intro.mp4
This is a starter template for building agentic interfaces using Generative UI. It provides a modern Next.js application with an integrated LangGraph Deep Agent that manages a visual canvas of interactive cards with real-time AI synchronization and external tool integrations (a Notion "Leads" database, for this example) through MCP. A second deployable MCP server, built on mcp-use, gives the agent a third surface that runs natively in Claude or ChatGPT.
This is an example application that we built to help you get started quickly. Everything you see can be customized, replaced, augmented, or built upon.
StarterKit.mp4
- Persistent threads. Every conversation is named, listed in the sidebar, and survives reloads, restarts, and resumes mid-run.
- Agent-driven canvas. Lead cards, follow-up notes, and pipeline charts that the AI can create, edit, and organize while you watch.
- Real integrations via MCP. Notion Leads database sync out of the box; swap to any other MCP server with one config edit.
- Deployable MCP server. A third agent surface that runs in Claude or ChatGPT, deployable with one command.
- Generative UI primed. Stream Gemini-rendered components without re-plumbing.
"Generative UI" describes any AI-driven interface that the agent chooses, composes, or writes at runtime. Approaches sit on a spectrum — from more control on one end to more flexibility on the other — and most real apps mix several tiers.
The highest level of control. The developer provides the agent with a set of predefined React components, and the agent selects the appropriate one and populates it with props. This ensures the interface stays on-brand and pixel-perfect, making it ideal for standard, repeatable application workflows. See Display Components in the CopilotKit docs.
Utilizing the A2UI specification, this method uses a schema to map agent outputs to a catalog of renderers. It offers a balance between control and flexibility, allowing the agent to handle more varied UI layouts without requiring a unique tool for every single component. It is particularly effective for the "long tail" of user interactions. See A2UI in the CopilotKit docs.
The "Wild West" of generative UI — the agent generates raw HTML that is rendered within a secure, sandboxed double-iframe. While it is the most flexible — enabling the creation of disposable, data-grounded interfaces on the fly — it is the hardest to style consistently and can behave unpredictably. See opengenerativeui.copilotkit.ai for a live demo, and the CopilotKit docs on MCP Apps and Open Generative UI.
This kit is wired for all three: the canvas surface uses controlled cards for lead entities, A2UI streams declarative components from Gemini, and the deployable MCP server in apps/mcp/ extends the same agent into Claude and ChatGPT's open-ended generative UI surface.
Go deeper:
- 🎥 Talk — The Generative UI spectrum
- 📝 Article — CopilotKit on Generative UI
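To make the middle tier concrete, here is an illustrative catalog dispatcher in Python — the component names and payload shape are hypothetical for this sketch, not the actual A2UI wire format:

```python
# Illustrative sketch of catalog-based rendering: the agent emits a
# declarative component description, and a renderer catalog maps the
# component type to a function that knows how to paint it. The names
# and payload shape here are hypothetical, not the real A2UI schema.

def render_lead_card(props: dict) -> str:
    return f"<LeadCard name={props['name']!r} stage={props['stage']!r} />"

def render_note(props: dict) -> str:
    return f"<Note text={props['text']!r} />"

# The catalog: every component the agent may emit must be registered
# here, which is what keeps this tier more controlled than raw HTML.
CATALOG = {
    "lead_card": render_lead_card,
    "note": render_note,
}

def render(component: dict) -> str:
    renderer = CATALOG.get(component["type"])
    if renderer is None:
        # Unknown components are rejected rather than executed -- the
        # agent can only compose from the approved catalog.
        raise ValueError(f"unknown component type: {component['type']}")
    return renderer(component.get("props", {}))

if __name__ == "__main__":
    print(render({"type": "lead_card",
                  "props": {"name": "Ada", "stage": "contacted"}}))
```

The trade-off the tier descriptions above describe falls out of the `CATALOG` dict: the agent can compose any layout from registered components (flexibility), but nothing outside the catalog ever renders (control).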
CopilotKit connects your app's logic, state, and user context to the AI agents that deliver the animated and interactive part of your app experience — across both embedded UIs and fully headless interfaces. The kit ships with CopilotKit Intelligence wired in, giving you durable conversation threads (Postgres-backed), a runtime that bridges your frontend to any LangGraph agent, and built-in support for generative UI and MCP App composition.
LangChain Deep Agents is a Python framework that gives an LLM agent built-in planning, sub-agent dispatch, a virtual filesystem, and a TODO loop — the patterns popularized by Claude Code and Manus, packaged as a create_deep_agent(...) call on top of LangGraph. The kit uses Deep Agents as the brain behind the canvas: a single prompt like "import the workshop leads and draft outreach to the top 5" triggers a multi-step plan that the agent executes tool-by-tool while you watch the cards appear.
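The plan-then-execute TODO loop can be illustrated with a toy sketch — pure Python, not the deepagents API (in the kit the planner, loop, and tools all come packaged from `create_deep_agent(...)`):

```python
# A stripped-down illustration of the plan-then-execute TODO loop that
# Deep Agents packages for you. This is NOT the deepagents API -- in the
# kit the whole loop comes from create_deep_agent(...) on top of
# LangGraph; this sketch only shows the shape of the pattern.

def plan(goal: str) -> list[str]:
    # In the real agent the LLM writes this plan; hard-coded here.
    return [
        "fetch leads from the Notion MCP server",
        "create a canvas card per lead",
        "draft outreach for the top 5",
    ]

def execute(task: str) -> str:
    # Stand-in for a tool call (MCP request, canvas mutation, ...).
    return f"done: {task}"

def run(goal: str) -> list[str]:
    todos = plan(goal)
    results = []
    while todos:
        task = todos.pop(0)            # take the next TODO
        results.append(execute(task))  # execute it tool-by-tool
    return results

if __name__ == "__main__":
    for line in run("import the workshop leads and draft outreach to the top 5"):
        print(line)
```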
Gemini 3.1 Flash-Lite is Google's high-volume workhorse in the Gemini 3 family — fast, cheap, and tool-calling-capable. The kit defaults to gemini-3.1-flash-lite for chat — pick up an API key from Google AI Studio, drop it into .env, and you're done. Need a more reasoning-heavy model? Swap to Gemini 3 Pro Preview or Gemini 3 Flash with a one-line edit in apps/agent/src/runtime.py (_gemini_llm). Swapping to OpenAI, Anthropic, or any other LangChain-supported model is also a one-line edit (see Switching to a different model).
A2UI is a protocol for agent-driven interfaces — it lets AI agents generate rich, interactive UI that renders natively across web, mobile, and desktop without executing arbitrary code. That sandboxed-by-default model pairs well with the kit's generative UI surface: Gemini emits A2UI components, the renderer paints them, and the agent never ships executable code to the client. Browse the custom catalog for component examples.
The kit ships with a Notion Leads database demo wired through the official Notion MCP server (@notionhq/notion-mcp-server), called from Python via mcp-use. MCP is the open protocol for connecting LLMs to tools — Anthropic publishes it, and Notion ships a first-party server. Swap to any other MCP server (Linear, Slack, GitHub, Google Drive, …) by changing one config dict in apps/agent/src/notion_mcp.py and updating the prompt's INTEGRATION_PROMPT.
The kit's apps/mcp/ package is an MCP server built with mcp-use, an open-source TypeScript framework for building MCP servers and MCP Apps. `npm run dev:mcp` gives you a full development environment with a local Inspector and hot reload for quick iteration. Deploy the server to Manufact Cloud with `npm run -w mcp deploy`.
Daytona is a secure and elastic infrastructure runtime for AI-generated code execution and agent workflows. Sandboxes spin up in under 90ms with full isolation — dedicated kernel, filesystem, network stack, and allocated vCPU/RAM/disk — and run any Python, TypeScript, or JavaScript code. Built on OCI/Docker compatibility with stateful environment snapshots, it's a natural fit when an agent in this kit needs to execute generated code or persist a workspace across sessions. Agents and developers interact with sandboxes programmatically through Daytona's SDKs, API, and CLI.
- Run `npx @copilotkit/cli@latest init` and select Intelligence when prompted.
- Drop a Gemini API key into both `.env` and `apps/agent/.env`. Then follow Notion setup below for the integration token + database id.
- Run `npm install`, then `npm run dev` (or `npm run dev:full` to include the MCP server).
`npm run dev` runs a pre-flight check (`scripts/check-env.sh`) before booting anything — it'll fail loudly with a numbered list of any missing keys, an unreachable Notion database, or a Docker daemon that isn't running. Fix what it lists, re-run, and you're off. See dev-docs/troubleshooting.md for fixes per failure mode.
Please give us feedback on your experience with it!
The kit calls Notion through the official Notion MCP server — a standalone process spawned on demand via npx -y @notionhq/notion-mcp-server. Auth is a single Notion integration token plus an explicit per-database share. No global install, no OAuth flow, no third-party broker.
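The shape of that spawn-on-demand setup follows the common MCP stdio-server config pattern (command + args + env). A hedged sketch — the actual dict in apps/agent/src/notion_mcp.py may differ, and `NOTION_TOKEN` mirrors the env var used in setup below:

```python
import os

# Sketch of a config dict telling an MCP client (mcp-use, here) how to
# spawn the official Notion MCP server on demand. The exact dict in
# apps/agent/src/notion_mcp.py may differ; this is the standard
# "command + args + env" stdio-server form.
def notion_mcp_config() -> dict:
    return {
        "mcpServers": {
            "notion": {
                "command": "npx",
                "args": ["-y", "@notionhq/notion-mcp-server"],
                "env": {
                    # Single integration token -- no OAuth flow needed.
                    "NOTION_TOKEN": os.environ.get("NOTION_TOKEN", ""),
                },
            }
        }
    }

# Swapping to another MCP server (Linear, Slack, GitHub, ...) means
# replacing the "notion" entry with that server's command and env.
```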
The kit is wired against an "AI Workshop Provider Community" lead-form database. The fastest path is to duplicate the public sample into your own workspace; you can also re-import a CSV/ZIP if you'd rather start from a snapshot.
1. Get the database into your workspace.
- Option A — duplicate the public sample (recommended). Open the public template: AI Workshop Provider Community. In the top-right of the page, click the Duplicate icon (two overlapping squares, next to the share icon and the `…` menu) and pick a destination workspace — schema, views, and seed rows all come along. Bookmark the URL of the duplicated copy; you'll need its database id in step 3.
- Option B — re-import the bundled snapshot. In Notion, Settings → Workspace → Import → Notion (CSV/ZIP) and upload `data/notion-leads-sample/ai-workshop-provider-community.zip`. A quick-look CSV lives next to it at `ai-workshop-provider-community.csv`.
2. Create an integration and share it with the database.
- Go to notion.so/profile/integrations/internal → New integration → name it (e.g. "genai-starterkit") → copy the Internal Integration Token (starts with `ntn_…` or `secret_…`). Bookmark this page — it's also where you'll come back to rotate the token or audit which databases the integration can see.
- Open the duplicated database in Notion. Click the `…` menu in the top-right → Connections (count badge will read `0`) → Add connection → pick the integration you just created. The panel will flip to Active connections with your integration listed.
Notion's permission model is per-database — a fresh integration token sees zero databases until it's been shared into them. Forgetting this share step is the most common point of failure. If `npm run dev` boots cleanly but `Import the leads` fails with "object not found", come back here.
Learn more: Notion's Getting started with the Notion API covers integration types, the per-database share model, and the API surface the official MCP server wraps.
3. Paste the credentials into .env.
Pull the database id from the URL of your duplicated copy: it's the 32-char hex string between the workspace slug and the ?v= query (e.g. a274791c4e1e826d882d01562af74de9).
Paste both into apps/agent/.env (and .env at the repo root):
```
NOTION_TOKEN=<paste the Internal Integration Token>
NOTION_LEADS_DATABASE_ID=<paste the database id from its Notion URL>
```

4. Restart the agent.

Run `npm run dev`, then try: "Import the workshop leads."
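If you'd rather script the id extraction than eyeball the URL, here is an illustrative helper (the kit itself just reads `NOTION_LEADS_DATABASE_ID` from `.env`; the URL shape and helper are assumptions for this sketch):

```python
import re

def extract_database_id(url: str) -> str:
    """Pull the 32-char hex database id out of a Notion database URL.

    Notion database URLs typically look like
    https://www.notion.so/<workspace-slug>/<Title>-<32 hex chars>?v=...
    """
    # Drop any fragment, then grab the first 32-hex-char run that is
    # followed by the query string or the end of the URL.
    match = re.search(r"([0-9a-f]{32})(?:\?|$)", url.split("#")[0])
    if match is None:
        raise ValueError(f"no database id found in: {url}")
    return match.group(1)

if __name__ == "__main__":
    url = "https://www.notion.so/acme/Leads-a274791c4e1e826d882d01562af74de9?v=1234"
    print(extract_database_id(url))  # a274791c4e1e826d882d01562af74de9
```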
Need the manual / Docker-free path, or want to swap Notion for a different MCP server (Linear, Slack, GitHub, …)? See dev-docs/setup.md.
The kit ships with skills pre-installed for Cursor, Claude Code, and any agent reading .agent/. Open the project in your coding tool and they're picked up automatically — no extra setup. They teach your coding agent CopilotKit's v2 API surface, MCP server / MCP App authoring patterns, and this kit's own conventions.
.
├── .agent/skills/ ← agent-tool-agnostic (read by any agent following the AGENTS.md convention)
├── .claude/skills/ ← Claude Code
└── .cursor/skills/ ← Cursor
Each directory carries the same set of 11 skills:
- CopilotKit (8): `copilotkit-{setup, develop, integrations, debug, upgrade, contribute, agui, self-update}` — from CopilotKit/skills.
- MCP (3): `mcp-builder`, `mcp-apps-builder`, `chatgpt-app-builder` — from the Manufact reference. They cover authoring an MCP server (the open protocol Anthropic publishes for wiring LLMs to external tools — the same protocol the kit's Notion integration uses) and packaging it as an MCP App that runs natively in Claude or ChatGPT.
To update the CopilotKit skills to the latest upstream:
```
npx skills add copilotkit/skills --full-depth -y
```

CopilotKit also exposes a hosted MCP server that gives your coding agent live access to the latest CopilotKit reference material — handy when the checked-in skills lag upstream or you want to ask the docs questions interactively.
MCP endpoint: https://mcp.copilotkit.ai/mcp
Claude Web (Anthropic's web app — attaches MCP servers via Connectors):
- Open Claude, click your user in the bottom-left of the chat box, and select Settings.
- In the left-hand menu, select Connectors (or jump straight to the Connectors settings page).
- Click Add custom connector.
- Name: `CopilotKit`
- URL: `https://mcp.copilotkit.ai/mcp`
- Click Add.
Setup for Claude Code, Cursor, ChatGPT, and other coding agents is documented at docs.copilotkit.ai/coding-agents.
Reference docs: CopilotKit Coding Agents · CopilotKit Skills repo · Agent Skills standard.
Deeper guides live in dev-docs/:
- Setup · Model switching · MCP server
- Architecture · Customization · Threads / Intelligence
- Scripts · Demo prompts · Troubleshooting
MIT.
Built for the Generative UI Global Hackathon: Agentic Interfaces.

