A chat agent with tool-calling capabilities, built with FastAPI, PostgreSQL, and the OpenAI SDK.
- Python 3.11+
- Docker & Docker Compose
- Node.js 20+ and pnpm
- uv (Python package manager)
```bash
./start.sh
```

This single command installs dependencies, prompts for your API key (provided in the email with task instructions), starts Docker, and launches the backend + frontend.
Open http://localhost:3000 to verify the chat interface loads. Press Ctrl+C to stop everything.
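The key you enter at the prompt is presumably persisted to `.env` (the troubleshooting section below expects that file to exist). Assuming the OpenAI SDK's standard variable name — verify the actual name against `backend/config.py` — the file would look roughly like:

```
OPENAI_API_KEY=sk-...
```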
```bash
uv run pytest
```

Note: 2 of 4 tests currently pass. This is expected; you'll address the failing tests during the interview.
```text
├── backend/
│   ├── main.py          # FastAPI application
│   ├── config.py        # Settings (env vars)
│   ├── db.py            # Database session setup
│   ├── models/          # SQLAlchemy models
│   ├── schemas/         # Pydantic request/response schemas
│   ├── routers/         # API endpoints
│   ├── agent/
│   │   ├── loop.py      # Agent loop (LLM <> tool execution)
│   │   ├── prompts.py   # System prompts
│   │   └── tools/       # Tool registry & implementations
│   └── services/
│       └── llm.py       # LLM client configuration
├── frontend/            # React/TypeScript chat UI
├── tests/               # Test suite
├── docker-compose.yml   # Postgres + Redis
├── start.sh             # One-command setup & run
└── pyproject.toml       # Python dependencies
```
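The `agent/tools/` directory suggests a registry pattern: tools register themselves under a name, and the agent loop dispatches LLM tool calls to them by that name. A minimal sketch of the idea — the names (`TOOLS`, `tool`, `dispatch`) are illustrative, not the repo's actual API:

```python
# Sketch of a decorator-based tool registry; check backend/agent/tools/
# for the project's real implementation.
from typing import Any, Callable

TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register fn under its function name so the agent loop can find it."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def add(a: int, b: int) -> int:
    """Example tool: add two integers."""
    return a + b

def dispatch(name: str, args: dict[str, Any]) -> Any:
    """Run the tool named in an LLM tool call with its parsed arguments."""
    if name not in TOOLS:
        raise KeyError(f"unknown tool: {name}")
    return TOOLS[name](**args)
```

In the real loop, `dispatch` would be called once per tool call in the model's response, and each result appended back to the conversation before the next LLM request.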
- Docker not starting? Make sure ports 5432 and 6379 are free.
- Backend won't start? Check that `.env` exists and `docker compose ps` shows healthy services.
- Frontend not loading? Make sure the backend is running on port 8000 (the frontend proxies API calls to it).
- Tests failing? 2 of 4 tests are expected to fail. If all 4 fail, check your Python environment (`uv sync --all-extras`).
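For the first bullet, a quick stdlib-only way to check whether those ports are already taken (a convenience sketch, not part of the repo):

```python
# Check whether the ports docker-compose.yml binds (Postgres 5432, Redis 6379)
# are already in use on localhost. Helper name is my own.
import socket

def port_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is accepting connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 on success, i.e. something is listening
        return s.connect_ex((host, port)) != 0

if __name__ == "__main__":
    for p in (5432, 6379):
        print(p, "free" if port_free(p) else "in use")
```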