Merged
13 changes: 9 additions & 4 deletions .github/copilot-instructions.md
@@ -64,8 +64,12 @@ DEEPSEEK_API_KEY=...
 # Full stack with Docker Compose
 docker-compose up --build
 
-# Backend only (for API development)
-cd backend && uvicorn fastapi_generate_quiz:app --reload --host 0.0.0.0 --port 8000
+# Backend only (for API development) - UV method
+cd backend && uv sync --dev --no-install-project && uv run --script dev
+
+# Testing
+uv run pytest -v              # Unit tests only (no API calls)
+uv run pytest -m integration  # Integration tests with real API calls
 
 # Access points:
 # - Frontend: http://localhost:8080
@@ -79,10 +83,11 @@ cd backend && uvicorn fastapi_generate_quiz:app --reload --host 0.0.0.0 --port 8
 
 ## Code Quality Standards
 
-- **Linting**: Uses `ruff` for Python code formatting and linting
+- **Package Management**: Uses UV for fast dependency management and virtual environments
+- **Linting**: Uses `ruff` for Python code formatting and linting (`uv run ruff check .`)
 - **Type Checking**: Expected for new Python code
 - **Testing**: Always add unit tests; integration tests for API changes
-- **Dependencies**: Keep `requirements.txt` minimal; dev dependencies in `requirements-dev.txt`
+- **Dependencies**: Managed via `pyproject.toml`; dev dependencies in `[project.optional-dependencies]`
 
 ## Common Development Tasks
 
18 changes: 14 additions & 4 deletions .github/workflows/ci_python.yml
@@ -21,13 +21,23 @@ jobs:
       with:
         python-version: "3.10"
 
+      - name: Install UV
+        run: curl -LsSf https://astral.sh/uv/install.sh | sh
+
       - name: Install dependencies
         run: |
-          # Install ruff for linting and the development requirements (including pytest)
-          pip install -r backend/requirements-dev.txt
+          source $HOME/.cargo/env
+          cd backend
+          uv sync --dev --no-install-project
 
       - name: Lint Python with ruff 🚀
-        run: ruff check backend/
+        run: |
+          source $HOME/.cargo/env
+          cd backend
+          uv run ruff check .
 
       - name: Run Pytest 🧪
-        run: pytest -q backend/tests/ -v
+        run: |
+          source $HOME/.cargo/env
+          cd backend
+          uv run pytest -q tests/ -v -m "not integration"
Code Review Feedback:

  1. Shell Commands Safety: Running curl ... | sh directly can be risky unless the source is entirely trusted. Consider downloading and inspecting the contents of the script before executing it.

  2. Efficiency Improvement: Instead of sourcing $HOME/.cargo/env repeatedly, consider sourcing it once at the beginning or manage the environment variables in a single place.

  3. Consistency: Ensure consistent formatting and handling within each step for better readability and maintainability.

  4. Dependency Syncing: Ensure that syncing dependencies with uv sync is well managed and satisfies all project requirements adequately.

  5. Testing Environment Awareness: Verify that the testing environment and setup are robust to avoid false negatives or positives within test results.

  6. Code Paths & Mapping: Confirm that paths are correctly set and correspond to the expected file locations for operations like linting and running tests.

  7. Integrating Comments: Consider adding comments to explain complex or critical steps to improve the code's understandability for future maintainers or collaborators.

  8. Error Handling: Implement error handling where necessary, especially for critical steps like dependency installation, linting, and testing phases.

  9. Security Checks: Include security checks where relevant to ensure sensitive information does not leak during any part of the process.

  10. Version Control: Be wary of hardcoded versions in commands to ensure compatibility over time; prefer using version constraints in your project configuration.

Bug Risks:

  • Dependence on external scripts without thorough validation could introduce security vulnerabilities.
  • Inadequate error checking and recovery mechanisms may lead to failed builds without clear diagnoses.
  • Path assumptions may cause failures if directories are restructured or if there are unexpected changes in the project structure.

Overall Improvement Suggestions:

  • Enhance script safety by validating external sources thoroughly before execution.
  • Optimize environment setup for efficiency and ease of maintenance.
  • Maintain consistency in command execution and structuring for clarity.
  • Test thoroughly to verify correctness across multiple scenarios.
  • Implement appropriate error handling and feedback mechanisms.

With these adjustments, you can make the CI/CD process more robust and reliable while decreasing potential vulnerabilities and enhancing code quality.
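Review points 1 and 2 could be addressed together in the workflow. One possible sketch — the pinned download path, checksum step, and use of `$GITHUB_PATH` are suggestions, not part of this PR:

```yaml
- name: Install UV (downloaded, inspected, on PATH once)
  run: |
    # Download to a file instead of piping straight into sh,
    # so the script can be checked before it runs.
    curl -LsSf -o /tmp/uv-install.sh https://astral.sh/uv/install.sh
    sha256sum /tmp/uv-install.sh   # compare against a known-good checksum
    sh /tmp/uv-install.sh
    # Put uv on PATH for all later steps, instead of re-sourcing
    # $HOME/.cargo/env in every run block.
    echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
```

With this in place, the later lint and test steps can call `uv` directly with no `source` lines.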

6 changes: 4 additions & 2 deletions .vscode/settings.json
@@ -7,8 +7,10 @@
   "azureFunctions.projectRuntime": "~4",
   "debug.internalConsoleOptions": "neverOpen",
   "python.testing.pytestArgs": [
-    "backend"
+    "backend",
+    "-v"
   ],
   "python.testing.unittestEnabled": false,
-  "python.testing.pytestEnabled": true
+  "python.testing.pytestEnabled": true,
+  "python-envs.pythonProjects": []
 }
18 changes: 15 additions & 3 deletions README.md
@@ -54,25 +54,37 @@ This project uses Docker Compose to run both the FastAPI backend and the fronten
 ### Running Locally
 
 1. **Set Environment Variables**
-   Make sure your `OPENAI_API_KEY` is set in your environment or in a `.env` file at the project root:
+   Make sure your API keys are set in your environment or in a `.env` file at the project root:
    ```sh
    export OPENAI_API_KEY=your_openai_api_key_here
+   export GEMINI_API_KEY=your_gemini_api_key_here
+   # ... other API keys
    ```
    Or create a `.env` file with:
    ```
    OPENAI_API_KEY=your_openai_api_key_here
+   GEMINI_API_KEY=your_gemini_api_key_here
+   # ... other API keys
    ```
 
-2. **Build and Run the Containers**
+2. **Build and Run with Docker Compose**
    From the project root, run:
    ```sh
    docker-compose up --build
    ```
    This command builds and starts both the backend and frontend containers.
 
+3. **Alternative: Run Backend Locally with UV**
+   For faster development iteration:
+   ```sh
+   cd backend
+   uv sync --dev
+   uv run uvicorn fastapi_generate_quiz:app --reload --host 0.0.0.0 --port 8000
+   ```
+
 3. **Access the Services**
    - **Backend API (FastAPI)**: [http://localhost:8000](http://localhost:8000)
    - **Frontend**: [http://localhost:8080](http://localhost:8080)
 
-With these steps, you can easily test both the backend API and the static frontend locally using Docker Compose.
+With these steps, you can easily test both the backend API and the static frontend locally using Docker Compose or UV for faster backend development.

13 changes: 9 additions & 4 deletions backend/Dockerfile
@@ -2,18 +2,23 @@
 # https://fastapi.tiangolo.com/deployment/docker/#dockerfile
 FROM python:3.10-slim
 
+# Install UV
+COPY --from=ghcr.io/astral-sh/uv:latest /uv /bin/uv
+
 # Set working directory
 WORKDIR /code
 
-# Copy requirements file and install dependencies
-COPY ./requirements.txt /code/requirements.txt
-RUN pip install --no-cache-dir --upgrade -r /code/requirements.txt
+# Copy project files
+COPY ./pyproject.toml /code/pyproject.toml
+
+# Install dependencies without installing the project itself
+RUN uv sync --no-dev --no-install-project
 
 # Copy application code
 COPY . /code
 
 # Command to run the application using Uvicorn
-CMD ["uvicorn", "fastapi_generate_quiz:app", "--host", "0.0.0.0", "--port", "8000"]
+CMD ["uv", "run", "uvicorn", "fastapi_generate_quiz:app", "--host", "0.0.0.0", "--port", "8000"]
 
 # docker build -t fastapi_generate_quiz:latest . # Build container
 # docker run -p 8000:8000 -e OPENAI_API_KEY fastapi_generate_quiz:latest # Run container
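The `uv sync --no-dev --no-install-project` step in this Dockerfile assumes that runtime dependencies are declared in `pyproject.toml`. A minimal sketch of the shape it expects — the package name and dependency list here are illustrative, not the project's actual ones:

```toml
[project]
name = "fastapi-generate-quiz"   # illustrative name
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "fastapi",
    "uvicorn",
]

[project.optional-dependencies]
dev = ["pytest", "ruff"]
```

With dev tools kept under `[project.optional-dependencies]`, the production image installs only the runtime list, while `uv sync --dev` pulls in the test and lint tools locally.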
72 changes: 64 additions & 8 deletions backend/README.md
@@ -34,6 +34,59 @@ Certain functions may require environment variables (e.g., `OPENAI_API_KEY`). Th
 ## Debug
 To debug locally, follow these steps:
 
+### Using UV (Recommended)
+
+1. **Install UV** (if not already installed):
+   ```sh
+   curl -LsSf https://astral.sh/uv/install.sh | sh
+   ```
+
+2. **Install dependencies**:
+   ```sh
+   cd backend
+   uv sync --dev --no-install-project
+   ```
+
+3. **Run the FastAPI application**:
+   ```sh
+   # Option 1: Using the configured script
+   uv run --script dev
+
+   # Option 2: Direct command
+   uv run uvicorn fastapi_generate_quiz:app --reload --host 0.0.0.0 --port 8000
+   ```
+
+4. **Test the endpoints** (requires valid API keys and quota):
+   ```sh
+   # Test quiz generation (will fail if no API quota)
+   curl "http://localhost:8000/GenerateQuiz?topic=UK%20History&difficulty=easy&n_questions=3"
+
+   # Test image generation
+   curl "http://localhost:8000/GenerateImage?prompt=A%20Juicy%20Burger"
+   ```
+
+5. **Run tests**:
+   ```sh
+   # Unit tests only (default - no API calls required)
+   uv run pytest -v
+
+   # Integration tests (requires API keys and quota)
+   uv run pytest -m integration
+
+   # All tests
+   uv run pytest -v --tb=short
+   ```
+
+6. **Run linting**:
+   ```sh
+   uv run ruff check .
+   uv run ruff format .
+   ```
+
+> **Note**: Direct execution of `generate_quiz.py` requires valid API keys and quota. For development without making API calls, use the unit tests (`uv run pytest -v`) which use mocked responses.
+
+### Using Docker
+
 1. **Build the Docker container**:
    ```sh
    docker build -t fastapi_generate_quiz:latest .
@@ -56,6 +109,8 @@ To debug locally, follow these steps:
    curl "http://localhost:8000/GenerateImage?prompt=Kangeroo%20Playing%20BasketBall"
    ```
 
+### Docker Registry Commands
+
 4. **Tag the Docker image for GitHub Container Registry**:
    ```sh
    docker tag fastapi_generate_quiz:latest ghcr.io/djsaunders1997/fastapi_generate_quiz:latest
@@ -78,27 +133,28 @@ Our test suite is divided into **unit tests** and **integration tests**.
 
 ### Default Behavior
 
-By default, integration tests are **excluded** from the test run. This is achieved by configuring `pytest` in our `pytest.ini` file (located in the `backend` directory):
+By default, integration tests are **excluded** from the test run. This is achieved by configuring `pytest` in our `pyproject.toml` file:
 
-```ini
-[pytest]
-markers =
-    integration: mark test as an integration test.
-addopts = -m "not integration"
+```toml
+[tool.pytest.ini_options]
+markers = [
+    "integration: mark test as an integration test."
+]
+addopts = "-m 'not integration'"
 ```
 
 This configuration tells `pytest` to skip any test marked with `@pytest.mark.integration` when you run:
 
 ```bash
-pytest -v
+uv run pytest -v
 ```
 
 ### Running Integration Tests
 
 To run the integration tests, override the default marker filter by using the `-m` option:
 
 ```bash
-pytest -m integration
+uv run pytest -m integration
 ```
 
 > **Note:** Integration tests make real API calls and require the `OPENAI_API_KEY` environment variable to be set. Make sure you have this environment variable configured before running these tests.