89 changes: 68 additions & 21 deletions README.md
@@ -1,37 +1,84 @@
## <img src="img/logo.png" alt="drawing" width="200"/>

[![License: CC BY-NC-ND 4.0](https://img.shields.io/badge/License-CC_BY--NC--ND_4.0-lightgrey.svg)](https://creativecommons.org/licenses/by-nc-nd/4.0/)

# Blocksembler REST API

This is the Blocksembler Backend API powered by FastAPI, designed to handle and store logging events, exercises, and
automatic grading of exercise submissions.

## Getting Started

### Run from Source

To get started with the Assembler Programming Learning Environment, follow these steps:

1. **Clone the Repository**: Clone this repository to your local machine using

```bash
git clone https://github.com/Blocksembler/blocksembler-api.git
```

2. **Create and activate a new Virtual Environment**: Navigate to the project directory and execute the following commands
```bash
python -m venv .venv
source .venv/bin/activate
```
3. **Install requirements**: Install all Python packages required for this application
```bash
pip install -r requirements.txt
```

4. **Run the Application**: Start the application by running the following command:
```bash
fastapi run app/main.py --port 8081 --host=0.0.0.0
```

This will launch the backend locally. If the `DEBUG` environment variable is set to `true`, the Swagger UI can be
accessed at [http://localhost:8081/docs](http://localhost:8081/docs).

### Run from Docker Image

Blocksembler is also available on [Docker Hub](https://hub.docker.com/r/blocksembler/blocksembler-api/tags). To get
started, follow these steps:

1. **Pull the Docker Image**: Pull the blocksembler/blocksembler-api image
   `docker pull blocksembler/blocksembler-api:latest`

2. **Run the Container**: Start up a new container instance that runs the Blocksembler application:
   `docker run -d -p 8081:8081 blocksembler/blocksembler-api:latest`. This will launch the application locally. If the
   `DEBUG` environment variable is set to `true`, the Swagger UI can be accessed
   at [http://localhost:8081/docs](http://localhost:8081/docs).

### Environment Variables

#### General Settings

| Name                        | Default | Description                                               |
|-----------------------------|---------|-----------------------------------------------------------|
| `DEBUG`                     | `True`  | Runs the API in debug mode, enabling detailed error logs  |
| `BLOCKSEMBLER_ORIGINS`      | `*`     | Comma-separated list of allowed CORS origins              |
| `BLOCKSEMBLER_API_BASE_URL` | `/`     | Base URL path under which this API is served              |

#### Database Settings

| Name                  | Default                                                              | Description                                         |
|-----------------------|----------------------------------------------------------------------|-----------------------------------------------------|
| `BLOCKSEMBLER_DB_URI` | `postgresql+asyncpg://postgres:postgres@localhost:5432/blocksembler` | SQLAlchemy connection URI of the Postgres database  |

#### Message Queue Settings

| Name                                      | Default            | Description                                                                           |
|-------------------------------------------|--------------------|---------------------------------------------------------------------------------------|
| `BLOCKSEMBLER_MESSAGE_QUEUE_URL`          | `localhost`        | URL of the RabbitMQ message broker used for communication between services.          |
| `BLOCKSEMBLER_MESSAGE_QUEUE_USER`         | `blocksembler`     | Username used to authenticate with the RabbitMQ message broker.                      |
| `BLOCKSEMBLER_MESSAGE_QUEUE_PASSWORD`     | `blocksembler`     | Password used to authenticate with the RabbitMQ message broker.                      |
| `BLOCKSEMBLER_GRADING_RESPONSE_QUEUE_TTL` | `900000` (15 min)  | Time-to-live (TTL) in milliseconds for grading response queues before auto-deletion. |

## Contributing

Contributions to this project are welcome! If you have ideas for new features, improvements, or bug fixes, feel free to
open an issue or submit a pull request.

## Contact

1 change: 1 addition & 0 deletions alembic/env.py
@@ -6,6 +6,7 @@
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config

# imported so all ORM models are registered on Base.metadata before Alembic reads it
import app.db.model  # noqa
from app.db.database import DATABASE_URL, Base

# this is the Alembic Config object, which provides
1 change: 0 additions & 1 deletion alembic/script.py.mako
@@ -9,7 +9,6 @@ from typing import Sequence, Union

from alembic import op

import sqlalchemy as sa
${imports if imports else ""}

101 changes: 101 additions & 0 deletions alembic/versions/9a72ad7167bf_initial_db_schema.py
@@ -0,0 +1,101 @@
"""initial db schema

Revision ID: 9a72ad7167bf
Revises:
Create Date: 2025-10-08 10:30:47.144381

"""
from typing import Sequence, Union

from alembic import op

import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision: str = '9a72ad7167bf'
down_revision: Union[str, Sequence[str], None] = None
branch_labels: Union[str, Sequence[str], None] = None
depends_on: Union[str, Sequence[str], None] = None


def upgrade() -> None:
    """Upgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('exercise',
    sa.Column('id', sa.INTEGER(), nullable=False),
    sa.Column('title', sa.TEXT(), nullable=False),
    sa.Column('markdown', sa.TEXT(), nullable=False),
    sa.Column('coding_mode', sa.VARCHAR(length=3), nullable=False),
    sa.Column('next_exercise_id', sa.Integer(), nullable=True),
    sa.Column('allow_skip_after', sa.Integer(), nullable=True),
    sa.ForeignKeyConstraint(['next_exercise_id'], ['exercise.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_table('competition',
    sa.Column('id', sa.INTEGER(), nullable=False),
    sa.Column('name', sa.TEXT(), nullable=False),
    sa.Column('start_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('end_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('first_exercise_id', sa.INTEGER(), nullable=False),
    sa.ForeignKeyConstraint(['first_exercise_id'], ['exercise.id'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_table('tan',
    sa.Column('code', sa.VARCHAR(length=29), nullable=False),
    sa.Column('valid_from', sa.DateTime(timezone=True), nullable=True),
    sa.Column('valid_to', sa.DateTime(timezone=True), nullable=True),
    sa.Column('competition_id', sa.INTEGER(), nullable=True),
    sa.ForeignKeyConstraint(['competition_id'], ['competition.id'], ),
    sa.PrimaryKeyConstraint('code')
    )
    op.create_table('exercise_progress',
    sa.Column('id', sa.INTEGER(), nullable=False),
    sa.Column('tan_code', sa.VARCHAR(length=20), nullable=False),
    sa.Column('exercise_id', sa.INTEGER(), nullable=False),
    sa.Column('start_time', sa.DateTime(timezone=True), nullable=False),
    sa.Column('end_time', sa.DateTime(timezone=True), nullable=True),
    sa.Column('skipped', sa.BOOLEAN(), nullable=False),
    sa.ForeignKeyConstraint(['exercise_id'], ['exercise.id'], ),
    sa.ForeignKeyConstraint(['tan_code'], ['tan.code'], ),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('tan_code', 'end_time', name='unique_tan_code_and_end_time')
    )
    op.create_table('grading_job',
    sa.Column('id', sa.UUID(), nullable=False),
    sa.Column('tan_code', sa.VARCHAR(length=20), nullable=True),
    sa.Column('exercise_id', sa.INTEGER(), nullable=True),
    sa.Column('status', sa.VARCHAR(length=20), nullable=True),
    sa.Column('started', sa.DateTime(timezone=True), nullable=True),
    sa.Column('terminated', sa.DateTime(timezone=True), nullable=True),
    sa.ForeignKeyConstraint(['exercise_id'], ['exercise.id'], ),
    sa.ForeignKeyConstraint(['tan_code'], ['tan.code'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_table('logging_event',
    sa.Column('id', sa.INTEGER(), nullable=False),
    sa.Column('tan_code', sa.VARCHAR(length=20), nullable=True),
    sa.Column('timestamp', sa.DateTime(timezone=True), nullable=True),
    sa.Column('source', sa.TEXT(), nullable=True),
    sa.Column('type', sa.TEXT(), nullable=True),
    sa.Column('payload', sa.JSON(), nullable=True),
    sa.Column('exercise_id', sa.INTEGER(), nullable=True),
    sa.ForeignKeyConstraint(['exercise_id'], ['exercise.id'], ),
    sa.ForeignKeyConstraint(['tan_code'], ['tan.code'], ),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_logging_event_id'), 'logging_event', ['id'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    """Downgrade schema."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_logging_event_id'), table_name='logging_event')
    op.drop_table('logging_event')
    op.drop_table('grading_job')
    op.drop_table('exercise_progress')
    op.drop_table('tan')
    op.drop_table('competition')
    op.drop_table('exercise')
    # ### end Alembic commands ###
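To apply this revision, running `alembic upgrade head` from the project root is the standard route. For completeness, a programmatic equivalent, sketched under the assumption that the usual `alembic.ini` sits at the project root:

```python
from alembic import command
from alembic.config import Config

cfg = Config("alembic.ini")  # assumed location of the Alembic config file
command.upgrade(cfg, "head")  # applies upgrade() above
# command.downgrade(cfg, "-1")  # would revert one revision via downgrade()
```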
52 changes: 0 additions & 52 deletions alembic/versions/f4eaab963d7b_add_tan_and_logging_events_table.py

This file was deleted.

17 changes: 17 additions & 0 deletions app/api/schema/grading.py
@@ -0,0 +1,17 @@
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


class ExerciseSubmission(BaseModel):
    tan_code: str
    exercise_id: int
    solution_code: str


class GradingResult(BaseModel):
    success: bool
    penalty: datetime
    feedback: str
    hint: Optional[str]
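For reference, a minimal usage sketch of these models, assuming Pydantic v2 (the version current FastAPI releases use); every payload value below is made up for illustration:

```python
from app.api.schema.grading import ExerciseSubmission, GradingResult

# build a submission payload (values are illustrative)
submission = ExerciseSubmission(
    tan_code="ABCD-EFGH-IJKL-MNOP",
    exercise_id=1,
    solution_code="load $1, 42\nhalt",
)
print(submission.model_dump_json())

# parse a grading result as it might arrive from the grading worker
raw = '{"success": true, "penalty": "2025-10-08T10:30:00Z", "feedback": "All tests passed.", "hint": null}'
result = GradingResult.model_validate_json(raw)
assert result.success and result.hint is None
```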
71 changes: 71 additions & 0 deletions app/api/v1/grading.py
@@ -0,0 +1,71 @@
import datetime
import json
import logging
from uuid import uuid4

import amqp
from amqp import Channel
from fastapi import APIRouter, Depends, HTTPException
from fastapi import status
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from app.api.schema.grading import ExerciseSubmission
from app.config import GRADING_RESPONSE_QUEUE_TTL
from app.db.database import get_session
from app.db.model.exercise import ExerciseProgress
from app.db.model.grading import GradingJob
from app.mq.message_queue import get_mq_channel

router = APIRouter(
    prefix="/submission",
    tags=["submission"],
)


async def submit_grading_job(job_msg: dict, session: AsyncSession, ch: Channel):
    # declare a per-job response queue; x-expires lets RabbitMQ drop it after the TTL
    ch.queue_declare(queue=f'grading_response.{job_msg["job_id"]}', durable=True, arguments={
        "x-expires": GRADING_RESPONSE_QUEUE_TTL,
    })

    # record the job so its status can be tracked in the database
    session.add(GradingJob(
        id=job_msg["job_id"],
        tan_code=job_msg["tan_code"],
        exercise_id=job_msg["exercise_id"],
        status="pending",
        started=datetime.datetime.now()
    ))

    # UUIDs are not JSON-serializable, so stringify the id before publishing
    job_msg["job_id"] = str(job_msg["job_id"])
    ch.basic_publish(amqp.Message(json.dumps(job_msg)), routing_key='grading_jobs')


@router.post("/", status_code=status.HTTP_201_CREATED, response_model=str)
async def create_submission(new_submission: ExerciseSubmission, session: AsyncSession = Depends(get_session),
                            mq_channel: Channel = Depends(get_mq_channel)) -> str:
    # a submission is only accepted while the exercise is still in progress (no end_time yet)
    stmt = select(ExerciseProgress).where(ExerciseProgress.exercise_id == new_submission.exercise_id,
                                          ExerciseProgress.tan_code == new_submission.tan_code,
                                          ExerciseProgress.end_time.is_(None))
    result = await session.execute(stmt)
    exercise_progress = result.scalars().first()

    if not exercise_progress:
        raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail="Tried to submit to an inactive exercise.")

    try:
        job_msg = {
            "job_id": uuid4(),
            "exercise_id": new_submission.exercise_id,
            "tan_code": new_submission.tan_code,
            "solution_code": new_submission.solution_code
        }

        await submit_grading_job(job_msg, session, mq_channel)
        await session.commit()

        # the client can use this id to pick up the result from grading_response.<job_id>
        return str(job_msg["job_id"])

    except Exception as e:
        logging.error(e)
        await session.rollback()
        raise HTTPException(status_code=500, detail=f"Scheduling a grading job failed. {str(e)}")
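The endpoint returns only the job id; the grading result itself is expected to arrive on the per-job `grading_response.<job_id>` queue declared in `submit_grading_job`. A minimal polling sketch with py-amqp follows; the connection values mirror the defaults in `app/config.py`, and the assumption that the grading worker publishes a JSON body shaped like `GradingResult` is mine, not something this diff shows:

```python
import json
import time

import amqp  # py-amqp, the client this service already uses


def poll_grading_result(job_id: str, timeout_s: float = 30.0) -> dict:
    """Poll the per-job response queue until a result arrives or we time out."""
    # connection values mirror the defaults in app/config.py
    conn = amqp.Connection(host="localhost", userid="blocksembler", password="blocksembler")
    conn.connect()
    try:
        ch = conn.channel()
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            msg = ch.basic_get(queue=f"grading_response.{job_id}")
            if msg is not None:
                ch.basic_ack(msg.delivery_tag)
                return json.loads(msg.body)  # assumed to be GradingResult-shaped JSON
            time.sleep(0.5)
        raise TimeoutError(f"no grading result for job {job_id}")
    finally:
        conn.close()
```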
8 changes: 7 additions & 1 deletion app/config.py
@@ -1,6 +1,12 @@
import os

ORIGINS = os.environ.get("BLOCKSEMBLER_ORIGINS", "*").split(',')
BASE_URL = os.environ.get('BLOCKSEMBLER_API_BASE_URL', '')
DEBUG = os.environ.get('DEBUG', 'true').lower() == 'true'

DATABASE_URL = os.getenv("BLOCKSEMBLER_DB_URI", "postgresql+asyncpg://postgres:postgres@localhost:5432/blocksembler")

MESSAGE_QUEUE_URL = os.environ.get('BLOCKSEMBLER_MESSAGE_QUEUE_URL', 'localhost')
MESSAGE_QUEUE_USER = os.environ.get('BLOCKSEMBLER_MESSAGE_QUEUE_USER', 'blocksembler')
MESSAGE_QUEUE_PASSWORD = os.environ.get('BLOCKSEMBLER_MESSAGE_QUEUE_PASSWORD', 'blocksembler')
# coerce to int: values from the environment arrive as strings
GRADING_RESPONSE_QUEUE_TTL = int(os.environ.get('BLOCKSEMBLER_GRADING_RESPONSE_QUEUE_TTL', 1000 * 60 * 15))
2 changes: 1 addition & 1 deletion app/db/database.py
@@ -14,7 +14,7 @@ def to_dict(self):

async def get_session() -> AsyncSession:
    async with async_session_maker() as session:
        yield session  # noqa


async def create_tables():
1 change: 1 addition & 0 deletions app/db/model/__init__.py
@@ -1,3 +1,4 @@
from .exercise import Exercise
from .grading import GradingJob
from .logging_event import LoggingEvent
from .tan import Tan