# Brain Visualizer

A full‑stack application for visualizing synthetic brain tumor data. This project includes:
- Database: PostgreSQL (Docker Compose)
- Backend: Flask + SQLAlchemy + Alembic migrations
- Frontend: Next.js (React) UI
- Data Generators: Python scripts to populate sample data (`sample_data`) and generate synthetic NIfTI files (`.nii.gz`) with tumor “hotspots”
## TODO
- Migrate codebase to run on Docker for more robust development
  - Change file structure
  - Create Postgres image + table
  - Migrate filestore to bind mounts
- Implement filtered pycortex render logic
  - Update render logic to use filestore
  - Create filter aggregation + database query scripts
- Implement data visualizations from clicked location
- Pivot from NIfTI filestore to database storage
  - Find a solution for storing voxel data in entries efficiently (allows for faster location pinpointing)
  - Modify filter aggregation scripts to use this format
  - Create an additional table (or modify the current table) to hold extra data for graph data visualizations
- Add support for more visualization types
## Prerequisites

- Docker & Docker Compose
- Python 3.8+ (for local scripts)
- A Unix‑style terminal (bash, zsh, etc.)
## Installation

```bash
git clone <repository-url> brain-visualizer
cd brain-visualizer
```

If you plan on accessing the database or running the application from your local machine, install the necessary requirements:

```bash
cd backend
pip install -r requirements.txt
```

and

```bash
cd frontend
npm install
```

We also recommend configuring a virtual environment of your choice in the `backend` folder. This project was developed using `venv`:

```bash
cd backend
python3 -m venv <venv_name>
```

to create it, and

```bash
source <venv_name>/bin/activate
```

to activate it (on Mac).
## Environment Variables

Create `.env` files in both the `frontend` and `backend` folders, including the following:

Backend:

```
DATABASE_URL=postgresql://myuser:mypassword@db:5432/brain_dev
```

Note: replace `db` with `localhost` if you plan on accessing the database from your own terminal. The URL for the Docker environment is configured in `docker-compose.yml`.

Frontend:

```
NEXT_PUBLIC_API_URL=http://localhost:5001
```
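As a minimal sketch of how the backend might consume this variable (an assumption for illustration: `get_database_url` is a hypothetical helper, not a function from this codebase, and the backend may read its config differently):

```python
import os

def get_database_url() -> str:
    """Hypothetical helper: resolve the SQLAlchemy connection string.

    Inside Docker Compose the host is the service name "db"; when
    running from your own terminal, set DATABASE_URL with "localhost"
    instead (see the note above).
    """
    return os.environ.get(
        "DATABASE_URL",
        "postgresql://myuser:mypassword@db:5432/brain_dev",
    )
```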
## Running with Docker Compose

The project uses Docker Compose to orchestrate:
- db: PostgreSQL database
- backend: Flask API
- frontend: Next.js application
Note: Ensure you’re in the project root, where `docker-compose.yml` lives.
```bash
docker-compose up -d db
```

This spins up the `db` service with default credentials:

```
POSTGRES_USER: myuser
POSTGRES_PASSWORD: mypassword
POSTGRES_DB: brain_dev
```

Apply Alembic migrations to create the schema in `brain_dev`:

```bash
docker-compose exec db bash -c "alembic upgrade head"
```

(This will populate the database with the necessary table and rows.)
Generate and insert data into the `sample_data` table:

```bash
cd backend
python3 -m file_loading.generate_sample_data
```

Create a local filestore of `.nii.gz` volumes in `filestore/test_db_nifti/`:

```bash
cd backend
python3 -m file_loading.generate_sample_nifti
```

Each file will be named `<uuid>.nii.gz`, matching entries in `sample_data`.
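The synthetic volumes follow the general idea sketched below (assumptions for illustration: NumPy arrays and a Gaussian “hotspot”; the real generator additionally wraps each array in a NIfTI image and writes it to `filestore/test_db_nifti/<uuid>.nii.gz`, e.g. with a library such as nibabel):

```python
import numpy as np

def make_hotspot_volume(shape=(64, 64, 64), center=(32, 40, 20), sigma=4.0):
    """Build a 3-D volume with a single Gaussian 'tumor hotspot'.

    Hypothetical stand-in for file_loading.generate_sample_nifti:
    the real script also converts the array to a .nii.gz file named
    after the matching sample_data UUID.
    """
    zz, yy, xx = np.meshgrid(
        np.arange(shape[0]), np.arange(shape[1]), np.arange(shape[2]),
        indexing="ij",
    )
    # Squared distance of every voxel from the hotspot center.
    dist_sq = (
        (zz - center[0]) ** 2 + (yy - center[1]) ** 2 + (xx - center[2]) ** 2
    )
    # Intensity peaks at 1.0 at the center and falls off smoothly.
    return np.exp(-dist_sq / (2 * sigma ** 2)).astype(np.float32)
```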
Once the database is prepared and data is in place, bring up all services:

```bash
docker-compose up --build -d
```

- Backend API: http://localhost:5001
- Frontend UI: http://localhost:3000
## Scripts

| Script | Description |
|---|---|
| `generate_sample_data.py` | Inserts random records into the `sample_data` table. |
| `generate_sample_nifti.py` | Builds synthetic `.nii.gz` volumes for each `sample_data.id`. |
| `alembic upgrade head` (via Docker) | Applies DB migrations. |
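As a rough sketch of the kind of record `generate_sample_data.py` produces (the column names below are illustrative assumptions, not the real schema, and the real script inserts through SQLAlchemy rather than returning dicts):

```python
import random
import uuid

def make_sample_record() -> dict:
    # Illustrative record only -- the actual columns are defined by the
    # project's SQLAlchemy models. The id doubles as the NIfTI filename
    # stem (<uuid>.nii.gz) in filestore/test_db_nifti/.
    return {
        "id": str(uuid.uuid4()),
        "age": random.randint(18, 90),                      # assumed field
        "tumor_volume_mm3": round(random.uniform(100.0, 5000.0), 1),  # assumed field
    }
```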
## Troubleshooting

- `DATABASE_URL` missing: ensure `.env` files are populated or export the variable manually.
- Migrations fail: verify the `migrations/` folder and `alembic.ini` are present, then rerun.
- Scripts can’t connect: confirm the `db` service is running and credentials match.
- Permission errors writing to `filestore/`: `mkdir -p filestore/test_db_nifti && chmod u+w ...`