MIP Candy is Project Neura's next-generation infrastructure framework for medical image processing. It defines a handful of common network architectures with their corresponding training, inference, and evaluation pipelines, all ready to use out of the box. It also provides integrations with popular frontend dashboards such as Notion, WandB, and TensorBoard.
We provide a flexible and extensible framework that lets medical image processing researchers prototype their ideas quickly. MIP Candy takes care of the rest, so you can focus on the key experiment design.
🔗 Home
🔗 Docs
Why MIP Candy? 🤔
Easy adaptation to fit your needs
We provide a wide range of easy-to-use training techniques that seamlessly support your customized experiments (a framework-agnostic sketch of sliding-window inference follows the list):
- Sliding window
- ROI inspection
- ROI cropping to align dataset shape (100% or 33% foreground)
- Automatic padding
- ...
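Sliding-window inference, for example, tiles a large image or volume into overlapping patches, runs the network on each patch, and stitches the predictions back together. The sketch below is a minimal, framework-agnostic 2D illustration of the idea in plain PyTorch, not MIP Candy's own implementation; the window size and stride are arbitrary, and it assumes a channel-first image at least one window large in each spatial dimension.

import torch
from torch import nn

@torch.no_grad()
def sliding_window_inference(image: torch.Tensor, model: nn.Module,
                             window: int = 128, stride: int = 96) -> torch.Tensor:
    """Predict a (C, H, W) image by tiling it into overlapping windows and averaging the overlaps."""
    _, h, w = image.shape
    # Window start positions; shift the last window so the image borders are always covered
    ys = list(range(0, h - window + 1, stride))
    xs = list(range(0, w - window + 1, stride))
    if ys[-1] != h - window:
        ys.append(h - window)
    if xs[-1] != w - window:
        xs.append(w - window)
    logits = None
    counts = torch.zeros(1, h, w, device=image.device)
    for y in ys:
        for x in xs:
            patch = image[:, y:y + window, x:x + window].unsqueeze(0)
            pred = model(patch).squeeze(0)  # (num_classes, window, window)
            if logits is None:
                logits = torch.zeros(pred.shape[0], h, w, device=image.device)
            logits[:, y:y + window, x:x + window] += pred
            counts[:, y:y + window, x:x + window] += 1
    return logits / counts  # average where windows overlap

Using a stride smaller than the window overlaps neighboring patches, which smooths prediction seams at the cost of extra compute.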
You only need to override one method to create a trainer for your network architecture.
from typing import override
from torch import nn
from mipcandy import SegmentationTrainer


class MyTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        ...
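As a hypothetical illustration (the trainer name and layer sizes below are made up, and it is assumed that example_shape is the per-sample channel-first shape), build_network can return any nn.Module you want to train:

from typing import override
from torch import nn
from mipcandy import SegmentationTrainer


class TinySegTrainer(SegmentationTrainer):
    @override
    def build_network(self, example_shape: tuple[int, ...]) -> nn.Module:
        # A deliberately small two-class segmentation head, just to show the pattern
        return nn.Sequential(
            nn.Conv2d(example_shape[0], 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 2, kernel_size=1),
        )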
Support of various frontend platforms for remote monitoring
MIP Candy supports Notion, WandB, and TensorBoard.
Note that MIP Candy requires Python >= 3.12.
pip install "mipcandy[standard]"
Below is a simple example of nnU-Net-style training. The batch size is set to 1 because the samples in the dataset vary in shape, although you can use a ROIDataset to align the shapes.
from typing import override
import torch
from mipcandy_bundles.unet import UNetTrainer
from torch.utils.data import DataLoader
from mipcandy import download_dataset, NNUNetDataset


class PH2(NNUNetDataset):
    @override
    def load(self, idx: int) -> tuple[torch.Tensor, torch.Tensor]:
        image, label = super().load(idx)
        # Drop the leading singleton dimension and move the channel axis to the front: (C, H, W)
        return image.squeeze(0).permute(2, 0, 1), label


download_dataset("nnunet_datasets/PH2", "tutorial/datasets/PH2")
# fold() yields a training subset and a validation subset
dataset, val_dataset = PH2("tutorial/datasets/PH2", device="cuda").fold()
dataloader = DataLoader(dataset, 1, shuffle=True)
val_dataloader = DataLoader(val_dataset, 1, shuffle=False)
trainer = UNetTrainer("tutorial", dataloader, val_dataloader, device="cuda")
trainer.train(1000, note="a nnU-Net style example")
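To make the reshape in PH2.load concrete: assuming the raw samples load as (1, H, W, C) channel-last tensors (the layout and the sizes below are illustrative assumptions, not taken from the PH2 files), dropping the singleton dimension and permuting the axes gives the (C, H, W) layout that 2D networks expect.

import torch

raw = torch.zeros(1, 560, 768, 3)       # hypothetical raw sample: (1, H, W, C)
chw = raw.squeeze(0).permute(2, 0, 1)   # drop the singleton dim, channels first
print(chw.shape)                        # torch.Size([3, 560, 768])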


