AI-Assisted Image Segmentation for Machine Learning Dataset Preparation
LazyLabel combines Meta's Segment Anything Model (SAM) with comprehensive manual annotation tools to accelerate the creation of pixel-perfect segmentation masks for computer vision applications.
pip install lazylabel-gui
lazylabel-gui

From source:
git clone https://github.com/dnzckn/LazyLabel.git
cd LazyLabel
pip install -e .
lazylabel-gui

Requirements: Python 3.10+, 8GB RAM, ~2.5GB disk space (for model weights)
- AI (SAM): Single-click segmentation with point-based refinement (SAM 1.0 & 2.1, GPU/CPU). Use negative points to subtract regions from the prediction.
- Polygon: Vertex-level drawing and editing for precise boundaries
- Box: Bounding box annotations for object detection. Hold Shift on release to erase with the box instead of adding.
- Select: Click to select existing masks for editing, reclassification, or deletion. Hold Shift+Space to erase the overlap of a drawn segment from the selected mask.
- Single View: Fine-tune individual masks with maximum precision
- Multi View: Annotate up to 2 images simultaneously, ideal for objects in similar positions with slight variations
- Sequence: Propagate a refined mask across thousands of frames using SAM 2's video predictor
- FFT filtering: Remove noise and enhance edges
- Channel thresholding: Isolate objects by color
- Border cropping: Zero out pixels outside defined regions in saved outputs
- View adjustments: Brightness, contrast, gamma correction, color saturation
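The FFT filtering step above can be illustrated with a minimal low-pass sketch in NumPy. This is a sketch of the general technique (keep low frequencies, zero out the rest), not LazyLabel's actual filter; the `keep` parameter and function name are illustrative:

```python
import numpy as np

def fft_lowpass(img, keep=0.1):
    """Suppress high-frequency noise in a 2-D image via the FFT.

    keep: fraction of the spectrum (per axis) retained around DC.
    Illustrative sketch, not LazyLabel's implementation.
    """
    f = np.fft.fftshift(np.fft.fft2(img))      # DC component moved to center
    h, w = img.shape
    cy, cx = h // 2, w // 2
    ry, rx = max(1, int(h * keep)), max(1, int(w * keep))
    mask = np.zeros(f.shape, dtype=bool)
    mask[cy - ry:cy + ry, cx - rx:cx + rx] = True   # keep only the center
    filtered = np.where(mask, f, 0)
    return np.real(np.fft.ifft2(np.fft.ifftshift(filtered)))
```

A flat image passes through unchanged, while high-frequency speckle is attenuated.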
Select one or more formats from Settings. All formats can be loaded back into LazyLabel.
import numpy as np
data = np.load('image.npz')
mask = data['mask'] # Shape: (height, width, num_classes)
# Each channel represents one class
sky = mask[:, :, 0]
boats = mask[:, :, 1]
cats = mask[:, :, 2]
dogs = mask[:, :, 3]

| Format | Output File | Description |
|---|---|---|
| YOLO Detection | image.txt | Bounding boxes: class_id cx cy w h (normalized) |
| YOLO Segmentation | image_seg.txt | Polygon vertices: class_id x1 y1 x2 y2 ... (normalized) |
| COCO JSON | image_coco.json | Per-image COCO format with polygon segmentation, bounding boxes, and area |
| Pascal VOC | image.xml | XML bounding box annotations |
| CreateML | image_createml.json | Apple CreateML JSON with center-based bounding boxes |
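The YOLO segmentation lines above are plain text and easy to consume downstream. A minimal sketch of denormalizing one line back to pixel coordinates (the function name and image size are made-up examples, not part of LazyLabel):

```python
def parse_yolo_seg(line, img_w, img_h):
    """Parse 'class_id x1 y1 x2 y2 ...' (normalized) into pixel vertices."""
    parts = line.split()
    class_id = int(parts[0])
    coords = [float(v) for v in parts[1:]]
    # Pair up alternating x/y values and scale to pixel space.
    points = [(x * img_w, y * img_h) for x, y in zip(coords[0::2], coords[1::2])]
    return class_id, points

cls, pts = parse_yolo_seg("2 0.5 0.5 0.75 0.5 0.75 0.75", img_w=640, img_h=480)
# cls == 2; pts == [(320.0, 240.0), (480.0, 240.0), (480.0, 360.0)]
```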
COCO supercategories: Set a class alias to name.supercategory (e.g. dog.animal) to populate the supercategory field in COCO JSON output.
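The name.supercategory alias convention splits on the first dot. A hypothetical helper sketching that parse (the fallback to the class name when no alias is set is an assumption of this sketch, not documented LazyLabel behavior):

```python
def split_alias(alias):
    """Split a 'name.supercategory' alias like 'dog.animal'."""
    name, _, supercategory = alias.partition(".")
    # Assumption for this sketch: without a '.supercategory' suffix,
    # fall back to the class name itself.
    return {"name": name, "supercategory": supercategory or name}

print(split_alias("dog.animal"))  # {'name': 'dog', 'supercategory': 'animal'}
```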
SAM 1.0 models are downloaded automatically on first use.
If the automatic download doesn't work, you can manually download and place the model:
SAM 1.0 requires only the model weights file; no additional package installation is needed.
- Download sam_vit_h_4b8939.pth from the SAM repository
- Place it in LazyLabel's models folder:
  - Via pip: <site-packages>/lazylabel/models/ (run python -c "import lazylabel; print(lazylabel.__path__[0])" to find it)
  - From source: src/lazylabel/models/
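The pip-install location can also be resolved programmatically. A minimal sketch, assuming a standard pip install; the models_dir helper is illustrative and not part of LazyLabel's API:

```python
import importlib.util
from pathlib import Path

def models_dir(package="lazylabel"):
    """Resolve <site-packages>/<package>/models for a pip-installed package.

    Hypothetical helper for illustration; not a LazyLabel API.
    """
    spec = importlib.util.find_spec(package)
    if spec is None or not spec.submodule_search_locations:
        raise ModuleNotFoundError(f"{package} is not installed")
    return Path(list(spec.submodule_search_locations)[0]) / "models"

# e.g. models_dir() -> .../site-packages/lazylabel/models
```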
SAM 2.1 requires both the sam2 package and the model weights file, because it relies on config files bundled with the package.
- Install SAM 2: pip install git+https://github.com/facebookresearch/sam2.git
- Download a model (e.g., sam2.1_hiera_large.pt) from the SAM 2 repository
- Place it in LazyLabel's models folder:
  - Via pip: <site-packages>/lazylabel/models/ (run python -c "import lazylabel; print(lazylabel.__path__[0])" to find it)
  - From source: src/lazylabel/models/
Select the model from the dropdown in settings.
Create a standalone Windows executable with bundled models for offline use:
Requirements:
- Windows (native, not WSL)
- Python 3.10+
- PyInstaller: pip install pyinstaller
Build steps:
git clone https://github.com/dnzckn/LazyLabel.git
cd LazyLabel
python build_system/windows/build_windows.py

The executable will be created in dist/LazyLabel/. The entire folder (~7-8GB) can be moved anywhere and runs offline.
- Usage Manual - Comprehensive feature guide
- Architecture Guide - Technical implementation details
- Changelog - Version history and release notes
- GitHub Issues - Report bugs or request features

