Labeling Jobs


Labeling Jobs is a powerful tool for efficiently organizing and distributing data annotation tasks within a team. It is designed to ensure that annotators work on well-defined and manageable portions of the dataset, follow consistent guidelines, and contribute to the overall success of the annotation project while maintaining data quality and accuracy. It is a critical component of effective team coordination in data annotation efforts.

Learn more about Labeling Jobs in Mastering Labeling Jobs: Your Ultimate Guide.

  • Annotation Quality Control: ✅ Includes inspections and options for re-labeling when necessary.

  • Task Distribution: Prevents overlap by assigning jobs to specific team members.

  • Consistent Guidelines: 📜 Eliminates subjective errors through clear technical requirements and restrictions.

  • Real-Time Monitoring: 🕒 Tracks progress and allows iterations to improve results.

  • Access Control: 🔒 Limits access to specific datasets as needed.


Roles in Labeling Jobs

Roles define team members’ responsibilities and permissions within the annotation process. Each member is assigned a role on the Members page, and a single user can have different roles across teams.

📋 Manager

Responsibilities: 🛠️

  • Creates and manages Labeling Jobs.

  • Provides detailed task descriptions in the Info section.

  • Monitors progress and analyzes team performance statistics.

  • Assigns tasks to Annotators and Reviewers.

Permissions: 🔑

  • Full access to manage the annotation process and review statistics.


✍️ Annotator (Labeler)

Responsibilities: 🛠️

  • Labels assigned data following predefined instructions.

  • Submits completed jobs for review.

Permissions: 🔑

  • Access limited to the Labeling Jobs page.

  • Cannot create classes, use other applications, or run Neural Networks.


🔍 Reviewer

Responsibilities: 🛠️

  • Reviews annotations and sends jobs for re-labeling if needed.

  • Can create Labeling Jobs and analyze statistics.

Permissions: 🔑

  • Access to review and finalize jobs or send them for re-annotation.
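Team roles can also be inspected programmatically. Below is a minimal sketch with the Supervisely Python SDK, assuming a hypothetical team ID and that the records returned by `api.user.get_team_members` include each member's role:

```python
import supervisely as sly

# Credentials are read from the SERVER_ADDRESS and API_TOKEN environment variables.
api = sly.Api.from_env()

team_id = 123  # hypothetical team ID

# Print every team member together with the role that governs
# what they can do in Labeling Jobs (manager, annotator, reviewer, ...).
for member in api.user.get_team_members(team_id):
    print(member.login, "-", member.role)
```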


Using Labeling Jobs: Step-By-Step Guide

Before you start

Make sure that you have:

  • a dedicated Team whose members are assigned the proper roles

  • the projects to annotate, with the relevant properties (classes/figures/tags) defined (verifiable with the SDK, as shown below)
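The second prerequisite can be double-checked programmatically: fetch the project meta and confirm that the classes and tags you plan to label are already defined. A minimal sketch (the project ID is a hypothetical placeholder):

```python
import supervisely as sly

api = sly.Api.from_env()  # reads SERVER_ADDRESS and API_TOKEN env variables

project_id = 555  # hypothetical project ID

# Download the project meta and list the defined classes and tags.
meta = sly.ProjectMeta.from_json(api.project.get_meta(project_id))
print("classes:", [obj_class.name for obj_class in meta.obj_classes])
print("tags:   ", [tag_meta.name for tag_meta in meta.tag_metas])
```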

Step 1. Create a Labeling Job

  1. Navigate to the Labeling Jobs tab. 🗂️

  2. Assign a title and provide a detailed description for Annotators.

    • Use Markdown or Labeling Guides for detailed instructions with images or videos.

  3. Assign Reviewers and Annotators.

  4. Select the data to annotate.

  5. Define available classes, tags, and any restrictions (e.g., max figures per item).

  6. Filter data by criteria like tags or item count. By default, the entire dataset is selected for annotation.

  7. Complete the setup; you will be redirected to the Labeling Jobs page, where tasks are listed for each Annotator. (A job can also be created programmatically, as shown below.)
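Below is a minimal sketch of the same setup using the SDK's `api.labeling_job.create`. All IDs are hypothetical placeholders, and the exact parameter set may differ between SDK versions:

```python
import supervisely as sly

api = sly.Api.from_env()

# Hypothetical IDs: the dataset to annotate, the assigned annotators, and a reviewer.
dataset_id = 602
annotator_ids = [111, 112]  # one job is created per assigned annotator
reviewer_id = 120

jobs = api.labeling_job.create(
    name="Cars: bounding boxes, batch 1",
    dataset_id=dataset_id,
    user_ids=annotator_ids,
    readme="Label every visible car with a tight bounding box.",
    description="Batch 1 of the cars dataset.",
    classes_to_label=["car"],
    reviewer_id=reviewer_id,
)

for job in jobs:
    print(job.id, job.assigned_to_login, job.status)
```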

Step 2. Complete the Job

Annotators should:

  1. Read the technical specifications.

  2. Use the annotation toolbox to label data with predefined classes.

  3. Confirm the task is completed and submit it for review.

  4. Once submitted, the job is marked as completed and removed from the Annotator’s list (this status change can also be tracked via the SDK, as shown below).
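An Annotator's queue and job statuses can be monitored through the SDK. A minimal sketch, assuming hypothetical IDs and that `get_list` supports filtering by assignee:

```python
import supervisely as sly

api = sly.Api.from_env()

team_id = 123       # hypothetical team ID
annotator_id = 111  # hypothetical annotator user ID

# List the jobs currently assigned to this annotator and check their statuses
# (e.g. pending, in_progress, on_review, completed).
for job in api.labeling_job.get_list(team_id, assigned_to_id=annotator_id):
    status = api.labeling_job.get_status(job.id)
    print(job.name, "->", status)
```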

Step 3. Review and Quality Control

Reviewers follow these steps:

  1. Read the technical specifications.

  2. Review annotations in the annotation toolbox. 🛠️

    • Accept (👍) or Reject (👎) the work.

  3. Finalize the Labeling Job if all data is reviewed.

    • Rejected annotations are sent for re-annotation in a new Labeling Job. 🚀


Labeling Job Statistics

Managers and Reviewers can click the Stats button to view performance statistics for each Labeling Job.
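The counters behind these statistics are also exposed on the job info object. A minimal sketch, assuming a hypothetical job ID; the exact field names may differ between SDK versions:

```python
import supervisely as sly

api = sly.Api.from_env()

job_id = 777  # hypothetical Labeling Job ID

# LabelingJobInfo carries per-job progress counters (assumed field names).
info = api.labeling_job.get_info_by_id(job_id)
print("status:   ", info.status)
print("items:    ", info.images_count)
print("finished: ", info.finished_images_count)
print("accepted: ", info.accepted_images_count)
print("rejected: ", info.rejected_images_count)
```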


Video Tutorial

Watch our concise video tutorial 🎥 to unlock the full potential of collaboration using Labeling Jobs on the Supervisely platform.


Labeling Queues

Labeling Queues is a systematic method for distributing and managing the labeling process within a team: labeling tasks are grouped into queues and distributed sequentially among annotators.

Labeling Consensus

Consensus labeling is an annotation approach where multiple annotators independently label the same set of images.

Labeling Statistics

Understanding your data is extremely important if you want to have precise annotations and consistent training data.
