Labeling Queues
Discover the power of collaborative annotation with labeling queues. Learn how to harness teamwork for efficient and accurate data annotation.
Labeling Queues is a systematic method for distributing and managing the labeling process within a team: labeling tasks are grouped into queues and distributed sequentially among annotators. The key idea is that data is not split between annotators in advance; instead, tasks are issued from a common queue (pool) on a "whoever labels first" basis until there is no more data to label. This balances the annotation workload across the labeling team regardless of the speed of individual workers.
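The "whoever labels first" distribution model can be sketched in a few lines of Python. This is a minimal illustration only, not the Supervisely API: the function name `run_labeling_queue` and the annotator names are hypothetical.

```python
import queue
import threading

def run_labeling_queue(image_ids, annotators):
    """Illustrative sketch: distribute images from one shared queue
    on a 'whoever labels first' basis until the pool is empty."""
    pool = queue.Queue()  # the common queue (pool) of tasks
    for image_id in image_ids:
        pool.put(image_id)

    completed = {name: [] for name in annotators}

    def annotate(name):
        while True:
            try:
                # Each worker pulls the next available task.
                image_id = pool.get_nowait()
            except queue.Empty:
                return  # no more data to label
            completed[name].append(image_id)  # "label" the image

    # One thread per annotator; faster workers naturally take more tasks.
    threads = [threading.Thread(target=annotate, args=(n,)) for n in annotators]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return completed
```

Because every annotator pulls from the same pool, no one sits idle while unlabeled data remains, which is exactly why the workload self-balances.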
Learn more about Labeling Queues in Labeling Queues: Streamline Your Labeling Pipeline.
Before you start
Ensure the following:
A dedicated team assigned with proper roles. Refer to our Labeling Jobs documentation for details on roles.
Necessary projects for annotation with defined classes, figures, and tags.
Suppose you want to train a model to recognize strawberries in images, and you have 10,000 images to annotate. Follow these steps to manage the task efficiently with Labeling Queues:
The Manager or Reviewer creates a Labeling Queue with defined criteria.
Assign Annotators and Reviewers.
An empty Labeling Job is created for each Labeler and Reviewer.
Once setup is complete, the team is redirected to the Labeling Queues page, where jobs are automatically distributed.
Annotators retrieve images one at a time for labeling.
After labeling, Annotators press Confirm and pull next to continue.
Faster Annotators can assist slower teammates, ensuring a smooth workflow.
The process continues until every image in the dataset is labeled.
Reviewers sequentially verify the annotations for accuracy.
Approved annotations are finalized.
Incorrect annotations are sent back to the original Annotator for correction.
Reviewed images do not return to the general queue but stay in the original Annotator’s job.
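The review steps above can also be sketched in code. This is a hedged illustration of the review logic described in the steps, not platform code: `review_annotations` and `is_correct` are hypothetical names, and the point is that rejected images return to the original Annotator's job rather than the shared queue.

```python
from collections import defaultdict

def review_annotations(labeled, is_correct):
    """Illustrative sketch: sequentially review labeled images.
    Approved annotations are finalized; rejected ones go back to the
    *original* annotator's job, never to the general queue."""
    finalized = []
    corrections = defaultdict(list)  # per-annotator rework jobs
    for image_id, annotator in labeled:
        if is_correct(image_id):
            finalized.append(image_id)       # approved: finalized
        else:
            corrections[annotator].append(image_id)  # sent back for correction
    return finalized, dict(corrections)
```

Routing corrections back to the original Annotator keeps accountability clear: each person fixes their own mistakes instead of someone else re-labeling from scratch.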
📊 Managers and Reviewers can view statistics for each Labeling Job in the queue by clicking the Stats button. This provides insights into team performance and helps optimize the annotation process.
Watch our concise video tutorial 🎥 to learn how to use Labeling Queues effectively in the Supervisely platform.