Consensus works by having more than one labeler annotate the same asset. When the asset is labeled, a Consensus score is calculated to measure the agreement level between the different annotations for a given asset. This is a key measure for controlling label production quality.
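To make the idea concrete, here is a minimal sketch of an agreement score between two labelers' annotations, using intersection over union of the chosen categories. This is an illustration of the general idea only, not Kili's actual formula (for that, refer to Calculation rules for quality metrics):

```python
# Toy agreement score: |intersection| / |union| of the categories
# each labeler assigned to the same asset. Illustrative only.
def agreement(annotations_a: set[str], annotations_b: set[str]) -> float:
    if not annotations_a and not annotations_b:
        return 1.0  # both labelers annotated nothing: full agreement
    union = annotations_a | annotations_b
    return len(annotations_a & annotations_b) / len(union)

# Two labelers agree on "car" but only one added "truck": score is 0.5.
print(agreement({"car", "truck"}, {"car"}))  # 0.5
```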
Set up consensus at the beginning of your project. If you update it later, the KPIs will be updated, but only the remaining assets (with `TODO` status) will be distributed for consensus review.
The assets that have been selected by the algorithm to participate in consensus remain at the top of the stack with `ONGOING` status until they have been annotated by the defined Number of labelers. When they are labeled by the defined number of labelers, their status is updated to `LABELED`.

`LABELED` assets may still have their status updated:
- The status of a `LABELED` asset can change to `ONGOING` if consensus gets increased and the asset was already chosen for consensus.
- The status of an `ONGOING` asset can change to `LABELED` if consensus gets decreased.
For example:

- Asset "A" is chosen for consensus and consensus is set to 100% for 2 people. "A" has two labels, so the status of "A" is `LABELED`.
- The project is updated to 100% for 3 people.
- The status of "A" is now `ONGOING`.
- Asset "B" is chosen for consensus and consensus is set to 100% for 3 people. "B" has two labels, so the status of "B" is `ONGOING`.
- The project is updated to 100% for 2 people.
- The status of "B" is now `LABELED`.
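The transition rules above can be sketched as a simple function: an asset chosen for consensus is `LABELED` once it has at least the required number of labels, and `ONGOING` otherwise. The function name and signature are illustrative, not part of the Kili API:

```python
def asset_status(label_count: int, required_labelers: int) -> str:
    """Illustrative status logic for an asset chosen for consensus:
    it stays ONGOING until labeled by the required number of labelers."""
    return "LABELED" if label_count >= required_labelers else "ONGOING"

# Asset "A" has two labels; with 2 required labelers it is LABELED.
print(asset_status(2, 2))  # LABELED
# The project is updated to require 3 labelers: "A" goes back to ONGOING.
print(asset_status(2, 3))  # ONGOING
```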
For information on label statuses, refer to Asset lifecycle.
Consensus scores are available at three levels:

- At the asset level, the Kili app queries the latest submitted label (type = Default) for each labeler and then computes the agreement. This metric is available from the project Queue page.
- At the project member level, the Kili app queries all the assets for which a specific project member submitted labels (type = Default) and then computes the mean of the consensus scores of these assets. To access this metric, go to Analytics page > Labelers > "List of labelers" table.
- At the project level, the Kili app queries all the assets in a specific project and then computes the mean of the consensus scores of all the assets. To access this metric, go to Analytics page > Progress > "Quality" table.
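The relationship between these aggregation levels can be sketched as follows. The data shape and names here are assumptions for illustration, not the actual Kili implementation: the project-level score is the mean over per-asset consensus scores, and a member's score is the mean over the assets on which that member submitted labels:

```python
from statistics import mean

# Hypothetical data: per-asset consensus scores and the labelers who
# submitted a Default-type label on each asset.
assets = [
    {"consensus_score": 0.9, "labelers": {"alice", "bob"}},
    {"consensus_score": 0.6, "labelers": {"alice", "carol"}},
    {"consensus_score": 0.8, "labelers": {"bob", "carol"}},
]

# Project-level score: mean over all asset consensus scores.
project_score = mean(a["consensus_score"] for a in assets)

# Member-level score: mean over the assets a given labeler worked on.
def labeler_score(name: str) -> float:
    return mean(a["consensus_score"] for a in assets if name in a["labelers"])

print(round(project_score, 3))           # 0.767
print(round(labeler_score("alice"), 2))  # 0.75
```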
You can decide whether or not a specific labeling job is taken into account when calculating consensus by using the `isIgnoredForMetricsComputations` setting. For details, refer to Customizing the interface through json settings.
For calculation details, refer to Calculation rules for quality metrics.
For details on how to set up and deactivate consensus, refer to How to use consensus in your project.