Kili's built-in automatic workload distribution facilitates load management, speeds up project annotation, and prevents duplicated work.
For example, when several users collaborate on an annotation project, the Kili app distributes the data to be annotated among project members so that each annotator processes unique data. The same data is never annotated twice, unless Consensus is activated.
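Kili handles this distribution internally, but the idea can be illustrated with a simple round-robin sketch. This is not Kili's actual algorithm or SDK code; the function and names below are hypothetical, chosen only to show how assets can be split so that no two labelers receive the same data.

```python
from itertools import cycle

def distribute_assets(asset_ids, labelers):
    """Illustrative sketch: assign each asset to exactly one labeler,
    round-robin, so no asset is annotated twice (Consensus disabled)."""
    queues = {labeler: [] for labeler in labelers}
    turn = cycle(labelers)
    for asset_id in asset_ids:
        queues[next(turn)].append(asset_id)
    return queues

# Each labeler receives a disjoint share of the project's assets.
queues = distribute_assets(["a1", "a2", "a3", "a4", "a5"], ["alice", "bob"])
```

With Consensus activated, the same asset would instead be routed to several labelers on purpose, so their labels can be compared.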
To help labelers and reviewers meet the labeling requirements of the project, you can set labeling instructions. Your instructions can also serve as a reference if questions arise.
Labelers who are unsure about a specific label can ask a question. Reviewers, team managers, and project admins will be notified and will have the option to answer and close the question.
The number of open questions is shown on the issue button.
For information on how to handle questions asked by labelers, refer to Handling questions and issues.
To provide feedback to labelers, reviewers can add issues.
The number of open issues is shown on the issue button and is also available from the Analytics page.
For information on how to find, add, and resolve issues, refer to Handling questions and issues.
To further increase your label quality, use the following tools:
Based on these quality management tools, you can generate specific KPIs to optimize your quality control process.
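One common quality KPI is an agreement rate between labelers on shared assets. The sketch below is a simplified illustration of that idea, not Kili's actual consensus formula; the function name and inputs are assumptions made for the example.

```python
def agreement_rate(labels_a, labels_b):
    """Illustrative KPI sketch: fraction of assets on which two labelers
    produced the same label. Assumes both lists cover the same assets,
    in the same order."""
    if not labels_a or len(labels_a) != len(labels_b):
        return 0.0
    matches = sum(1 for a, b in zip(labels_a, labels_b) if a == b)
    return matches / len(labels_a)

# Two labelers agree on 2 of 3 shared assets.
rate = agreement_rate(["cat", "dog", "cat"], ["cat", "dog", "bird"])
```

A low agreement rate on a subset of assets can flag instructions that need clarification or labelers who need extra review.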
You can access quality KPIs at 2 levels:
For more information on the review process, refer to Reviewing labeled assets.
For a list of best practices, refer to Best practices for quality workflow.