Calibration ensures that performance ratings are consistent across managers and teams. After managers submit their reviews, cycle admins and calibration committees can compare ratings side-by-side, identify outliers, and make adjustments before reviews are shared with employees.
The Calibration Board is desktop-only. Open the Windmill Dashboard on a desktop browser to calibrate ratings.
What is calibration?
Without calibration, performance ratings often suffer from inconsistency. Different managers have different standards for what “exceeds expectations” means. Some rate everyone highly, while others are more conservative. Calibration helps correct these biases by:
- Creating shared standards — Leadership aligns on what each rating level means
- Reviewing evidence together — Decisions are grounded in feedback, outcomes, and context
- Ensuring fairness — Comparing similar employees across teams to confirm consistency
- Balancing distributions — Avoiding clusters where everyone is rated the same
Creating a calibration
After managers submit their reviews, cycle admins can create a calibration:
Go to the cycle
Navigate to Performance Reviews and select the cycle.
Create calibration
Click Create calibration and configure:
- Name — A descriptive name for this calibration session
- Participants — Who will be involved (facilitators, committee members, reviewees)
- Questions to calibrate — Which rating questions to review (typically the overall performance rating)
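For readers scripting against this flow, the dialog options above can be pictured as a single payload. This is an illustrative sketch only: the function, field names, and addresses are assumptions, not a documented Windmill API.

```python
# Hypothetical sketch of the options in the Create calibration dialog.
# Field names mirror the UI labels; any real API may differ.

def build_calibration_payload(name, facilitators, committee, reviewees, questions):
    """Assemble a calibration-creation request from the dialog's options."""
    if not facilitators:
        raise ValueError("at least one facilitator is required")
    return {
        "name": name,                          # descriptive session name
        "participants": {
            "facilitators": facilitators,      # drive the session
            "committee_members": committee,    # discuss and propose changes
            "reviewees": reviewees,            # employees being calibrated
        },
        "questions_to_calibrate": questions,   # typically the overall rating
    }

payload = build_calibration_payload(
    name="H2 Engineering Calibration",
    facilitators=["vp.engineering@example.com"],
    committee=["dir.platform@example.com"],
    reviewees=["alice@example.com", "bob@example.com"],
    questions=["overall_performance_rating"],
)
print(payload["name"])
```

The point of the sketch is simply that name, participants (by role), and the questions to calibrate are the three inputs a calibration needs.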
The Calibration Board
The Calibration Board is a visual workspace where leadership compares employees side-by-side based on their ratings.
What appears on the board
The board displays employees in a grid or table layout:
- Grid view — Available when calibrating single-select (rating) questions. Shows employees positioned by their rating, with drag-and-drop adjustment.
- Table view — Shows all questions and answers in a table format for detailed review.
Each employee entry shows:
- Name and role
- Current rating for each question
- Manager name
- Department and team
Using the Calibration Board
Review the distribution
Look at how ratings are distributed. Are too many employees rated “Exceeds”? Are there outliers with high ratings but limited evidence?
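As a rough illustration of this distribution check, the sketch below tallies ratings and flags any level holding more than half the group. The ratings list and the 50% threshold are made up for the example; pick a threshold that fits your organization.

```python
from collections import Counter

# Illustrative data: ratings as they might appear on the board.
ratings = ["Exceeds", "Meets", "Exceeds", "Exceeds", "Meets", "Exceeds", "Below"]

def flag_clusters(ratings, max_share=0.5):
    """Return rating levels whose share of employees exceeds max_share."""
    counts = Counter(ratings)
    total = len(ratings)
    return {level: n / total for level, n in counts.items() if n / total > max_share}

# "Exceeds" holds 4 of 7 ratings (about 57%), above the 50% cutoff.
print(flag_clusters(ratings))
```

A flagged cluster is a starting point for discussion, not proof of inflation; the next step is to open the underlying packets and check the evidence.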
Filter and compare
Use filters to view specific subsets:
- By department
- By team
- By manager
Review employee details
Click into any employee to open their full Performance Review Packet:
- Self review
- Peer feedback
- Upward reviews
- Manager’s submitted review
- Supporting evidence and context
Adjust ratings
If a rating needs adjustment, drag and drop the employee card to a different rating level (in grid view) or edit the rating directly (in table view).
When you adjust a rating, the system prompts for a reason. This note is audit-logged but not visible to the employee. Changes save automatically.
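The adjustment flow implies a record of who changed what and why. The sketch below shows one plausible shape for such an audit-log entry; the field names are assumptions, not Windmill's actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical shape of an audit-log entry for a rating adjustment.
@dataclass
class RatingAdjustment:
    employee: str
    question: str
    old_rating: str
    new_rating: str
    reason: str        # the required note; audit-logged, not shown to the employee
    adjusted_by: str   # facilitator or committee member making the change
    adjusted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

entry = RatingAdjustment(
    employee="alice@example.com",
    question="overall_performance_rating",
    old_rating="Exceeds Expectations",
    new_rating="Meets Expectations",
    reason="Peer feedback indicates solid but not exceptional performance.",
    adjusted_by="vp.engineering@example.com",
)
print(entry.new_rating)
```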
What can be adjusted during calibration?
- Rating question answers — Adjust single-select ratings (e.g., move from “Exceeds” to “Meets”)
- Participant list — Add or remove employees from the calibration
- Committee members — Adjust who has access to view or propose changes
After calibration closes
When calibration is finalized:
- Updated ratings sync back to employee reviews — The rating in the manager’s review is updated to match the calibrated rating
- The board is locked — No additional adjustments can be made
- All changes are audit-logged — Windmill tracks what was changed, by whom, and when
- Cycle admins proceed to sharing — Reviews are ready to share with employees
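The sync step above can be pictured as the calibrated rating overwriting the manager's original one, with each change recorded. This is a simplified sketch under assumed data shapes, not Windmill's actual implementation.

```python
# Simplified sketch: calibrated ratings overwrite the manager's original
# rating in each review; every change is returned for the audit log.
def sync_calibrated_ratings(reviews, calibrated):
    changes = []
    for employee, new_rating in calibrated.items():
        old = reviews[employee]["rating"]
        if old != new_rating:
            reviews[employee]["rating"] = new_rating
            changes.append((employee, old, new_rating))
    return changes

reviews = {
    "alice": {"rating": "Exceeds"},
    "bob": {"rating": "Meets"},
}
# Alice was calibrated down; Bob's rating was confirmed unchanged.
changes = sync_calibrated_ratings(reviews, {"alice": "Meets", "bob": "Meets"})
print(changes)
```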
Access and permissions
Calibration uses two session-level roles, both separate from cycle admin:
- Facilitators drive the session. Only facilitators can start the session, submit it, cancel it, or regenerate the pre-read report.
- Committee members participate in discussion and view packets, but cannot drive the session.
| Action | Who can do it |
|---|---|
| Create a calibration | Cycle admins |
| Start, submit, cancel, regenerate pre-read report | Facilitators only |
| View packets and propose adjustments | Facilitators and committee members |
| Be reviewed in a calibration | Anyone in the cycle |
- The creator must be a cycle admin and must add themselves as a facilitator.
- The creator cannot also be a reviewee in the same calibration.
- A reviewee cannot also be a facilitator or committee member of the same calibration.
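The three constraints above amount to a simple validation. The sketch below expresses them as one check; the function and identifiers are illustrative, not part of any documented interface.

```python
def validate_calibration_roles(creator, cycle_admins, facilitators,
                               committee_members, reviewees):
    """Enforce the calibration role rules. Raises ValueError on violation."""
    # The creator must be a cycle admin and add themselves as a facilitator.
    if creator not in cycle_admins:
        raise ValueError("creator must be a cycle admin")
    if creator not in facilitators:
        raise ValueError("creator must add themselves as a facilitator")
    # The creator cannot also be a reviewee in the same calibration.
    if creator in reviewees:
        raise ValueError("creator cannot be a reviewee in this calibration")
    # A reviewee cannot also be a facilitator or committee member.
    overlap = set(reviewees) & (set(facilitators) | set(committee_members))
    if overlap:
        raise ValueError(f"reviewees cannot also run the session: {sorted(overlap)}")

# A valid setup passes silently; an invalid one raises ValueError.
validate_calibration_roles(
    creator="hr.admin",
    cycle_admins={"hr.admin"},
    facilitators={"hr.admin", "vp.eng"},
    committee_members={"dir.platform"},
    reviewees={"alice", "bob"},
)
print("ok")
```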
Common scenarios
Scenario 1: Rating is too high for the evidence
A manager rated an employee “Exceeds Expectations,” but peer feedback is mixed and there’s limited evidence of exceptional impact. Action: Adjust the rating to “Meets Expectations” and document the reason: “Peer feedback indicates solid performance but not exceptional. Rating adjusted to align with evidence.”
Scenario 2: Ratings vary across managers
One manager rated all direct reports “Exceeds,” while another rated similar performers “Meets.” Action: Compare employee details side-by-side and adjust ratings to ensure consistency across managers and teams. Discuss with the managers to align on rating standards for future cycles.
Scenario 3: High performer on a strong team
An employee is rated “Meets Expectations,” but their performance is genuinely strong; they’re simply on a team full of exceptional performers. Action: Review the evidence. If the employee’s impact is genuinely high, adjust to “Exceeds Expectations.” Calibration should ensure high performers aren’t penalized for being on strong teams.
Best practices
- Start with outliers — Review employees with potential rating misalignments first
- Use evidence, not opinions — Base decisions on peer feedback, self reviews, and system signals
- Compare similar employees — Use filters to view employees in similar roles across teams
- Document adjustments — Always include a reason when changing ratings for transparency
- Align on definitions — Before starting, ensure everyone agrees on what each rating level means
Related reading
- Performance Reviews — Overview of the complete review process
- Manager reviews — How managers draft reviews and assign ratings
- Sharing reviews — Share finalized packets after calibration
- Progress tracking — Monitor review completion before calibration
- Cycles — Set up calibration as part of your cycle