Calibration ensures that performance ratings are consistent across managers and teams. After managers submit their reviews, HR and leadership can compare ratings side-by-side, identify outliers, and make adjustments before reviews are shared with employees.

What is calibration?

Without calibration, performance ratings often suffer from inconsistency. Different managers have different standards for what “exceeds expectations” means. Some rate everyone highly, while others are more conservative. Calibration helps correct these biases by:
  • Creating shared standards — Leadership aligns on what each rating level means
  • Reviewing evidence together — Decisions are grounded in feedback, outcomes, and context
  • Ensuring fairness — Comparing similar employees across teams to confirm consistency
  • Balancing distributions — Avoiding clusters where everyone is rated the same
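To make "balancing distributions" concrete, here is a minimal Python sketch (using made-up ratings; this is not part of the product) of how a facilitator might compare rating distributions across managers to spot leniency or severity bias:

```python
from collections import Counter

# Hypothetical submitted ratings, grouped by manager (made-up data).
ratings_by_manager = {
    "Alice": ["Exceeds", "Exceeds", "Exceeds", "Meets"],
    "Bob": ["Meets", "Meets", "Below", "Meets"],
}

def rating_distribution(ratings):
    """Return the share of each rating level in a list of ratings."""
    counts = Counter(ratings)
    total = len(ratings)
    return {level: count / total for level, count in counts.items()}

# Alice rates 75% of her reports "Exceeds"; Bob rates none that high.
# A gap like this is a signal for the calibration session to discuss.
for manager, ratings in ratings_by_manager.items():
    print(manager, rating_distribution(ratings))
```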

Creating a calibration

After managers submit their reviews, HR/admins can create a calibration:
1. Go to the cycle
Navigate to Performance Reviews and select the cycle.

2. Create calibration
Click Create calibration and configure:
  • Name — A descriptive name for this calibration session
  • Participants — Who will be involved (facilitators, committee members, reviewees)
  • Questions to calibrate — Which rating questions to review (typically overall performance rating)

3. Launch calibration
Once created, the calibration board becomes available for participants to review and adjust ratings.
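The configuration options above can be pictured as a simple data structure. The field names below mirror the options described in this section but are assumptions for illustration, not a documented API:

```python
# Hypothetical configuration for a new calibration; field names mirror the
# options described above and are assumptions, not a documented API.
calibration_config = {
    "name": "H2 Engineering Calibration",
    "participants": {
        "facilitators": ["hr-lead@example.com"],
        "committee": ["vp-eng@example.com"],
        "reviewees": ["team-engineering"],
    },
    "questions_to_calibrate": ["overall_performance_rating"],
}
```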

The Calibration Board

The Calibration Board is a visual workspace where leadership compares employees side-by-side based on their ratings.

What appears on the board

The board displays employees in a grid or table layout:
  • Grid view — Available when calibrating single-select (rating) questions. Shows employees positioned by their rating with drag-and-drop adjustment.
  • Table view — Shows all questions and answers in a table format for detailed review.
Each employee card shows:
  • Name and role
  • Current rating for each question
  • Manager name
  • Department and team
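The card fields listed above map naturally onto a small record type. A hypothetical sketch (field and example values are assumptions, not product identifiers):

```python
from dataclasses import dataclass

@dataclass
class EmployeeCard:
    """One card on the calibration board (illustrative field names)."""
    name: str
    role: str
    manager: str
    department: str
    team: str
    ratings: dict[str, str]  # question -> current rating

card = EmployeeCard(
    name="Jordan Lee",
    role="Senior Engineer",
    manager="Alice Chen",
    department="Engineering",
    team="Platform",
    ratings={"overall_performance": "Exceeds"},
)
```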

Using the Calibration Board

1. Open the board
Navigate to the calibration from the cycle page. The board is available at /performance/cycles/CYCLE_ID/calibrations/CALIBRATION_ID/editor.
2. Review the distribution
Look at how ratings are distributed. Are too many employees rated “Exceeds”? Are there outliers with high ratings but limited evidence?
3. Filter and compare
Use filters to view specific subsets:
  • By department
  • By team
  • By manager
This helps you compare similar employees and ensure consistency.
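The filter step can be sketched as a simple predicate over employee cards; the field names here are illustrative assumptions, not product identifiers:

```python
# Illustrative cards; field names are assumptions, not product identifiers.
cards = [
    {"name": "A. Patel", "department": "Engineering", "team": "Platform", "manager": "Alice"},
    {"name": "B. Osei", "department": "Engineering", "team": "Infra", "manager": "Bob"},
    {"name": "C. Silva", "department": "Sales", "team": "EMEA", "manager": "Dana"},
]

def filter_cards(cards, department=None, team=None, manager=None):
    """Return cards matching every filter that was provided (None = no filter)."""
    matched = []
    for card in cards:
        if department is not None and card["department"] != department:
            continue
        if team is not None and card["team"] != team:
            continue
        if manager is not None and card["manager"] != manager:
            continue
        matched.append(card)
    return matched
```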
4. Review employee details
Click into any employee to open their full Performance Review Packet:
  • Self review
  • Peer feedback
  • Upward reviews
  • Manager’s submitted review
  • Supporting evidence and context
This lets you verify that the rating is grounded in data.
5. Adjust ratings
If a rating needs adjustment, drag and drop the employee card to a different rating level (in grid view) or edit directly (in table view). When you adjust a rating, the system prompts for a reason. This note is audit-logged but not visible to the employee. Changes save automatically.
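The adjustment flow above (required reason, audit logging, auto-save) can be sketched as follows; the function and the log-entry shape are hypothetical illustrations, not the product's internals:

```python
import datetime

audit_log = []

def adjust_rating(card, question, new_rating, reason, actor):
    """Apply a calibration adjustment. A reason is required; the note is
    kept in the audit log and is not shown to the employee."""
    if not reason:
        raise ValueError("a reason is required for every adjustment")
    old = card["ratings"][question]
    card["ratings"][question] = new_rating
    audit_log.append({
        "employee": card["name"],
        "question": question,
        "from": old,
        "to": new_rating,
        "reason": reason,
        "by": actor,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    # In the real board, changes save automatically; here the card dict
    # is simply mutated in place.

card = {"name": "Jordan Lee", "ratings": {"overall": "Exceeds"}}
adjust_rating(card, "overall", "Meets",
              reason="Peer feedback indicates solid but not exceptional impact",
              actor="hr-lead@example.com")
```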
6. Finalize calibration
Once the team agrees on all ratings, close the calibration. Ratings sync back to employee reviews and the board locks.

What can be adjusted during calibration?

  • Rating question answers — Adjust single-select ratings (e.g., move from “Exceeds” to “Meets”)
  • Participant list — Add or remove employees from the calibration
  • Committee members — Adjust who has access to view or propose changes
Text answers and written feedback are not editable during calibration—only ratings can be changed.

After calibration closes

When calibration is finalized:
  • Updated ratings sync back to employee reviews — The rating in the manager’s review is updated to match the calibrated rating
  • The board is locked — No additional adjustments can be made
  • All changes are audit-logged — Windmill tracks what was changed, by whom, and when
  • HR proceeds to review release — Reviews are ready to share with employees
Employees only see the final approved rating and feedback once their manager shares the review. They do not have visibility into calibration discussions or adjustments.

Access and permissions

Access to the Calibration Board is role-based:
  • HR/Admin — Full access (view, adjust ratings, finalize)
  • Calibration facilitators — Access determined by the calibration creator
  • Calibration committee — Access determined by the calibration creator
  • Managers — No access unless explicitly added as participants
  • Employees — No access
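The role-based access rules above can be expressed as a simple permission map; the role keys and action names ("view", "adjust", "finalize") are assumptions for illustration:

```python
# Hypothetical permission map mirroring the roles above; the action names
# are assumptions for illustration, not product values.
PERMISSIONS = {
    "hr_admin": {"view", "adjust", "finalize"},
    "facilitator": {"view", "adjust"},  # actual access is set by the calibration creator
    "committee": {"view"},              # actual access is set by the calibration creator
    "manager": set(),                   # no access unless added as a participant
    "employee": set(),                  # no access
}

def can(role: str, action: str) -> bool:
    """Check whether a role may perform an action on the board."""
    return action in PERMISSIONS.get(role, set())
```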

Common scenarios

Scenario 1: Rating is too high for the evidence

A manager rated an employee “Exceeds Expectations,” but peer feedback is mixed and there’s limited evidence of exceptional impact. Action: Adjust the rating to “Meets Expectations” and document: “Peer feedback indicates solid performance but not exceptional. Rating adjusted to align with evidence.”

Scenario 2: Ratings vary across managers

One manager rated all direct reports “Exceeds,” while another manager rated similar performers “Meets.” Action: Compare employee details side-by-side. Adjust ratings to ensure consistency across managers and teams. Discuss with managers to align on rating standards for future cycles.

Scenario 3: High performer on a strong team

An employee is rated “Meets Expectations,” but their performance is actually strong—they’re just on a team full of exceptional performers. Action: Review the evidence. If the employee’s impact is genuinely high, adjust to “Exceeds Expectations.” Calibration should ensure high performers aren’t penalized for being on strong teams.

Best practices

  • Start with outliers — Review employees with potential rating misalignments first
  • Use evidence, not opinions — Base decisions on peer feedback, self reviews, and system signals
  • Compare similar employees — Use filters to view employees in similar roles across teams
  • Document adjustments — Always include a reason when changing ratings for transparency
  • Align on definitions — Before starting, ensure everyone agrees on what each rating level means
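The "start with outliers" practice can be sketched as a heuristic that flags employees whose manager rating diverges from their average peer score; the scale mapping and the one-level threshold are illustrative assumptions, not product behavior:

```python
# Map rating labels to an ordinal scale; the labels and the 1-point
# threshold are illustrative assumptions, not product values.
RATING_SCALE = {"Below": 1, "Meets": 2, "Exceeds": 3}

def flag_outliers(cards, threshold=1.0):
    """Flag employees whose manager rating diverges from the average
    peer score by more than `threshold` rating levels."""
    outliers = []
    for card in cards:
        gap = RATING_SCALE[card["rating"]] - card["peer_score"]
        if abs(gap) > threshold:
            outliers.append(card["name"])
    return outliers

cards = [
    {"name": "A. Patel", "rating": "Exceeds", "peer_score": 1.5},  # flagged: gap of 1.5
    {"name": "B. Osei", "rating": "Meets", "peer_score": 2.2},     # within threshold
]
```

Reviewing flagged employees first focuses the session on the cases most likely to need discussion.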