Building Your Performance Management Product: A People Ops as Product Approach
👋 This is a free edition of the MPL Build Newsletter—resources for running your people team like a product team. Become a paid member to get access to guides, templates, case studies, and exclusive AMAs.
Performance and Compensation are supposed to be best friends. Simpatico. Two peas in a proverbial pod. But let's be honest — when you’re an operator, they’re often more like oil and water.
A few years ago I wrote a three-part blog series all about the operations behind my approach to performance management and calibration, with a specific focus on my time at Whereby. I think the blogs still stand, but they definitely lack the product management backbone that I’d love to share with you today. This is the same content with a whole new approach, hopefully in a nice bite-sized reflection for you to take away and consider as you build your own People Ops products in any space, be that performance management or something else.
I've spent years watching well-intentioned HR leaders try to force these two together like mismatched puzzle pieces, and the result is usually the same: frustration, inconsistency, and a whole lot of spreadsheet-induced headaches.
The Product Problem We're Solving
As we know from previous writing on this blog, people operations professionals are, in my framework, essentially product managers building internal tools for our customers (our colleagues) in our organisations’ best interests. And like any good product, we need to start with the user story up front:
“As an HR leader, I want to bridge the gap between qualitative performance feedback and quantitative compensation decisions in a way that is fair, transparent, and scalable, so that we reward strong performance and encourage high-quality behaviours in our team, in the best interests of our business.”
This is a classic product challenge – taking messy user inputs (performance) and transforming them into structured outputs (compensation) through a well-designed system.
User Research for Your Performance Framework
At Whereby, we approached building our performance framework like any product team would:
Problem Discovery: We started with user research following something very close to Erika Hall's "Just Enough Research" methodology
As a side note, I love this quote from her: “Bad design gets out in the world not because the people working on it lack skills, but often because the decision-making process is broken. Fixing that is a team effort that has to go bottom up, top down, and all the way across.”
Market Review: We examined existing practices and competitive analysis (how other orgs handle performance)
Prototyping: We built and tested our "performance snapshot" tool
Iteration: We refined based on user feedback from managers and employees
The Performance Snapshot: Our MVP
Our solution needed three "faces" – essentially different user interfaces for different audiences:
1. Employee-Facing Interface
Simple, visual, and non-threatening. We use a two-dimensional grid:
Axis 1: Role performance and competencies
Axis 2: Strategic behaviours (Active Growth for our distributed team)
This visual representation helps employees understand where they stand without getting caught up in numbers or rankings.
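If you like to see ideas as code, here’s a minimal sketch of that grid as a data structure. The axis scales, thresholds, and quadrant labels are my illustrative stand-ins, not our actual internals:

```python
from dataclasses import dataclass

@dataclass
class Snapshot:
    role_performance: float      # Axis 1: role performance and competencies
    strategic_behaviours: float  # Axis 2: strategic behaviours (e.g. Active Growth)

def quadrant(s: Snapshot) -> str:
    """Place a snapshot in one of the grid's four quadrants.
    Thresholds and labels here are placeholders for illustration."""
    strong_role = s.role_performance >= 0
    strong_behaviours = s.strategic_behaviours >= 0
    if strong_role and strong_behaviours:
        return "strong on both axes"
    if strong_role:
        return "delivering, with behaviours to grow"
    if strong_behaviours:
        return "living the behaviours, with performance to grow"
    return "needs support on both axes"

print(quadrant(Snapshot(role_performance=2, strategic_behaviours=3)))
# -> strong on both axes
```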
2. Manager-Facing Interface
More detailed guidance, including:
Coaching suggestions based on quadrant placement
Management action items
Links to relevant documentation and resources
3. System Interface (The Backend)
The ugly but necessary backend that converts qualitative assessments into quantitative inputs:
"Persisting" becomes "2", for example
"Misfiring" becomes "1", for example
With a range of -4 to 4
This acts as our “API” in some strange way to describe it, translating human-readable performance into system-compatible data.
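To make that concrete, here’s a minimal sketch of the translation layer, assuming a simple lookup table. Only the "Persisting" → 2 and "Misfiring" → 1 mappings and the -4 to 4 range come from our system above; the rest is hypothetical:

```python
# Qualitative labels in, bounded quantitative scores out.
LABEL_SCORES = {
    "Misfiring": 1,
    "Persisting": 2,
    # ...the rest of the scale would live here, somewhere between -4 and 4
}

SCORE_MIN, SCORE_MAX = -4, 4

def to_score(label: str) -> int:
    """Translate a human-readable assessment into a system-compatible score."""
    if label not in LABEL_SCORES:
        raise ValueError(f"Unknown assessment label: {label!r}")
    score = LABEL_SCORES[label]
    if not SCORE_MIN <= score <= SCORE_MAX:
        raise ValueError(f"{label!r} maps outside the {SCORE_MIN}..{SCORE_MAX} range")
    return score

print(to_score("Persisting"))  # -> 2
```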
Design Principles for Performance Products
Behaviour-Driven Design: Build tools that guide cultural behaviours, not just measure competencies
Multi-Interface Architecture: Different stakeholders need different views of the same data
Abstraction Layers: Shield users from complexity while maintaining system integrity
Evidence-Based Inputs: Require documentation and examples, not just ratings
Measuring What Matters: The Product Metrics
When building a performance management product, focus on outcomes:
Adoption Rate: Are managers actually using the tool?
Quality of Evidence: Are performance assessments backed by concrete examples?
Distribution Fairness: Does your performance distribution make sense?
Time to Complete: Is the process efficient?
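As a sketch of what tracking these could look like in practice: the record fields below (manager_id, evidence_items, days_to_complete, score) are hypothetical stand-ins for whatever your tooling exports, not fields from any particular system:

```python
from collections import Counter
from statistics import median

def cycle_metrics(records: list[dict], all_manager_ids: set[str]) -> dict:
    """Back-of-the-envelope product metrics for one review cycle."""
    submitted_by = {r["manager_id"] for r in records}
    return {
        # Adoption: share of managers who actually submitted assessments
        "adoption_rate": len(submitted_by) / len(all_manager_ids),
        # Evidence quality: share of assessments backed by concrete examples
        "evidence_backed": sum(r["evidence_items"] > 0 for r in records) / len(records),
        # Efficiency: median days from cycle open to submission
        "median_days_to_complete": median(r["days_to_complete"] for r in records),
        # Fairness: eyeball the score distribution for odd skews
        "score_distribution": dict(Counter(r["score"] for r in records)),
    }
```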
The Calibration Feature: Ensuring Consistency
Like any product, a performance framework needs quality assurance. Calibration serves as our QA process:
The Calibration Product Spec
Objective: Establish common understanding of performance standards across teams
User Groups:
5-8 managers per session
Cross-functional groupings (one way to build these is sketched after this spec)
People Partner as facilitator
Features:
Bias-checking tools (start each session reviewing bias articles)
Evidence requirements (specific examples required)
Discussion frameworks (structured conversation guides)
Confidentiality protocols
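Here’s one way you might build those cross-functional groups of 5-8 programmatically: a sketch that deals managers out round-robin by department so no session is dominated by a single function. The (name, department) input shape and the target size default are my assumptions:

```python
import random
from itertools import cycle

def calibration_groups(managers: list[tuple[str, str]],
                       target_size: int = 6, seed: int = 42) -> list[list[str]]:
    """Split (name, department) pairs into cross-functional groups of roughly 5-8."""
    rng = random.Random(seed)
    by_dept: dict[str, list[str]] = {}
    for name, dept in managers:
        by_dept.setdefault(dept, []).append(name)
    for names in by_dept.values():
        rng.shuffle(names)  # avoid always seating the same colleagues together
    n_groups = max(1, len(managers) // target_size)
    groups: list[list[str]] = [[] for _ in range(n_groups)]
    # Deal department by department, round-robin across groups,
    # so each session mixes functions.
    slots = cycle(range(n_groups))
    for names in by_dept.values():
        for name in names:
            groups[next(slots)].append(name)
    return groups
```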
Calibration UX Design
We structure calibration sessions like user testing sessions:
Warm-up: Review bias articles (5 minutes)
Context Setting: Review materials (10 minutes)
Guidelines Review: Read golden rules aloud
Structured Exercises: Follow specific discussion patterns
Debrief: Questions and clarifications
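Even the agenda can live as data, so every facilitator runs the same script. In this sketch the first two durations come from our structure above; the rest are placeholders:

```python
CALIBRATION_AGENDA = [
    ("Warm-up: review bias articles", 5),
    ("Context setting: review materials", 10),
    ("Guidelines review: read golden rules aloud", 5),   # placeholder duration
    ("Structured exercises: discussion patterns", 30),   # placeholder duration
    ("Debrief: questions and clarifications", 10),       # placeholder duration
]

total = sum(minutes for _, minutes in CALIBRATION_AGENDA)
print(f"Session length: {total} minutes")  # -> 60 minutes
```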
Technical Architecture Considerations
As your org scales, consider:
Data Management: Google Sheets works pretty effectively until ~150-200 people, but there are some great tools on the market now; I encourage you to find one that really solves your problem statement and works with your approach to performance. I’m a huge fan of ONA-based (organisational network analysis) performance assessment at the moment, but know these tools don’t work for all approaches.
Access Controls: Who can see what data?
Audit Trails: Track changes and decisions (a minimal log sketch follows this list)
Integration Points: How does this connect to HRIS and compensation tools? Is there a path to thorough reporting in the future, particularly cross-functional or cross-process reporting (for example, recruitment and performance data viewed in connection with each other)?
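On audit trails specifically, here’s a minimal append-only log sketch; the field names are illustrative, and the point is simply “who changed what, when, and why”:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    employee_id: str
    field_changed: str
    old_value: str
    new_value: str
    changed_by: str
    reason: str
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

audit_log: list[AuditEntry] = []

def record_change(entry: AuditEntry) -> None:
    # Append only: entries are never edited or deleted,
    # so calibration decisions stay traceable after the fact.
    audit_log.append(entry)

record_change(AuditEntry("emp-42", "score", "1", "2",
                         changed_by="people-partner",
                         reason="post-calibration adjustment"))
```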
Iterating Your Performance Product
Like any product, continuous improvement is key:
Collect feedback after each cycle
A/B test different approaches (but be sure to look for consistency; we want to maintain data integrity wherever possible so we can see long-term performance and cohort trends; see the sketch after this list)
Monitor usage analytics
Regular retrospectives with stakeholders
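On that consistency point, one lightweight check is to compare score distributions between cycles before trusting a trend. Here’s a sketch using total variation distance; the sample data and the 0.15 alert threshold are illustrative:

```python
from collections import Counter

def distribution_shift(prev: list[int], curr: list[int]) -> float:
    """Total variation distance between two score distributions
    (0 = identical, 1 = completely disjoint)."""
    p, c = Counter(prev), Counter(curr)
    scores = set(p) | set(c)
    return 0.5 * sum(abs(p[s] / len(prev) - c[s] / len(curr)) for s in scores)

prev_cycle = [2, 1, 3, 2, 0, -1, 2, 1]  # illustrative scores on the -4..4 scale
curr_cycle = [3, 3, 4, 2, 1, 3, 2, 4]

if distribution_shift(prev_cycle, curr_cycle) > 0.15:
    print("Heads up: the score distribution shifted noticeably between cycles")
```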
Next Steps: From Performance to Compensation
This is the first half of a two-parter in this series! In our next blog, we'll explore how to build the compensation engine that takes these performance inputs and transforms them into fair, transparent salary decisions. We'll cover:
Forecasting models
Budget allocation algorithms
Communication interfaces
Exception handling
Remember: Great people operations is just product management applied to human systems. Build with empathy, iterate based on data, and always keep your users (employees and managers) at the center of your design decisions.
P.S. Want hands-on help bringing product principles to life on your people team? Let’s talk — send a message to build@themodernpeopleleader.com.