
Ask most security teams what they measure in their phishing simulation program and you will hear the same answer: click rate. What percentage of employees clicked the simulated phishing link. How that number has changed over time. How it compares to industry benchmarks.
Click rate is a useful metric. But it only measures one half of the behavioral picture, and arguably the less important half.
The other metric — the one that tells you whether your employees are actively participating in your organization's defense rather than simply not making a mistake — is reporting rate. How many employees who receive a suspicious email actually report it to your security team.
An organization where 5 percent of employees click on phishing emails but only 2 percent report suspicious messages is genuinely more exposed than the click rate alone suggests. The low click rate tells you employees are not being tricked. The low reporting rate tells you that when a real phishing campaign lands, your security team will have almost no early warning — no ability to identify the campaign, pull the emails, and alert employees before more of them are affected.
Building a reporting culture is not a secondary or optional part of a security awareness program. It is a core objective, and one that most programs underinvest in.
What Reporting Rate Actually Tells You
When an employee reports a suspicious email — through a reporting button, a helpdesk ticket, a forwarded message to the security team — they are doing several things simultaneously.
They are demonstrating that they noticed something was wrong. This is itself significant: it means their awareness and pattern recognition are functioning at a level where suspicious content triggers a response rather than being ignored or acted upon.
They are contributing to organizational defense. A reported phishing email gives the security team information: the campaign is active, this is the template being used, these are the employees who received it, this is the sender infrastructure. That information enables a faster organizational response — pulling remaining copies from inboxes, blocking the sending domain, alerting at-risk employees, and investigating whether anyone has already been compromised.
They are establishing a behavioral norm. Every time an employee reports a suspicious email and receives a meaningful acknowledgment, that behavior is reinforced for them and, over time, observed by colleagues. Reporting becomes part of what people at this organization do, rather than an exceptional act taken by a few particularly diligent individuals.
The difference between an organization with a 3 percent reporting rate and one with a 25 percent reporting rate is the difference between flying blind during a real phishing campaign and having a functioning early warning system. Real attacks rarely target a single employee. When a campaign lands against your organization, the question is not whether someone receives it but how quickly someone reports it — and what your team does with that report.
Why Employees Do Not Report (And It Is Not Apathy)
The most common assumption security teams make about low reporting rates is that employees do not care enough to bother. This assumption is almost always wrong, and acting on it — by making the case more urgently in training, or by emphasizing the importance of reporting more forcefully — rarely produces meaningful improvement.
The actual barriers to reporting are practical and cultural, and they respond to practical and cultural interventions.
Friction in the reporting process. If reporting a suspicious email requires opening a separate application, filling out a form, navigating to a helpdesk portal, or doing anything more complex than a single click, many employees will decide it is not worth the time. This is not laziness — it is a rational response to limited time and competing demands. The reporting mechanism needs to be as simple as clicking a button in the email client. Every additional step reduces reporting rate measurably.
Uncertainty about what qualifies as reportable. Many employees who notice something slightly off about an email do not report it because they are not sure whether their suspicion is sufficient to warrant contacting the security team. "It's probably fine" and "I don't want to waste anyone's time" are common internal rationalizations. Training that clearly establishes what to report — anything that feels suspicious, even if you are not sure — and explicitly communicates that there is no such thing as an unnecessary report removes this barrier.
Fear of looking foolish or over-cautious. In organizational cultures where efficiency is highly valued, employees may feel that reporting a suspicious email (especially if it turns out to be legitimate) will make them look paranoid or inefficient. This is a cultural barrier, and it requires a cultural response — visible organizational messaging that reporting is valued, that no report is a waste of time, and that the cost of under-reporting vastly exceeds the cost of over-reporting.
No acknowledgment or feedback. Employees who report suspicious emails and receive no response — not even an automated confirmation — quickly learn that their report went into a void. This is demotivating in a way that is hard to overstate. People stop doing things that produce no visible result. An automated acknowledgment that says "thank you, we're on it" takes negligible effort to implement and has a material effect on reporting behavior.
The belief that someone else will report it. Diffusion of responsibility is well-documented in social psychology: when people believe others are equally capable of taking action, they are less likely to take it themselves. In organizations where email campaigns reach large populations simultaneously, employees who notice something suspicious may assume that many others have already reported it. Training and communication that establish individual reporting as expected behavior — rather than a communal option — address this directly.
The Technical Foundation: Making Reporting Frictionless
The most impactful single change most organizations can make to their reporting rate is implementing a one-click report button directly in the email client.
A report phishing button that is always visible in the email interface, requires one click, and sends an automatic acknowledgment removes most of the friction that suppresses reporting. It also normalizes the behavior — a button that is always present becomes part of the visual landscape of the email client, which keeps reporting as a salient option even when an employee is not in a heightened state of suspicion.
Most major email platforms support phishing report button integration. Microsoft Outlook supports the Report Message add-in. Google Workspace supports reporting through the Gmail interface or through third-party add-ins. Most security awareness platforms offer their own reporting plugins that integrate with both.
The acknowledgment sent when an employee clicks the report button matters more than it might seem. At minimum, it should:
- Confirm that the report was received
- Thank the employee for the action
- Indicate whether the email was a simulation or, if it was real or unknown, that the security team is investigating
- If it was a simulation, provide a brief, positive reinforcement message
This acknowledgment loop — action, confirmation, feedback — is the basic behavioral reinforcement mechanism that makes reporting a habit rather than a one-time decision.
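The routing logic behind that loop is simple enough to sketch. In the snippet below, the simulation IDs and the message wording are illustrative; a real implementation would look up campaign IDs through your simulation platform's API.

```python
# Hypothetical sketch of report acknowledgment routing. The simulation
# IDs and message text are illustrative, not from any real platform.

SIMULATION_IDS = {"sim-q3-invoice", "sim-q3-password-reset"}

def build_acknowledgment(campaign_id=None):
    """Return the acknowledgment text for a reported email."""
    if campaign_id in SIMULATION_IDS:
        # Simulation: close the loop with immediate positive reinforcement.
        return ("Thank you for reporting this email. It was part of a "
                "phishing simulation, and reporting it was exactly the "
                "right response.")
    # Real or unknown report: confirm receipt and set expectations.
    return ("Thank you for your report. The security team has received it "
            "and is investigating. No further action is needed from you.")
```

Even this much is enough to guarantee that no report ever disappears into a void: every click on the button produces a confirmation within seconds.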
Training That Builds the Reporting Habit
Technical implementation makes reporting easy. Training makes it expected.
Training content around reporting should address several things that generic phishing awareness materials typically omit.
Explicit instruction on what to report. Tell employees specifically: if you receive an email and something about it makes you uncertain — even slightly uncertain — report it. You do not need to be sure it is phishing. You do not need to be able to name the specific red flag you noticed. If your instinct says something is off, that instinct is worth acting on. The security team would rather investigate fifty false positives than miss one real attack because employees self-censored their suspicions.
Explanation of why reporting helps. Employees who understand the operational value of their report are more motivated to make it. Training that explains how a reported email triggers an investigation, how that investigation can identify an active campaign, how the security team can pull additional copies from other inboxes, and how faster detection reduces breach impact — this training creates a genuinely different relationship between the employee and the reporting behavior. It is not just a rule to follow. It is a concrete contribution.
Stories and examples. Security awareness content that includes real or realistic examples of how employee reports have caught active campaigns is consistently more motivating than abstract instruction. The specific example of "an employee in finance reported an unusual vendor email, the security team identified it as part of an active campaign, pulled 47 copies from other inboxes, and blocked the attack before any payment was made" is worth more than a paragraph of general guidance.
Integration into simulation feedback. Employees who receive a simulated phishing email and correctly report it without clicking should receive positive acknowledgment that explicitly names what they did and why it matters. This reinforces reporting as the correct response — not just "not clicking" but actively engaging the defense mechanism.
Recognition and Positive Reinforcement
Reporting behavior improves when it is recognized. This does not require elaborate reward systems — it requires visible acknowledgment that the behavior is valued and noticed.
Approaches that work include:
Department-level reporting rate visibility. Sharing reporting rate data at the department level — either in team meetings, in a regular security update email, or on an internal dashboard — creates social comparison effects that motivate improvement. When a team sees that their reporting rate is below the organizational average, managers often take it on without prompting from the security team. When a team sees that they lead the organization, they feel appropriately proud of a behavior that deserves to be recognized.
Highlighting improvements in organizational communication. A brief monthly or quarterly message from the security team or executive leadership that names specific improvements — "our reporting rate increased by 40 percent this quarter, which helped us identify and contain a real phishing campaign before it affected any accounts" — creates organizational narrative around the behavior. This is exactly the kind of evidence that supports security culture measurement at the executive level. It signals that reporting is not just a technical compliance metric but a genuine contribution to organizational safety.
Security champion programs. Identifying enthusiastic reporters and security-aware employees in each department and giving them a formal role as security champions creates a distributed advocacy network. Security champions can answer colleagues' questions, reinforce the reporting habit in their immediate environment, and serve as the visible embodiment of the behavior the organization is trying to build.
Direct acknowledgment from the security team. When an employee reports an email that turns out to be a real threat — particularly if that report enables a faster response — a direct acknowledgment from the CISO or security team leader is enormously motivating. Most people have never received a message from their security team that says "your report today helped us catch something real." The ones who do receive it do not forget it.
Measuring and Improving Over Time
Reporting rate should be tracked with the same rigor as click rate. Both numbers belong in your regular security program reporting to leadership, and both should be trended over time.
The relationship between the two metrics is informative. An organization where both click rate and reporting rate are falling is in a different situation than one where both are moving in the right direction. Falling click rate with rising reporting rate is the ideal trajectory — fewer employees are being deceived, and more of those who notice suspicious activity are taking action.
The ratio of reporters to clickers within the same simulation campaign is a particularly useful metric. If 15 percent of employees click and 3 percent report, the ratio tells you something specific about the gap between recognition (some employees clearly notice something is wrong) and action (most of those employees do not complete the reporting behavior). That gap is addressable with the specific interventions above.
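As a quick illustration, all three numbers fall out of raw campaign counts. The figures below are made up to match the example above.

```python
# Illustrative per-campaign metrics; the input counts are made up.

def campaign_metrics(recipients, clicked, reported):
    """Compute click rate, reporting rate, and reporter-to-clicker ratio."""
    return {
        "click_rate": clicked / recipients,
        "reporting_rate": reported / recipients,
        # Reporters per clicker in the same campaign; higher is better.
        "report_to_click_ratio": reported / clicked if clicked else None,
    }

metrics = campaign_metrics(recipients=1000, clicked=150, reported=30)
# click_rate 0.15, reporting_rate 0.03, report_to_click_ratio 0.2
```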
Tracking these metrics by department reveals where cultural barriers are highest and where targeted intervention is most needed. A department with an above-average click rate may need different help than one that combines an above-average click rate with a low reporting rate — the latter suggests a specific cultural or friction problem that click-rate-focused training will not address.
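A minimal sketch of that per-department breakdown, assuming per-employee simulation records in a simple dict format (the field names and sample data are hypothetical):

```python
from collections import defaultdict

# Sketch of a per-department breakdown. The record format and sample
# data are hypothetical; real platforms export richer result sets.

def rates_by_department(records):
    """Aggregate click and reporting rates for each department."""
    totals = defaultdict(lambda: {"n": 0, "clicked": 0, "reported": 0})
    for rec in records:
        agg = totals[rec["dept"]]
        agg["n"] += 1
        agg["clicked"] += rec["clicked"]
        agg["reported"] += rec["reported"]
    return {
        dept: {"click_rate": agg["clicked"] / agg["n"],
               "reporting_rate": agg["reported"] / agg["n"]}
        for dept, agg in totals.items()
    }

sample = [
    {"dept": "finance", "clicked": True, "reported": False},
    {"dept": "finance", "clicked": False, "reported": True},
    {"dept": "sales", "clicked": True, "reported": False},
    {"dept": "sales", "clicked": True, "reported": False},
]
```

Here the sales rows would flag exactly the pattern discussed above: clicks with no reports, which points at a friction or culture problem rather than a recognition problem.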
The Long Game: Employees as Active Defenders
The framing that matters most for building a reporting culture is simple: employees are not just the human layer that you are trying to protect from phishing. They are also the human layer that can detect, report, and disrupt phishing campaigns that technical controls miss.
An organization where every employee understands this, believes it, and acts on it has a meaningfully different security posture than one where security is something that happens to employees rather than something they actively participate in. The click rate tells you how well your organization is doing at the first objective. The reporting rate tells you how well you are doing at the second.
Both matter. Building a reporting culture requires treating the second as seriously as the first — with appropriate tooling, explicit training, consistent recognition, and metrics that keep it visible. The organizations that do this well are not just ones where fewer employees click on phishing links. They are ones where a real phishing campaign is identified within hours rather than weeks, contained before significant damage occurs, and met with a coordinated response that every employee understood they had a role in triggering.
That is what a good security awareness program ultimately builds. Not just caution — capability. Organizations that run consistent monthly simulations and track both click rate and reporting rate develop a much richer picture of their human risk posture than those tracking only one metric.
PhishSkill tracks both click rates and reporting rates across every campaign, so you always see the full picture of how your employees are performing — not just half of it. Build the culture that makes both metrics move in the right direction.
Related Reading
For the systematic approach to reducing click rates alongside your reporting improvement, read How to Reduce Employee Phishing Click Rates.
To understand the single metric that combines both dimensions of human risk, see The Phishing Resilience Score: Why You Need a Single Metric for Human Risk or explore Security Culture Measurement: The CISO's Guide to Quantifying Human Risk.
If you encounter a real phishing attempt, follow the guidance at CISA: Report a Cyber Incident.
New to this topic? Learn the basics: What Is a Phishing Simulation?