Phishing attacks are on the rise, and no organization is safe. In research conducted for our 2023 State of the Phish report, Proofpoint found that a staggering 84% of organizations across 15 countries faced at least one successful phishing attack in 2022. And more than half of these organizations (54%) faced three or more attacks.
Our research also revealed that one-third of working adults are unable to define the term “phishing.” This is a crucial insight because every gap in end users’ cybersecurity knowledge puts organizations at risk. These gaps also challenge security awareness practitioners to find better ways of engaging users, improving their knowledge and promoting safer behavior.
In our recent Lessons Learned webinar, Proofpoint explored insights from our 2023 State of the Phish report. The webinar featured Karina Edwards, a senior information security analyst in the financial services industry, and Steve McGrath, an information security analyst with a multinational commercial property insurance firm. They shared their thoughts and best practices about a wide range of security awareness topics.
During the webinar, attendees submitted questions to be answered later. The following is a summary of our panelists' responses, which have been lightly edited for length.
Topic: Security awareness
How did you set up your working group, and what is the makeup of your team?
Karina: Our group is made up of information security officers for their respective business units (BUs). There is at least one representative from each BU. The group was originally formed out of a need to have the individual federated BUs come together to share ideas and best practices for any information security topics. That was years ago, and it has organically morphed into the Information Security Awareness and Training (ISAT) working group we have today.
Steve: We are in the process of implementing a “Cyber Champions” group that will essentially be an extension of our team. It will be made up of folks in our lines of business who other people would feel comfortable reaching out to with questions. These individuals will also help to get our messaging out.
What is the size of your team and the roles of team members?
Karina: Our global team has two members, and we share all the tasks in supporting the global ISAT program. We create global awareness campaigns, conduct global phishing exercises and work with vendors to create and curate content.
We also partner with departments such as Global Communications, Human Resources (HR), Global Privacy and Data Management, and the Security Operations Center (SOC) to create effective programs and content and distribute it widely throughout the organization. Additionally, we work with stakeholders from each BU to improve the organization’s data security and mitigate human risk through effective education.
Steve: Our team is also two people, although I am the only person working full-time in security awareness and education. In my role, I plan and run monthly phishing exercises. That work includes creating custom teachable moments and performing all follow-up tasks regarding “clickers.” We post to our internal Yammer group at least three times per week—Catch of the Day, Tip Tuesday and Fun Friday—and whenever something pressing comes up. We run training for internal audiences and quarterly “escape room”-style training. We also typically speak to two to three groups per quarter.
Topic: Phishing simulations
How do you avoid “phish fatigue”? And what is considered “over-phishing”?
Steve: We typically engage with larger and smaller groups every other month, so the follow-up isn't too arduous. We are aware that some people fall into more than one target group; for instance, someone in HR might also be an administrative assistant or a privileged user. So, we spread out these exercises.
I have heard fewer than five people comment or complain about the volume of exercises. People understand that this is the number one attack vector, and we’re trying to keep them attentive. I think “over-phishing” would be targeting the same people four months in a row. We try to break things up so that most people are phished one or two times per quarter.
Karina: We haven’t experienced phish fatigue in our organization yet. In almost four years of sending out phishing campaigns, we have not received a single response suggesting that we send too many phish. Between general population phish and targeted campaigns, on average each employee receives about six phishing emails per year. So, I imagine we would hear about it if we were sending them more often.
Do you ever announce phishing prior to sending the phish email?
Steve: No. However, we have discussed doing this. Our people know that we phish every month, so they should always be attentive. I guess in a way that is giving them notice, but we don’t publicize it.
Is there a specific time of the day in which phishing is more successful?
Steve: We randomize when our phishing messages are sent. Usually, it’s across one to three days, and sometimes it spans a weekend. I think people are less attentive when they are reading a lot of email in their mailbox, like on Monday morning or following a long weekend. I have noticed quite a few people fail after hours and on mobile devices when they are multitasking or trying to catch up after a busy day.
Karina: It depends on the phishing template we’re using. For example, if we’re simulating a phish from a streaming service provider, we send it close to the weekend. If we’re simulating HR sending a phish, we send it on a weekday and typically first thing in the morning. We also try to avoid sending phish during our help desk’s busiest times to avoid them possibly getting inundated with calls.
What objectives and key results (OKRs) do you share when reporting goals and metrics to leadership?
Steve: We regularly report click rate versus published click rate (the percentage shown in Proofpoint templates). We also review the previous six-month period to see if we’re trending in a positive direction. And we show the reporting percentages (using the Report Phish button in Proofpoint PhishAlarm) for both phishing exercises and non-exercise emails.
Additionally, we show the number of repeat clickers from quarter to quarter. We saw that number decrease after we implemented formal follow-up processes that included sending summarized reports to key executives—not sharing names, just numbers.
Karina: We share the average reporting rate and average click rate. We also share the resilience ratio (average reporting rate divided by average click rate) and how it compares to the financial industry average year over year. Also, we share the percentage of employees who take the training and awareness campaigns that we roll out.
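The resilience ratio Karina mentions is simple arithmetic, and a short sketch makes the calculation concrete. The percentages below are hypothetical examples for illustration only, not figures from either panelist's program.

```python
def resilience_ratio(avg_reporting_rate: float, avg_click_rate: float) -> float:
    """Resilience ratio: average reporting rate divided by average click rate.

    A ratio above 1.0 means users report simulated phish more often than
    they click them; higher is better.
    """
    if avg_click_rate <= 0:
        raise ValueError("average click rate must be positive")
    return avg_reporting_rate / avg_click_rate

# Hypothetical example: a 60% reporting rate against a 10% click rate.
print(resilience_ratio(60.0, 10.0))  # prints 6.0
```

Because it is a ratio, the metric can improve either by raising the reporting rate or by lowering the click rate, which is why tracking it year over year against an industry average gives a fuller picture than click rate alone.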
Topic: Security culture
How do you deal with executives or other people unwilling to take training?
Karina: We educate them on the potential consequences by providing real-life examples of what can happen if they fall victim to a phish or a hack, such as damage to the company’s reputation, financial losses or loss of customer trust.
Steve: There are always executives or managers who don’t complete training in the allotted time frames. We start with friendly reminders and, if needed, escalate to higher managers so the reminder comes from the top down. We have, on occasion, provided shorter personal (live, instructor-led) training to cover important topics with folks who prefer that over taking competency-based education (CBE) training.
We don’t have a social media channel to share stories and highlights. What other methods would you suggest?
Karina: I suggest Microsoft Teams chats and posts in Communities (part of the Microsoft Office 365 intranet suite). Also, old-fashioned newsletters, posters and “chats around the watercooler” work. Finally, nothing beats face-to-face discussions. So, we engage managers to have security-related discussions in stand-up, team and department meetings. We often provide them with content and talking points to start the conversation.
Steve: I concur with Karina’s suggestions. We post corporate communications regularly, probably once or twice a month, on current threats or points of interest. We also post our recognition award quarterly.
How do you dispel the myth that IT/IS have infallible secure email gateways (SEGs) and anti-malware?
Steve: We make it clear that if an email gets to a user’s mailbox, it has gone through all of our technical controls to block or prevent the bad actors. We share real numbers from the Proofpoint PhishAlarm analyzer about the number of reported emails that ended up being malicious or suspicious, and what those messages could have done if someone didn’t report them. We tell people they truly are the last line of defense.
Karina: We partner with our SOC staff who monitor the SEGs and other system endpoints. They provide real-life examples of cyber attacks that made it through our SEGs. We also create awareness and training materials for our employees using that information.
Topic: Discipline model
How do you balance educating your colleagues with any fear they may have about reporting emails?
Karina: We use short bits of training that are curated to the topic at hand. For example, we created awareness materials that teach employees how to tell a phish from spam and legitimate email, when and how to report, and what happens after they report. This training takes less than five minutes.
Steve: We are careful to communicate that people are not in trouble if they click on an exercise. We explain that it means they are more likely to fall for a real phishing email and expose the organization to risk. I think they are receptive to internalizing the importance of being careful and reporting suspicious email and activities.
Everything we share reminds people to use the reporting button, including sharing the reporting numbers in the hope that managers will encourage their teams to report. We have discussed having a leaderboard that shows line-of-business reporting numbers, rather than click rates, to send a more positive message.
Did you have to push back against management pressure to use a consequence model?
Steve: Our leadership had detailed discussions about how we would follow up. They did not want our process to be punitive and specifically told managers not to put anything in an individual’s review or performance objectives if they clicked or were a repeat clicker.
We want people to be our partners in keeping the organization safe. We always tell folks they are not in trouble, but they need to be more alert and get better at recognizing potentially risky email and responding appropriately (reporting it, deleting it or avoiding it).
Karina: We are very fortunate because our management prefers praise over a consequence model, so we haven’t had this discussion. My suggestion is to bring your management some real-life success stories from programs that use the positive reinforcement model.
Have you implemented a repeat offender program? If so, what does the program look like and how was it received?
Karina: We stick to the “no punishment” method of training. We praise employees when they perform a “secure” action, like reporting a phish. And we promote that behavior as much as possible. We do assign a short training to those who clicked on two or more consecutive phish—but that’s pretty much the extent of a consequence.
Steve: We do have a process for repeat offenders.

First click: The person receives an email with a teachable moment and a reminder to be attentive. We also include what happens if they click again within a six-month period, or three times within nine months.

Second click (within six months):
- Assigned two CBE training modules (about 15 minutes of training)
- The direct manager is notified and sent speaking points to review with the person who clicked
- Access to personal webmail blocked from corporate devices for six months
- The appropriate senior vice president/executive vice president (SVP/EVP) receives a summary report
- Reminder of follow-up actions if there’s a third click within nine months

Third click (within nine months):
- Assigned instructor-led training (about 30-40 minutes of training)
- Direct manager is notified
- Access to personal webmail blocked from corporate devices for an additional six months
- The appropriate SVP/EVP receives a summary report
Multiple people have invited me to speak to their larger team following the instructor-led training. Most people understand the importance of cybersecurity and really are trying to be better.
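Steve's escalation tiers boil down to counting clicks within time windows. The sketch below is one possible reading of that logic, assuming the six- and nine-month windows are measured between clicks as described; the function name, window lengths in days, and return values are illustrative, not his actual tooling.

```python
from datetime import date, timedelta

# Approximate windows from the process described above (illustrative only).
SECOND_CLICK_WINDOW = timedelta(days=182)  # roughly six months
THIRD_CLICK_WINDOW = timedelta(days=274)   # roughly nine months

def follow_up_tier(click_dates: list[date]) -> int:
    """Return the escalation tier (1-3) for a user's simulation clicks.

    Tier 1: first click -> teachable-moment email.
    Tier 2: second click within ~six months of the previous one.
    Tier 3: three clicks within a ~nine-month span.
    """
    clicks = sorted(click_dates)
    if len(clicks) >= 3 and clicks[-1] - clicks[-3] <= THIRD_CLICK_WINDOW:
        return 3
    if len(clicks) >= 2 and clicks[-1] - clicks[-2] <= SECOND_CLICK_WINDOW:
        return 2
    return 1
```

The key design point is that clicks age out: someone who clicked once and then stayed clean for a year starts back at tier 1, which matches the non-punitive tone both panelists describe.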
To get more insights, watch the full Lessons Learned webinar here.
About the panelists
Karina Edwards is a Sr. Information Security Analyst at an international financial services group. In her role, she is responsible for the development and execution of a global information security awareness and training program. Together with another team member, she promotes greater security of the company’s data, client information and associated assets through continuous education using a behavior change model to reduce the company’s inherent “human” risk.
Karina has been in her current role since 2019. Before that, she worked for large organizations in various industries such as telecom, healthcare and aerospace, focusing primarily on IT operations. Karina holds a Master of Science in Systems Management from the University of Southern California and a Security Awareness Professional certification from the SANS Institute.
Steve McGrath is an Information Security Analyst with a multinational commercial property insurance company. He supports a global cybersecurity training, education and awareness program, as well as security and risk reporting. Steve directly supports the organization’s commitment to promoting a security-conscious culture, working to deliver the right message, to the right people, at the right time. His team is responsible for researching, creating and communicating a wide variety of interesting and applicable cybersecurity content to keep employees engaged.
Steve is a frequent presenter and educator to internal audiences, helping people understand how small changes can make a big impact. He is responsible for running monthly phishing simulations and follow-up activities to reinforce desired behaviors. Steve has been working in IT for 30 years, with the past 10 years dedicated to information security and security awareness.