Incident Report Summary: Insider Threat
First of all: No illegal access was gained, and no data was lost, compromised, or exfiltrated on any KnowBe4 systems. This is not a data breach notification; there was no breach. Consider it an organizational learning moment that I am sharing with you. If it can happen to us, it can happen to almost anyone. Don't let it happen to you. We wrote an FAQ answering questions from customers. Story updated 7/25/2024.
TL;DR: KnowBe4 needed a software engineer for our internal IT AI team. We posted the job, received resumes, conducted interviews, performed background checks, verified references, and hired the person. We sent them their Mac workstation, and the moment it was received, it started to load malware.
Our HR team conducted four video-conference-based interviews on separate occasions, confirming the individual matched the photo provided on their application. This was a real person using a valid but stolen US-based identity, so the background check and all other standard pre-hiring checks came back clear. The picture was AI "enhanced".
Our EDR software detected the malware and alerted our InfoSec Security Operations Center. The SOC called the new hire and asked if they could help. That's when it got dodgy fast. We shared the collected data with our friends at Mandiant, a leading global cybersecurity firm, and with the FBI, to corroborate our initial findings. It turns out this was a fake IT worker from North Korea. The picture you see is an AI fake that started out as stock photography (below). The detail in the following summary is limited because this is an active FBI investigation.