
Last year, we wrote about the use of AI for generating fake social media accounts and proliferating social engineering attacks. We cited the use of captured voice recordings to clone voices for fraud, kidnapping scams and other illicit activity. The possibilities are unfortunately endless, and since last spring the misuse of this technology has evolved rapidly, making it more important than ever to validate remote hires.

What happened at KnowBe4?

KnowBe4, a cybersecurity education firm, recently revealed that it fell victim to a fraudulent hire: a principal software engineer who used a stolen identity paired with an AI-enhanced photo. The new employee turned out to be a state-sponsored threat actor from North Korea. KnowBe4 specializes in cybersecurity awareness training, including curriculum around insider threats. Importantly, KnowBe4 quarantined the employee’s device before any data could be accessed, compromised or exfiltrated.

KnowBe4 immediately took ownership of the issue, choosing to publish the details of the incident as a cautionary tale: if it can happen to them, it can happen to anyone. Here, the threat actor social-engineered his way into employment, aided in part by an AI-enhanced stolen identity. He then used a US-based mule to receive and set up the issued equipment, which he subsequently accessed remotely. In addition to providing a detailed explanation of what took place, KnowBe4 shared several process changes with the public that are worth reiterating:

  1. Provide new employee accounts with limited permissions sufficient to onboard through the hiring process and training.
  2. Issue equipment in person. If that’s not practical, ship only to a US mailing address previously verified as belonging to the employee. If shipping to a UPS location, require a signature with photo ID.
  3. Scan remote devices for suspicious access or activity.
  4. Evaluate the employee’s use of VPNs or virtual machines; a minimal sketch of how both this and the previous item might be surfaced on an endpoint follows this list.
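
The sketch below is illustrative only: it flags running processes whose names match common remote-access tools or consumer VPN clients. The watchlists and the use of the open-source `psutil` library are our assumptions; in practice this telemetry would come from your EDR or MDM platform.

```python
# Illustrative sketch: flag processes on an issued device whose names
# match common remote-access tools or consumer VPN clients. The
# watchlists below are hypothetical examples; tune them to your
# environment and prefer EDR/MDM telemetry in production.
import psutil  # pip install psutil

REMOTE_ACCESS_TOOLS = {"anydesk", "teamviewer", "rustdesk", "vnc"}
VPN_CLIENTS = {"openvpn", "wireguard", "nordvpn", "expressvpn"}

def flag_suspicious_processes():
    """Return (pid, name, reason) tuples for processes worth reviewing."""
    findings = []
    for proc in psutil.process_iter(["pid", "name"]):
        name = (proc.info["name"] or "").lower()
        if any(tool in name for tool in REMOTE_ACCESS_TOOLS):
            findings.append((proc.info["pid"], name, "remote-access tool"))
        elif any(vpn in name for vpn in VPN_CLIENTS):
            findings.append((proc.info["pid"], name, "VPN client"))
    return findings

if __name__ == "__main__":
    for pid, name, reason in flag_suspicious_processes():
        print(f"PID {pid}: {name} ({reason}), review with the employee")
```

A hit here is a conversation starter, not proof of wrongdoing: plenty of legitimate employees run VPNs, which is why KnowBe4 frames this step as “evaluate” rather than “block.”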

What else can I do to safeguard against this scenario and validate remote hires?

In addition to KnowBe4’s lessons learned, BlueCoat recommends:

  • Ask a prospective employee’s references for the name and contact information of other individuals who may know the prospective employee. Never rely on email references alone.
  • Exercise caution if the equipment’s shipping address differs from the employee’s residence.
  • Conduct security awareness training for company recruiters and HR personnel. Teach them how to look for inconsistencies in resumes and conflicting personal information.
  • Use a third-party background research firm, such as BlueCoat, to conduct a professional due diligence review of the individual or organization you’re looking to onboard or partner with. These reviews can identify critical red flags common in these attacks, such as the use of VOIP phone numbers and missing or inauthentic digital footprints; a simple first-pass VOIP screen is sketched after this list.
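
On the VOIP red flag specifically, a quick first-pass screen is possible with the open-source `phonenumbers` library; the sketch below is a minimal illustration of that idea, not BlueCoat’s actual methodology. Note that numbering-plan metadata cannot always distinguish VOIP ranges, so a clean result is a weak signal while a VOIP hit warrants closer review.

```python
# Illustrative sketch: screen a candidate's phone number for VOIP
# classification using the `phonenumbers` library (pip install
# phonenumbers). Many US numbers come back as FIXED_LINE_OR_MOBILE,
# so treat "not flagged" as inconclusive rather than as clearance.
import phonenumbers
from phonenumbers import PhoneNumberType

def is_voip(raw_number: str, region: str = "US") -> bool:
    """Return True if the number parses and is classified as VOIP."""
    try:
        number = phonenumbers.parse(raw_number, region)
    except phonenumbers.NumberParseException:
        return False  # unparseable input needs manual review instead
    return phonenumbers.number_type(number) == PhoneNumberType.VOIP

if __name__ == "__main__":
    # Hypothetical inputs for demonstration only.
    for candidate in ["+1 212-555-0123", "555-0100"]:
        verdict = "VOIP, review closely" if is_voip(candidate) else "not flagged"
        print(f"{candidate}: {verdict}")
```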

Taking steps like these will help your company avoid falling victim to AI fakes and to state-sponsored or other threat actors.

At BlueCoat, we’re strong believers in taking those extra steps to vet your workforce. Our investigative team regularly conducts the background research required by firms, large and small, to avoid this very situation.