The FBI has released a new public service announcement (PSA) warning employers not to fall for fraudulent attempts by job candidates to land remote working roles.
It said scammers are using voice spoofing and stolen personally identifiable information (PII) to trick hiring managers into waving their applications through.
“Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the PSA noted.
“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
The end goal for the scammers appears to be landing a remote working role in which they can access sensitive customer and corporate information from their new employer.
“The remote work or work-from-home positions identified in these reports include information technology and computer programming, database, and software related job functions,” the FBI explained.
“Notably, some reported positions include access to customer PII, financial data, corporate IT databases and/or proprietary information.”
In some of these incidents, employers apparently raised the alarm after pre-employment background checks revealed that the PII submitted by some applicants belonged to someone else.
As deepfake technology becomes more affordable and convincing, cyber-criminals are trying it out in various use cases.
In February, the FBI warned that scammers were using it on video conferencing platforms to carry out business email compromise (BEC) attacks.
In those cases, CEO inboxes were compromised and meeting invites were sent to various employees. Once on the virtual meeting platform, the fake 'CEO' claimed their video was broken, and attendees were instead made to listen to deepfake audio urging them to make a large bank transfer.