Does your business have a remote workforce? Since the pandemic, we’ve seen a massive shift to hybrid, work-from-home and 100% remote work models. And while remote work has proven to be immensely beneficial to both employees and employers, it does come with unique challenges. We’ve covered many of these, including security for remote workers, keeping your remote team motivated, and even how to avoid Zoom fatigue. Now, according to the U.S. Federal Bureau of Investigation (FBI), we have new challenges to look out for.
Yesterday, the FBI released a Public Service Announcement advising that deepfakes and stolen Personally Identifiable Information (PII) are increasingly being used to fraudulently apply for remote work positions. Many of these positions would grant access to even more PII and sensitive information, allowing cybercriminals to further their activities.
Deepfakes use artificial intelligence to create realistic-looking videos, often intended to simulate a specific person’s likeness. Even if you don’t immediately recognize the term “deepfake,” if you spend time on the Internet, you’ve probably already seen some examples. Some of the most infamous deepfakes feature uncanny fake video of former U.S. President Barack Obama and Facebook CEO Mark Zuckerberg.
While the technology itself is impressive, the ethical implications of deepfakes are vast. Because it can create convincing footage of people doing and saying things they never actually did, deepfake technology lends itself to malicious purposes, including fraud, incitement of violence, disinformation campaigns and revenge pornography. The possibilities are virtually endless, and deepfakes can be used to blackmail, intimidate, humiliate or otherwise harm a person or business.
The FBI’s Public Service Announcement warns that its Internet Crime Complaint Center (IC3) has received an increased number of reports that deepfakes are being used by job applicants interviewing for remote work positions. The positions include IT, computer programming, database and software-related job functions, many of which include access to customer PII, financial data, corporate databases and proprietary information. Additionally, the complaints mention that stolen PII was used in the job applications.
Earlier this year, Europol also published a report warning that deepfakes could become a staple tool for various criminal activities including:
While deepfake detection tools employing AI have been developed, they may not be readily accessible to most businesses. However, there are a few visual cues you can look out for to spot a deepfake. The FBI notice states that “the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”
An article from the MIT Media Lab offers the following tips when checking to see if something is deepfaked: