An image provided by Pindrop Security shows a fake job candidate the company dubbed “Ivan X,” a scammer using deepfake AI technology to mask his face, according to Pindrop CEO Vijay Balasubramaniyan.

Courtesy: Pindrop Security

When voice authentication startup Pindrop Security posted a recent job opening, one candidate stood out from hundreds of others.

The applicant, a Russian coder named Ivan, seemed to have all the right qualifications for the senior engineering role. When he was interviewed over video last month, however, Pindrop’s recruiter noticed that Ivan’s facial expressions were slightly out of sync with his words.

That’s because the candidate, whom the firm has since dubbed “Ivan X,” was a scammer using deepfake software and other generative AI tools in a bid to get hired by the tech company, said Pindrop CEO and co-founder Vijay Balasubramaniyan.

“Gen AI has blurred the line between what it is to be human and what it means to be machine,” Balasubramaniyan said. “What we’re seeing is that individuals are using these fake identities and fake faces and fake voices to secure employment, even sometimes going so far as doing a face swap with another individual who shows up for the job.”

Companies have long fought off attacks from hackers hoping to exploit vulnerabilities in their software, employees or vendors. Now, another threat has emerged: Job candidates who aren’t who they say they are, wielding AI tools to fabricate photo IDs, generate employment histories and provide answers during interviews.

The rise of AI-generated profiles means that by 2028, 1 in 4 job candidates globally will be fake, according to research and advisory firm Gartner.

The risk to a company from bringing on a fake job seeker can vary, depending on the person’s intentions. Once hired, the impostor can install malware to demand ransom from a company, or steal its customer data, trade secrets or funds, according to Balasubramaniyan. In many cases, the deceitful employees are simply collecting a salary that they wouldn’t otherwise be able to, he said.

‘Massive’ increase

Cybersecurity and cryptocurrency firms have seen a recent surge in fake job seekers, industry experts told CNBC. Because these companies often hire for remote roles, they present valuable targets for bad actors, these people said.

Ben Sesser, the CEO of BrightHire, said he first heard of the issue a year ago and that the number of fraudulent job candidates has “ramped up massively” this year. His company helps more than 300 corporate clients in finance, tech and health care assess prospective employees in video interviews.

“Humans are generally the weak link in cybersecurity, and the hiring process is an inherently human process with a lot of hand-offs and a lot of different people involved,” Sesser said. “It’s become a weak point that folks are trying to expose.”

But the issue isn’t confined to the tech industry. More than 300 U.S. firms inadvertently hired impostors with ties to North Korea for IT work, including a major national television network, a defense manufacturer, an automaker, and other Fortune 500 companies, the Justice Department alleged in May.

The workers used stolen American identities to apply for remote jobs and deployed remote networks and other techniques to mask their true locations, the DOJ said. They ultimately sent millions of dollars in wages to North Korea to help fund the nation’s weapons program, the Justice Department alleged.

That case, involving a ring of alleged enablers including an American citizen, exposed a small part of what U.S. authorities have said is a sprawling overseas network of thousands of IT workers with North Korean ties. The DOJ has since filed more cases involving North Korean IT workers.

