Job applicants are using deepfake AI to fool employers – here’s how hiring managers can spot the next imposter


Vijay Balasubramaniyan knew there was a problem.

The CEO of Pindrop, a 300-person information security company, says his recruiting team came to him with a strange dilemma: they were hearing odd noises and tonal abnormalities while conducting remote interviews with job candidates.

Balasubramaniyan immediately suspected that candidates were using deepfake AI technology to mask their true identities. But unlike most companies, Pindrop, which builds fraud-detection technology, was uniquely positioned to investigate the mystery itself.

To get to the bottom of it, the company posted a job listing for a back-end developer role. It then used its own in-house technology to scan candidates for potential red flags. “We started building these detection capabilities not just for phone calls, but for conferencing systems like Zoom and Teams,” Balasubramaniyan tells Fortune. “Since we detect these threats, we wanted to eat our own dog food. And we saw our first deepfake candidate very quickly.”

Out of 827 total applications for the developer position, the team found that about 100, or roughly 12.5%, were tied to fake identities. “It blew our minds,” Balasubramaniyan says. “This was never the case before, and it tells you that in a remote-first world, this is becoming an increasingly difficult challenge.”

Pindrop isn’t the only company receiving a flood of job applications attached to fake identities. While it’s still an emerging issue in recruiting, 17% of hiring managers have already encountered candidates using deepfake technology to alter their video interviews, according to a March survey from career platform Resume Genius. And one startup founder recently told Fortune that about 95% of the résumés he receives come from North Korean engineers pretending to be someone else. As AI technology continues to advance at a rapid clip, businesses and HR leaders must prepare for a new wrinkle in an already complicated recruiting landscape, and be ready for the possibility that the candidate they’re interviewing isn’t who they claim to be.

“My theory right now is that if it’s happening to us, it’s happening to everyone,” says Balasubramaniyan.

A Black Mirror reality for hiring managers

Some deepfake AI job applicants are simply trying to hold down more than one job at a time to boost their income. But there are also more nefarious forces at play, ones that can lead to far more serious consequences for the companies that hire them.

In 2024, cybersecurity firm CrowdStrike responded to more than 300 incidents of criminal activity linked to Famous Chollima, a notorious North Korean criminal group. More than 40% of those incidents involved workers hired under a fake identity.

“Most of the revenue generated by this counterfeit work goes directly to North Korea’s weapons program,” says Adam Meyers, the company’s senior vice president of counter adversary operations. “They’re also after login credentials, credit card information, and company data.”

In December 2024, 14 North Korean nationals were indicted in connection with a fraudulent remote-worker scheme. They’re accused of extracting at least $88 million from businesses over a six-year period. The Department of Justice also alleges that some of these workers threatened to leak sensitive company information if their employers didn’t pay them.

Catching a deepfake in the act

Dawid Moczadło, co-founder of information security software company Vidoc Security Lab, recently posted a video on LinkedIn of an interview he conducted with a deepfake job candidate, a masterclass in spotting potential red flags.

The audio and video of the Zoom call didn’t quite sync up, and the video quality also seemed off. “When the person moved and talked, I could see different shading on his skin, and it looked very glitchy, very strange,” Moczadło tells Fortune.

Most damning of all, when Moczadło asked the candidate to hold a hand in front of his face, he refused. Moczadło suspects that doing so would have disrupted the filter being used to create the fake image, much as it does with Snapchat filters, and exposed the candidate’s real face.

“Before this happened, we gave people the benefit of the doubt, thinking maybe their camera was broken,” says Moczadło. “But after this, if they don’t have their real camera on, we just completely stop [the interview].”

It’s a strange new world for HR leaders and employers, but there are other telltale signs they can watch for to head off major headaches down the line.

Deepfake candidates often use AI to generate realistic-looking but fake LinkedIn profiles that are missing critical information in their employment history, or that show little activity and few connections.

When it comes to the interview stage, these candidates are often unable to answer basic questions about their life and work experience. For example, Moczadło recently interviewed a deepfake candidate whose résumé listed well-known organizations, but who, he says, couldn’t share any detailed information about those companies.

Employers should also be wary of candidates who ask to have their work laptop shipped to an address other than their home address. Some people run “laptop farms,” keeping multiple computers open and running so that people outside the country can operate them remotely.

And finally, these fake hires usually aren’t the best employees. They often won’t turn on their cameras during meetings, make excuses to hide their faces, or neglect their work.

Moczadło says he’s now much more careful about hiring and has added new safeguards to the process. For example, he now pays for candidates to visit the company’s office in person for at least one full day before they’re hired. But he knows not everyone can afford to be so cautious.

“We’re in an environment where recruiters receive thousands of applications,” says Moczadło. “And when there’s more pressure to hire people quickly, it’s easier to miss these early warning signs, and that creates the perfect opportunity for bad actors to take advantage.”

This story was originally featured on Fortune.com


