The FBI is cracking down on people using deepfakes to apply for remote jobs

Companies looking to hire someone with IT skills may want to ask themselves whether the individual they are interviewing is real. According to the FBI, a person using stolen identity information to land a job at a tech company may not actually be who they appear to be.

The FBI has received numerous reports of this kind of technology-enabled fraud, and it's critical for companies to keep an eye out for fakers. These applicants use manipulated images, videos, and voice recordings to get jobs at technology, programming, database, and software firms. With stolen personal data, they can create the impression of being almost anyone they want.

Many of the positions these fraudulent applicants targeted would have had access to confidential consumer or corporate information, including financial and proprietary data, suggesting the impostors had an incentive to steal sensitive information in addition to defrauding the company. It is unclear how many of these fake applications succeeded compared with the number that were caught and reported.

The more troubling question is whether any applicant accepted an offer, collected a salary, and was only caught later. In the reported cases, voice-spoofing techniques were used during online interviews, and the candidates' lip movements did not match what was said on the video call. Some applicants coughed or sneezed, but the video-spoofing software failed to reflect it.

In May, the FBI warned businesses about North Korean government workers applying for remote IT and other technical jobs. In cases like these, fake workers typically use forged paperwork and credentials to obtain remote work through sites such as Upwork and Fiverr.

As detailed in the federal agency's May report, some of these operators used multiple shell corporations to conceal their identities, making them harder to trace. Deepfake technology has come a long way, but cruder attempts often produce fake voices that don't match the speaker's mouth movements.

Fake videos are also difficult to spot if you're not actively looking for them, although creating a lifelike human on video is harder than it may seem.

Researchers at Carnegie Mellon University recently reported that artificial intelligence designed to recognise edited video can do so with accuracy ranging from 30% to 97%, depending on the method. Human viewers can also spot phoney videos if they are accustomed to watching for certain visual flaws, such as shadows that fall incorrectly or skin textures that don't render properly.
