Home Security Heroes independently tests and reviews every product. We may earn a commission when you buy through our links. Read more here.
Being listed on the popular exchange Binance is a massive win for cryptocurrency startups. It offers global exposure a new company wouldn’t get otherwise.
Of course, getting that listing isn’t easy.
To fraudsters, that heavy demand makes it the perfect playground.
Using artificial intelligence, scammers have faked Binance’s chief communications officer, Patrick Hillman, and used that simulation on video calls. On those calls, startups were persuaded to pay for a listing.
The problem is that Hillman has nothing to do with listing cryptocurrencies on the platform. He didn’t even know he’d “had” video calls with startups until they started reaching out to thank him.
What Is a Deepfake?
It’s hard to go anywhere these days without someone wanting to talk about artificial intelligence.
A.I. uses digital technology to emulate human behavior. It does this through machine learning: a computer studies patterns in the data humans feed it, and from that input it can generate new content.
That’s A.I. in a nutshell.
But with deepfakes, that learning goes into overdrive. The algorithms stack many layers (the “deep” in deepfake), each adding detail and nuance to the result. That’s how someone can video chat with a convincing copy of a recognized person without realizing it.
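The layered idea can be sketched in a few lines. This is not how production deepfake models are built (those use large generative networks trained on enormous amounts of footage); it’s only a toy illustration, with made-up layer sizes and random weights, of how stacked layers transform an input step by step:

```python
import numpy as np

# Toy two-layer network. Stacking layers lets a model capture
# progressively finer detail; real deepfake models use many such
# layers trained on thousands of images. Sizes and weights here
# are arbitrary, purely for illustration.

rng = np.random.default_rng(0)

def relu(x):
    # A common activation function: passes positives, zeroes negatives.
    return np.maximum(0, x)

# Fake "image" input: 8 pixel values.
x = rng.random(8)

# Layer 1: 8 inputs -> 4 hidden features (coarse structure).
w1 = rng.random((4, 8))
h = relu(w1 @ x)

# Layer 2: 4 features -> 8 outputs (reconstructed detail).
w2 = rng.random((8, 4))
y = relu(w2 @ h)

print(y.shape)  # (8,) — same shape as the input, like a reconstructed image
```

Each layer is just a transformation of the previous one; training adjusts the weights until the final output looks like a real face.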
If someone video-chatted with you pretending to be someone you know well, such as a spouse, parent, or child, you probably would detect it. But when it’s someone who’s an acquaintance or who you’ve only seen in interviews and photos, as with the Binance CEO, it can be tougher to spot the fake.
As deepfakes become more sophisticated, they may be able to trick you into believing you’re seeing a loved one up close. For now, though, deepfakes can simulate a loved one’s voice or keep the familiar face in the shadows where you can’t get a clear view.
The technology is already sophisticated enough to have caught the eye of scammers, though. And it’s giving us a glimpse at the future of identity theft.
Ways Deepfakes Can Be Used
Deepfake technology is already here, so protecting yourself against it is important. But it’s also worth keeping an eye on how it will shape things in the future. Here are some ways deepfakes can be used for identity theft.
1. Data Theft
In 2022, the FBI issued a press release warning about deepfakes and job applications.
As the release detailed, scammers use deepfakes to conduct video interviews for jobs in technology. They specifically target jobs that will give them access to databases full of customer information.
The scammers apply for work-from-home jobs so that, once hired, they have free access to a wealth of personally identifiable information (PII) and financial data.
With so many people working remotely these days, video interviews are fairly commonplace. Using deepfake technology, scammers could pose as someone else and land a job without ever having to meet an employer in person.
2. Extortion
Social media has weakened privacy. So much is out there for the world to see.
And all that content can be used as material for deepfakes.
Deepfakes are made from existing videos and audio. That material is fed into the computer, which then uses it to generate original videos. You can find examples online that show how it’s done.
Some scammers have already used deepfakes to extort victims. Here’s how this scam works:
A scammer reaches out, claiming to have kidnapped someone close to you. You see a video or hear an audio clip that convinces you it’s true. You pay the ransom, only to find it was a hoax.
A scammer might also claim to have a compromising video or audio of you, threatening to release it if you don’t send money.
3. Impersonation Scams
Impersonation scams already run rampant. Someone calls or emails, claiming to be from your bank or on the customer service team of an app you use.
Imagine if that contact included realistic audio or video of the supposed representative. It could even be someone you’ve met in person. Backgrounds can be easily faked, so it might look and sound like the person is calling from the bank or call center.
The goal of impersonation scams is to convince you to give up information like logins or your Social Security number. That information could be used to empty your bank accounts, apply for credit in your name, or take over your account.
4. Fraud Against the Deceased
Even after we die, we’re not immune to identity theft.
With this scam, someone uses technology to steal the visual or audio identity of someone who’s no longer with us.
Defrauding the deceased is nothing new. For decades, criminals have stolen the names and Social Security numbers of those no longer living. That information is used to apply for jobs, credit, or financial gain.
Deepfake technology is only going to improve. That means we could be walking the Earth (virtually) long after we’re gone.
5. Catfishing
I was catfished once. Maybe it’s happened to all of us.
In my case, I was part of a small community of online users who were fooled. Someone claimed to have cancer. We all grew to care about this person and believed the photos on her profile were real.
Then one day, she was exposed as a fake. Someone got suspicious and started digging, only to find her messages weren’t coming from an IP address that matched her story. Further, the pictures she’d used had been stolen from another person’s profile. Once called out, she disappeared and was never seen again.
Catfishing isn’t going anywhere. In fact, with deepfakes, someone could very convincingly pose as someone else online. The scammer could use video chat and make phone calls using someone else’s face and voice.
This could even hit your wallet. In my catfishing case, many of us had donated to the person’s cause.
There’s also the romance scam. Someone pretends to be in love with you, then asks for money to come see you. If you’ve chatted with the person via phone or video, you might be even more likely to send money.
6. Social Engineering Attacks
Social engineering has long been a concern of cybersecurity professionals.
In a social engineering attack, a scammer convinces someone to click on a link or give up information. How do they do this? By gaining your trust. They might impersonate a respected official, such as someone from law enforcement, or they could convince you they’re someone you know.
You can already imagine how deepfakes could be used for this.
With a social engineering attack, you might provide sensitive information or be talked into downloading malicious files. Either way, you may very well provide data that can be used to steal your identity.
7. Opportunity Scams
If a stranger has an opportunity that sounds too good to be true, you might balk at it.
What if that opportunity comes from a familiar face?
Creating deepfakes of celebrities is nothing new. By now, we’ve all seen the deepfake of Jim Carrey’s face on Jack Nicholson’s body in The Shining. But what if someone put Elon Musk’s face on an ad for an opportunity?
Think of all the celebrities you trust. What if Tom Hanks is endorsing a multilevel marketing scheme? Oprah Winfrey is pushing a new weight-loss product?
Even if those public figures manage to take legal action against these fake endorsements, it will take a while. In the meantime, people will sign up.
8. Cracking Biometrics
Increasingly, systems are relying on biometrics for security.
Having your phone or laptop unlock merely by detecting your face makes things convenient.
But this convenience leaves those systems vulnerable to hackers.
As scammers grow more sophisticated, they could learn to use the technology to hack into a system and “fake” that facial recognition. The system may think the person seated in front of the camera is authorized.
For that reason, some security experts recommend pairing biometrics with additional authentication factors, such as a PIN or hardware key, since a face (or voice) on its own can also be faked.
The Risks of Deepfakes
As technology advances, deepfakes will begin to creep into everyday life.
They bring with them some risks. Here are a few.
1. Financial and Reputation Losses
The biggest risk is, of course, personal. If one of these scams is successful, your bank account could take a hit.
This could be short-term, in that you lost hundreds or thousands to a scam, or it could be long-lasting, in that you gave access to your bank account to someone who’s up to no good.
If a scammer manages to grab your Social Security number, your credit could take the hit. Identity thieves can apply for loans, make major purchases, and even land a job or apartment using your information. You’ll be left to clear things up with creditors and credit reporting agencies.
Lastly, there’s the overall reputation risk that comes from identity theft. With deepfakes, someone could spread false information about you or pose as you to defraud others.
It could be tough to convince others that yours wasn’t the face they saw with their own eyes.
2. General Loss of Trust
Right now, deepfakes are relatively new. Many of us haven’t yet been duped by the technology. (That we know of!)
But over time, deepfakes will start to creep into our lives. And with that comes a gradual erosion of trust.
Is what we’re seeing real? How can we tell?
Over time, this could lead to paranoia. Is that person who reached out to us with a career opportunity really who they say they are?
And when we reach out to someone, will they know we’re the real deal?
3. Synthetic Identity Theft
So someone can create a video of you. They can’t do that much harm, can they?
Yes, they can.
In addition to the other uses mentioned here, deepfakes can be used as part of something called synthetic identity theft.
With synthetic identity theft, a fraudster uses a combination of information to fake an identity. Someone might steal a Social Security number and combine it with a fake name. They might also deepfake a driver’s license photo and use all of it to get government benefits or land a job.
One of the worst things about synthetic identity theft is that once an identity is created, it can be used for a large number of fraudulent purposes. Your image might be used with a separate name and Social Security card to set up a life for someone, including landing a job, getting an apartment, and applying for credit cards.
4. Legal Action
If deepfakes take hold, businesses may find themselves responsible. If someone poses as a representative of a company and defrauds people, would that company be liable?
What if someone is using your likeness for commercial purposes?
The legalities surrounding deepfakes are complicated. Who owns the rights to your face? You?
We’ll let the lawyers battle that out. But in the end, it’s important to think about what you can do to protect yourself, legally and otherwise.
How to Protect Against Deepfakes
You can do some things to reduce the risks of being scammed.
1. Know How to Spot Them
Deepfakes are impressive, but they aren’t perfect. It’s essential to pay attention to your instincts. If something feels “off” about a piece of video or audio, there might be a reason.
Here are a few signs you might be looking at a deepfake.
- Inconsistent lighting: The background might feature darker or brighter lighting than the face of the person you’re looking at.
- Strange eye movements: The person in the video might not blink very often, or eye movements may be jumpy.
- Sound out of sync: If the audio doesn’t quite match the movements on screen, it’s a sign it might be a fake.
- Poor sound quality: Deepfake creators often focus heavily on visuals, which means in some cases, audio glitches and issues can be a sign a video isn’t authentic.
- Distortions: Deepfakes can look great at first glance, but if you watch the person’s movements closely, you may spot glitches that give away the artificial intelligence behind them.
- Blurring and inconsistent color: The small details are usually off in deepfakes. A person’s jewelry might be crooked, or there may be inconsistencies in clothing colors.
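One of those signs, infrequent blinking, lends itself to a simple illustration. The sketch below is a toy heuristic, not a real detector: it assumes you already have per-frame “eye openness” scores from some face-tracking tool, and the threshold and simulated data are invented for the example.

```python
# Toy blink-rate heuristic. Humans typically blink around 15-20 times
# per minute; an unusually low rate is one possible deepfake tell.
# The 0.3 "closed" threshold is made up for illustration.

def count_blinks(openness, closed_threshold=0.3):
    """Count open -> closed transitions as blinks."""
    blinks = 0
    eyes_closed = False
    for score in openness:
        if score < closed_threshold and not eyes_closed:
            blinks += 1
            eyes_closed = True
        elif score >= closed_threshold:
            eyes_closed = False
    return blinks

def blinks_per_minute(openness, fps=30):
    """Scale the blink count to a per-minute rate."""
    seconds = len(openness) / fps
    return count_blinks(openness) * 60 / seconds

# Simulated 10 seconds of video at 30 fps containing two blinks.
frames = [1.0] * 300
for start in (60, 200):
    for i in range(start, start + 5):
        frames[i] = 0.1

rate = blinks_per_minute(frames)
print(rate)  # 12.0 — well under a typical human rate, worth a closer look
```

A real detector would combine many such signals; a low blink rate alone proves nothing, especially as newer deepfake models have learned to blink naturally.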
That said, as technology improves, deepfakes are likely to only get more difficult to detect. That’s why it’s important to take other precautions.
2. Consider Insurance
If you run a business, this may be a good time to put some safeguards in place.
For professionals, liability insurance can reduce your risk of legal fallout if a deepfake impacts one of your customers.
Also, make sure you control your online presence. Claim your name and/or your business name on all the applicable platforms. If you can get that verification checkmark, get it. And make sure you own your business’s domain.
What about those of us who don’t own businesses?
That’s where identity theft protection can help. Services like Aura, LifeLock, and IdentityForce can help cover the cost of the cleanup if a deepfake snatches your identity.
Those services will also keep an eye on things and may even let you know early on if your identity has been compromised.
Lastly, make sure you have alerts set up on your financial accounts. This will ensure you’re alerted if someone tries to make purchases using your account information.
3. Restrict Publicly Available Information
Take a look at your social media profiles.
What do your friends see?
What does a complete stranger see?
My social media feeds are filled with people sharing vacations, photos of kids and grandkids, and memories that include exact dates.
All of that can be used for identity theft.
Those videos and photos you post online are perfect raw material for deepfakes. Now that reels are so popular, you may even be handing thieves hours of video content to mine.
There’s nothing wrong with sharing things. It just might be time to lock down your social media accounts so only friends and relatives can see those images.
4. Safeguard Your Information
With the possibility of fraud floating out there, it’s more important than ever to protect your information.
Never give your Social Security number, bank account information, or passwords to someone who reaches out to you.
If you receive a call or message referencing a problem with one of your accounts, either log into the account or contact customer service.
Never send money to someone unless you’re absolutely sure of that person’s identity. Even then, think twice.
When someone dies, always make sure to notify the Social Security Administration. They’ll tag the number so it can’t be used in combination with deepfakes to land a job or apply for benefits.
Deepfakes are relatively new on the scene, and we’re only beginning to understand how they’ll work.
One thing’s clear, though. It’s more important than ever to take measures to secure your information.
Use complex passwords, limit the number of photos and videos of you that are publicly available, and refuse to give out your Social Security number unless it’s absolutely necessary.