Artificial intelligence (AI) has caused quite a shakeup in recent years, and people are finding all sorts of ways to use AI to improve their lives. Unfortunately, scammers and fraudsters are also using AI to enhance their schemes.
Experian's Future of Fraud forecast for 2024 warns that AI could make it easier for people to become scammers and make scams harder to detect. Even with help from new technology, however, AI-powered scams are usually new spins on time-tested schemes. We update our list of the latest scams every year and see the same trend—scammers use new technology or current events to add a twist to existing methods.
7 Types of AI-Powered Scams
Scammers can use AI in many ways, including developing new types of software and analyzing companies' fraud defenses. The scams that target individuals often involve generative AI—when the AI creates something new, such as text, images or video.
Here are seven examples of AI scams to watch out for.
1. Voice Cloning Scams
Some AI tools can take a short clip of someone speaking and then recreate, or clone, their voice.
Scammers might clone celebrities' voices and then try to trick victims with fake robocalls or videos from the celebrity. The cloned voice might ask you to donate to a worthy cause, recommend an investment or claim to be giving away some of their money to a few lucky people.
Alternatively, scammers might target you directly by finding a video of a friend or relative on social media and cloning their voice. They might call you, pretend to be the person and say they need money to pay a ransom or bail. These types of scams are sometimes called grandparent scams because the scammers often target older victims by claiming to be a grandchild in a difficult situation.
2. Deepfake Video Scams
Deepfake videos are AI-generated videos that might include completely fake people or simulated real people. Some of the videos can be very convincing, especially when they're combined with cloned voices.
Scammers have used deepfakes of celebrities and news anchors to promote all sorts of scams, sometimes posting the videos directly on social media or using them in online advertisements. The videos will often point viewers to a website where the scammer then tries to collect your personal information or trick you into buying or investing in a scam product.
3. Deepfake Video Call Scams
Rather than using AI to generate and post deepfake videos, scammers might be able to use AI tools to create live deepfake videos that they can use for video calls. They could use these to video chat with victims of romance scams—where the scammer befriends the victim or starts a romantic relationship with them.
Romance scams often go on for weeks or months, until the victim trusts the scammer and starts sending them gifts or following their "investment" tips. The real-time deepfake videos, which may also use AI-altered voices, could make the scam more convincing.
4. AI Images and Deepfake Scams
Scammers also might create AI-generated images to enhance their scams. They could use the images as part of an advertisement or post on social media and link to scam websites in the comments. There are also cases of criminals creating explicit deepfake images of people and then extorting victims.
5. AI-Generated Websites
Scammers might use AI to create websites and then send you links to the website via email or post links on social media. A fake online store might offer a popular item at a deeply discounted rate, and have a limited-time sale that prompts you to quickly make a purchase. However, the scammers could steal your payment information and sell it or use it to make fraudulent purchases.
These sites could also be part of a triangulation fraud. When you place an order, the scammer uses stolen payment information to buy and ship the product to you. Although you're getting the item for cheap, you won't be able to return or replace it if there are any issues. And the scammers might take and use your payment information to buy products to fulfill someone else's order and keep their money.
6. AI-Enhanced Phishing Emails
Phishing emails are emails that scammers send to try to trick you into downloading malware or sharing personal information, such as your Social Security number, financial information or usernames and passwords. The scammers often disguise or "spoof" their email to make it look like it's coming from a trusted friend, family member, government agency or well-known company.
A long-standing best practice for detecting phishing emails was to look for unusual phrases and grammatical errors. But with help from AI, scam emails have become much more convincing. Scammers also might use AI to personalize the message based on information they find about you online.
7. AI-Generated Listings
Scammers can also use AI to create images and descriptions for fake listings as part of an online marketplace scam. For example, they might list an in-demand item for sale and then ask you to pay a deposit to hold the item. Or, the listing could direct you to a different website that they use to steal your payment information. Scammers could also list apartments and homes as part of a rental property scam.
How to Protect Yourself From AI Scams
AI can make scams more convincing, but you can still put safeguards in place to keep yourself from getting scammed and keep scammers from doing too much damage.
- Be extra cautious. Deepfakes can look or sound real and familiar, and you'll want to keep your guard up whenever someone contacts you from an unfamiliar email account, phone number or social media profile.
- Don't take any action if you feel pressured. Scammers will often try to make it feel like you need to act quickly. They might tell you that if you delay, you will be arrested, pay a fine, lose a reward or miss a great opportunity. When things start to feel urgent, take a deep breath and try to keep a clear head. In most legitimate situations, it's okay to pause for a couple of hours—or days—before acting.
- Stop the exchange and reach out to the person or organization via trusted channels. Legitimate government agencies, companies and friends won't mind if you hang up and call back. If anything, you could pretend the phone or internet connection dropped. Look up the organization or person's contact information from a different source and then try to reconnect to see if it was actually them.
- Phone a friend. If something sounds too good to be true, call or text a trusted friend or family member. Ask them if this might be a scam. Sometimes talking through the details and getting a second opinion is enough to identify a scammer.
- Don't click on links. It's generally best to avoid clicking on links in emails, texts and social media comments or messages.
- Use reversible payment methods. If you send a scammer cash, crypto or a gift card, you might not be able to get your money back. Some transactions, such as bank transfers, can be reversed if you act quickly. You also might be able to dispute credit card purchases if the seller turns out to be a scammer.
- Create a secret password or phrase. Create a secret password or phrase with your family members and friends that you can use to verify each other's identities. Don't discuss it via email—in case someone's email is compromised—and try to use something that a scammer won't be able to figure out using people search sites or reading social media posts.
- Update your account security. You might have heard this before, but it's now more important than ever: create unique passwords for all your online accounts and enable multifactor authentication (MFA). This can help keep identity thieves and scammers who break into one of your accounts from logging in to your other accounts.
What to Do if You're a Victim of an AI Scam
If you've fallen for a scam, the next steps could depend on the type of scam, but here are a few things you may want to do.
- Try to get your money back. If you sent a payment using an app, credit card or bank account, contact the company to see if you can stop or reverse the transaction.
- Report the scam to the FTC. File a report with the Federal Trade Commission (FTC) at ReportFraud.ftc.gov. Reporting scams can help the FTC track trends, warn others about scams and charge scammers with crimes.
- Report the identity theft to the FTC. If the AI scam was after your identity rather than your money, you can report the identity theft on IdentityTheft.gov. The FTC will create a personalized recovery plan for you based on what happened.
- Protect your credit. You have the right to add fraud alerts to your credit reports, which tell companies that review your reports that you've been a victim of identity theft and that they should take extra steps to verify your identity. If you add a fraud alert with one credit bureau, it will automatically notify the other bureaus. You also have the right to freeze your credit reports for free, which limits access to your credit reports and may keep someone from fraudulently opening a new credit account.
- Resecure your accounts. Even if you recently created unique passwords for your accounts, you may want to update your passwords again.
Monitor Your Information for Signs of Fraud
Sometimes you'll figure out you fell for a scam right away. You send the person money and then they stop responding. Or you place an order and quickly realize it really was too good to be true. However, some scams play out over time, and the scammer might not use the information they stole right away.
Experian's free credit monitoring offers real-time notifications when there are important changes in your credit report, which could be a sign of identity theft. Or, you could try a paid premium membership for an identity theft protection service that may also warn you about signs of fraud related to your Social Security number or financial accounts. Paid memberships also include fraud resolution support services and identity theft insurance.