
Don’t Believe Your Eyes: An AI-Powered Romance Scam

Evolving AI technology is making it increasingly difficult to distinguish between truth and illusion. David connects with Bonnie online, but after lending her money, he suspects she's an AI creation.

[Illustration: an older man at a mantel of money reaches up toward a castle balcony where a digitized woman, gripping a rail adorned with Bitcoin symbols, reaches back to him, Romeo and Juliet style. Credit: AARP]

Subscribe: Apple Podcasts | Amazon Music | Spotify | TuneIn

David is a retiree who remains active in his Florida community, but he would like to have someone to share his life with. When an Indiana woman named Bonnie strikes up a friendship with David on Facebook, he is delighted but cautious. A video call is reassuring, and their relationship deepens over several months. They make plans to move in together after Bonnie completes a large interior design job in Australia. Bonnie will be paid millions for the job, so when she needs to borrow $25,000 to pay the upfront cost for workers, David is confident she will pay him back. But when the bank notifies him of another transfer from his home equity line of credit, David knows that Bonnie is a fraud and possibly an AI creation. Ricardo Amper, founder and CEO of Incode Technologies, explains the surprising ways AI may be used in scams and why the best tool to combat AI scams may just be … AI.

Full Transcript

(MUSIC INTRO)

[00:00:02] Bob: This week on The Perfect Scam.

[00:00:05] Ricardo Amper: It's scary because biology taught us to believe the person we are looking at and the voice that we're hearing. And this is not really about your instinct. It is more about how technology is evolving in a way that humans cannot even detect it.

[00:00:22] David: You would like to have someone be part of your life that's somewhat of a match, so that we have something to look forward to for whatever amount of years that you have here on the earth. The unfortunate thing is, people don't need a gun anymore to rob you. All they need is a computer.

(MUSIC SEGUE)

[00:00:51] Bob: Welcome back to The Perfect Scam. I'm your host, Bob Sullivan. We've all been hearing so much about artificial intelligence, AI, and all the wonderful and potentially scary things AI can do. I think we'll learn a lot about AI in 2026. There's a lot of hype and a lot of hope for it. But one thing I'm fairly certain about: criminals will be using AI to scale up their scams this year, but perhaps not in the way you think. One of today's guests will tell you that there are already 2,000 documented cases of deepfake videos being used in attacks on people and corporations. The only countermeasure, he thinks, is to fight AI with AI. But before that, we're going to hear from someone who fell in love with a woman he now thinks was just an AI creation.

[00:01:49] David: My name is David and I'm in Port Richey, Florida.

[00:01:53] Bob: And how long have you been in Florida for?

[00:01:55] David: Oh, I've been here since 2009, so probably 16 years now.

[00:02:01] Bob: David grew up on Long Island in New York and came from a very large family, which he misses.

[00:02:09] David: My father was the second youngest of 17. Aunts, uncles, everybody are gone. Cousins gone. Everybody. I had cousins that were in their eighties when I was in my fifties, because of the lifespan of the family. My father was born in 1915, and his younger brother was born in '17. They had already lost brothers and sisters in their younger years, and even as I was growing up.

[00:02:39] Bob: Your father was one of 17 children?

[00:02:41] David: Yes.

[00:02:42] Bob: Oh my God. That, I can't even imagine the list of cousins that you had.

[00:02:47] David: I'll give you, going off on a tangent here a little bit: at my parents' 25th wedding anniversary (they were married in 1941), there were 125 people, and they were all immediate family.

[00:03:00] Bob: He left New York with his wife in the 1980s, but that marriage ended in divorce. So, missing his large family, he's feeling pretty alone the day he gets an unexpected friend request.

[00:03:15] David: This woman Bonnie reached out to me, said that she had found me through Facebook, through posts and different things that I had been engaged with, and comments that I made about different things. Said that I was a person that she felt would be a nice person to develop a relationship with.

[00:03:40] Bob: So she sent you a friend request outta the blue, basically?

[00:03:43] David: Yeah, pretty much. Yep.

[00:03:45] Bob: And you accepted it, right?

[00:03:47] David: Yes.

[00:03:49] Bob: And so they start chatting.

[00:03:52] Bob: Do you remember anything about the first couple of conversations that you had?

[00:03:56] David: We talked a little bit about what we did and where we lived. She identified herself as a resident of Indianapolis, said that she was 61 years old and planning on retiring from the interior decorating that she did.

[00:04:14] Bob: They hit it off pretty quickly and David starts talking to her all the time.

[00:04:19] David: Yeah, we were talking quite a bit and we ended up doing a video call and then we ended up making phone calls back and forth.

[00:04:28] Bob: So the video call that happened within maybe a week or so of you first connecting?

[00:04:33] David: Maybe about two weeks.

[00:04:35] Bob: And what was your reaction when you saw her on video?

[00:04:37] David: I thought she was a very attractive woman. She came across as a very intelligent woman who had herself together and was planning on making a good life for herself.

[CLIP]

[00:04:49] Bonnie: Hello. David. It's me. Bonnie.

[00:04:52] Bob: Where was she? Was she in an office or at home? Do you remember?

[00:04:55] David: Home. Home office.

[00:04:57] Bob: Oh, okay. And it looked like a normal background. You saw her in her environment anyway, right?

[00:05:04] David: Yeah. Nothing that looked abnormal or nothing that was shaky or sketchy as far as a video was concerned. It looked as normal as probably I looked from my video conversation.

[00:05:19] Bob: And did that video last for an hour or just a few minutes?

[00:05:22] David: No, probably about 20 minutes.

[00:05:25] Bob: And after he video chats with Bonnie, well, David is pretty interested in taking things to the next level.

[00:05:32] Bob: Okay. So you have two weeks in, you have this video call. You think she's attractive and smart and at this point, is the relationship already turning romantic or are you still trying to figure that out?

[00:05:44] David: No, she started telling me how she felt about me and what she thought about the things that I did in my life. 'cause I shared with her the fact that I was fully retired and what I had done in my past life and what I was doing in my retired life.

[00:05:58] Bob: And soon after, Bonnie has exciting, maybe even life-changing, news to share.

[00:06:04] David: She told me that she has to go to California to San Jose, and she has to make a proposition along with other people to this company, which was a computer company. And that if she was awarded the contract, it would be for a new office that was gonna be set up in Sydney, Australia.

[00:06:26] Bob: Oh, wow. So my reaction to that would be, “Oh no, she's gonna move to Australia, I'm gonna lose this relationship.”

[00:06:33] David: Not really, because she told me that the contract was only gonna be a four week stint in setting up this company, and that at the end of that time, she was planning on coming back here.

[00:06:45] Bob: Oh. Oh, okay.

[00:06:47] Bob: So she goes to California, and while she's there, she sends more videos to David. These are one-way videos, though, not video calls.

[00:06:56] David: She said that the internet connection wasn't good where she was so that she had to put those on WhatsApp in order to get them through.

[00:07:06] Bob: So things go well in California. She lands the project and they start to make plans to meet. David agrees to help out Bonnie during the trip.

[00:07:16] David: Because we did talk about her home in Indianapolis, where she also informed me that she had a dog. And the idea was that she was gonna fly here from San Jose, and then we were going to fly up to Indianapolis. I would get her dog in her car and bring it here, so that when she was done with the contract, it would be easier for us to manage everything.

[00:07:43] Bob: Sure. But if she's going to have you babysit her dog for a month, that's a big step in a relationship, right?

[00:07:50] David: Yeah, but I have a nice yard, you know, that could handle that. And it wasn't a big dog; it was a cocker spaniel.

[00:08:00] Bob: And then Bonnie starts to tell David more about this big project she'd landed. It sounds like a very big deal.

[00:08:08] David: It was a five-and-a-half-million-dollar contract, of which they were gonna pay her one and a half million dollars as her pay for getting the contract.

[00:08:19] Bob: But she's going to have to pay for some things upfront. Mainly she has to pay for the right to work in Australia. It'll cost about $5,000. And she needs help there too.

[00:08:33] David: Yeah. Once she was awarded the contract, she told me that she needed funds because she had exhausted everything that she had to get her working papers so that she could work in Australia. I researched how much it would be for that amount, and it came out to what she told me.

[00:08:56] Bob: David doesn't have that kind of money lying around, but he does have a home equity line of credit.

[00:09:03] David: And I kept that accessible in case I needed it for emergencies. And of course, before I met her, the emergency would be car repairs or home repairs or anything along those lines. I told her that I could only access about $5,000. I didn't tell her where it was coming from, and she said, I'll try to get the money from somewhere else.

[00:09:24] Bob: So Bonnie tries, but eventually tells David she can't get it and asks to borrow the $5,000 from him. He agrees.

[00:09:33] David: So I withdrew that and sent it to her by wire transfer. Now, she told me that she was working with an accountant who was handling that for her. Initially she gave me a wire transfer address that I transferred the money to, and then she told me, a day later, oh, it went to the wrong place, I'm gonna send it back to you, which she did. So everything looked very above board and copacetic. So the money did come back.

[00:10:10] Bob: So you could see the $5,000 leaving and then you could see the $5,000 coming back in your account.

[00:10:14] David: Yep.

[00:10:16] Bob: So that transfer experience gives David more confidence in their relationship.

[00:10:20] David: Once she sent the money back to me, she said, I'm gonna give you the correct account that the money should have gone to, and I rewired the money back to her. So that went through.

[00:10:33] Bob: And along the way, Bonnie sends more videos to express her appreciation.

[CLIP]

[00:10:38] Bonnie: Honey, I just wanna say thank you for willing to help me with the $5,000 I requested. I promise I'll refund back the $10,000 as soon as I get the payment. Thank you, sweetie.

[00:10:52] Bob: With the work papers in place, Bonnie can now get started on this lucrative project, but the plans change almost immediately.

[00:11:02] David: She said that the company wanted her to start the contract immediately, so she had to fly directly from San Jose to Australia. So I told her, I said, I'd like to have your flight information so, God forbid anything happens to you, I know where you are and how you're doing. And she did get me the flight information, and did get me the confirmation with her name and ticket number and everything else. And as it was processed, everything looked a hundred percent.

[00:11:38] Bob: It's a bummer they can't meet before she goes off to the other side of the world, but this way the project will go even quicker.

[00:11:46] David: So she went to Australia, and our conversations went back and forth through Telegram. And also she had gotten me a phone number with the country code for Australia. We talked back and forth, and we talked quite extensively.

[00:12:05] Bob: Do you remember anything about those conversations? Was she just describing her life in Australia, or what were you talking about?

[00:12:12] David: Yeah, the work Monday through Friday that she was doing there, and how much was getting done. And I asked her to send me pictures so I could see what was what, and she told me that she couldn't send me pictures because the company wanted to protect, I guess, the integrity of the work. I forget how she explained it, but it was something that made sense: the interior design work had to be kept confidential until it was all done.

[00:12:40] Bob: Sure, sure. But eventually she does send him pictures.

[00:12:45] David: It was the interior design of this new company office in Australia.

[00:12:50] Bob: Like an office building. Okay. Yeah.

[00:12:52] David: In Sydney, Australia. And just basically assigning the workers, making sure the right products were being purchased and the work was being done. As a matter of fact, at one point she did send me an interior design shot that looked nice. It ended up getting deleted somehow. That's how everything progressed.

[00:13:15] Bob: And that's when the project hits a major, major snafu, something that threatens her entire $1.5 million payout.

[00:13:25] David: There was something that happened in Australia, which made national news, that wages had been increased by the government for all the workers there. And she told me that when the four weeks were done, the job still hadn't gotten finished, and she was having trouble paying the workers because of the increase in wages. So she told me that she was $25,000 short.

[00:14:00] Bob: She is $25,000 short. Now, that's a lot of money, but David has one good reason to believe that Bonnie's good for it. She's already sent him one and a half million dollars, sort of.

[00:14:15] David: In the midst of all this, when she had gotten the job and she knew she wasn't gonna be home, and she didn't have anyone to accept her check at home, she said she was gonna have the company send her paycheck to my address, in her name, so that I could hold it for her. She was showing me that she was being paid for the job. And once I received it, she wanted me to take a snapshot of the check and send it to her to verify that I received it. So I did that.

[00:15:00] Bob: And this was a big check, right?

[00:15:01] David: Yeah, one and a half million dollars.

[00:15:02] Bob: Wow, so you get a one and a half million dollar check?

[00:15:05] David: Not made out to me. Made out to…

[00:15:06] Bob: But still, what is it like to hold a check that's written out for one and a half million dollars?

[00:15:10] David: It felt pretty good.

[00:15:13] Bob: I'm sure. Yeah. Was it printed, like from a bank, or was it handwritten, do you remember?

[00:15:19] David: It was a printed check, signed by the owner of the company, as it looked. And I had called the company to verify their existence, except I had not mentioned her, because I didn't want to do anything to put her job in jeopardy and make it look like she was questionable as far as their hiring her. Once I…

[00:15:43] Bob: Oh, sure. But this company was a real company, the one that wrote up the check, right?

[00:15:48] David: Yeah, it was.

[00:15:49] Bob: Now I would be scared about where to put such a check. Where'd you put it?

[00:15:52] David: I kept it hidden here in my home.

[00:15:56] Bob: Yeah. I would be nervous about that. Okay. But she didn't want you to cash it or anything? She just wanted you to hold onto it?

[00:16:01] David: There wasn't any way I could cash it, because I would've had to forge her name or whatever to do that.

[00:16:06] Bob: Banks don't often go around cashing $1.5 million checks without doing some additional work, too. But now you have this.

[00:16:13] David: I wouldn't cash a $10 check in somebody else's name.

[00:16:18] Bob: So with a $1.5 million check in the safe, $25,000 doesn't sound like such a large request.

[00:16:26] David: She asked me for it, and she said she didn't want to ask any of her other friends or anybody that she had known or worked with or anything else, because she didn't want them to get involved in any of her private business. Because over the, what was it, February, March, April, May, June, July, August, over the seven months that her and I had been communicating, she said that she wasn't sharing any of this information with anyone else that she knew, except the people that were involved in initially getting the contract.

[00:16:58] Bob: But you had told her you could only access $5,000 a few weeks earlier. Right? So…

[00:17:02] David: Yes, but I eventually let her know, seven months later, that I did have this home equity loan that had a ceiling of $25,000. And she said, if you can get the money and Bitcoin it to me, then I'll get it right away, and then I can get ready to come back home. And I tried many different ways to convince her to go other ways to get this money. I tried to tell her that she could probably get a credit card and get a cash advance on her own for these different things. And she said, I don't have that much money in my account that I can do something like that.

[00:17:41] Bob: So ultimately, David agrees to send her $25,000. In this case, because she's halfway around the world, he agrees to send it via cryptocurrency.

[00:17:53] Bob: Wow. Okay. So, sending $25,000 through a Bitcoin machine, that kind of takes a long time, doesn't it? You're feeding hundred-dollar bills into the thing?

[00:18:00] David: Not really. No, it takes maybe 10, 15 minutes.

[00:18:03] Bob: Do you remember how you felt? 'cause did you have it all in cash at that point?

[00:18:07] David: Yes.

[00:18:08] Bob: Again, walking around with $25,000 in cash is something that would make me nervous. Were you nervous?

[00:18:12] David: Not really. Years ago, I worked as a conductor on the Long Island Railroad in New York, and there were days that I had thousands of dollars of cash on me.

[00:18:24] Bob: And then things get confusing very quickly. Somehow, another big transfer gets taken out of David's home equity line of credit. The bank notifies him. He now owes nearly double the money, and that triggers something in David, something very dark.

[00:18:41] Bob: When this money disappears or now suddenly you owe $48,000 on your HELOC, what are you thinking about Bonnie?

[00:18:48] David: As soon as I decided to go ahead and claim that this money was fraudulently taken from me, my idea was this person really scammed me.

[00:19:00] Bob: Do you remember the moment when you went from thinking, this is a person I might move in with, I'm in love with, to, this is actually a criminal? Do you remember that moment and what it felt like?

[00:19:09] David: It was. It was, and still is, devastating.

[00:19:11] Bob: Sorry.

[00:19:12] David: I put as realistically as possible, everything I had into her and all my trust and everything else.

[00:19:22] Bob: Did you try to reach out to her when you realized it was a scam?

[00:19:24] David: Yes.

[00:19:25] Bob: And what happened?

[00:19:25] David: She had told me that it wasn't a scam. And you know that at that point she was saying you could do whatever you want if you don't wanna believe me. So it was just tearing me apart.

[00:19:36] Bob: And at this point, not only have you spent your entire home equity line of credit, which is what you needed for emergencies, but basically you owe the bank double that amount now, right?

[00:19:49] David: Yeah.

[00:19:50] Bob: So David goes to the police. He gives them hours of chat logs, all sorts of other evidence, and then he shows them the videos of Bonnie, or whoever that was.

[00:20:02] Bob: Do they think that they were artificially generated?

[00:20:04] David: That's what the sheriff's office finally told me,

[00:20:07] Bob: that it was an AI video or something like that, right?

[00:20:09] David: Yes.

[00:20:10] Bob: But it looked real to you, right?

[00:20:11] David: It looked very real.

[00:20:12] Bob: Is it unnerving to think that somebody can make, out of thin air, out of ones and zeros, a video that looks like a woman who's having a relationship with you? Is that scary?

[00:20:20] David: There's the fact that there's so much being trusted to AI right now, whether it's through voice communications or video communications, that it's almost impossible to tell whether or not you're talking to a real person.

[00:20:39] Bob: And I know that's part of the reason that you and I are talking, because I think people need to know that this is going on out there. I'm sure other people would find themselves in the same circumstance as you. They'd get a video like that and they'd think it was real. Right?

[00:20:51] David: Yes.

[00:20:53] Bob: Fast forward to today and David doesn't have any idea what's going on with this criminal case or with his debt to the bank.

[00:21:01] Bob: Okay, where does it stand right now, with the police?

[00:21:03] David: They told me that they were able to trace any monies that I was able to show them, even through the Bitcoin transactions, to Nigeria. And they said once it's out of the country, there's nothing that they can do.

[00:21:16] Bob: Right.

[00:21:17] David: Now, they said if it gets to the point that it's a high enough dollar amount, with enough people involved that report these things, they can go outta the country to try to do something about it. But they said until it gets to that point, my dollar amount alone is not enough for them to do anything more than what they could do til it left the country.

[00:21:43] Bob: So as of now, you don't have much hope of ever getting any of your money back, right?

[00:21:48] David: No.

[00:21:50] Bob: The bank is investigating, but they're still sending you bills? Is that where it stands?

[00:21:53] David: They're sending me bills. With interest.

[00:21:56] Bob: Oh, God, yeah.

[00:21:57] David: On that $48,000. But I just received a communication from the bank saying that they have stopped, they don't want me to access my home equity, my line of credit.

[00:22:14] Bob: David, read a little of that letter to us.

[00:22:17] David: Thank you for relying on us to meet your borrowing needs. We work hard to deliver legendary customer service, and we're committed to keeping you informed. Today we are writing to let you know that we've proactively, it's not really proactively, it's like reactively, placed your account on hold, as we suspect you were a target of fraudulent activity. And in bold letters it says: We put your line of credit account on hold. While we are working to protect your account, no additional funds may be withdrawn from your line of credit. We recommend that you visit your local store to further verify your recent account activity. Your local store will be able to assist you with the next steps if any fraudulent activity is identified. And of course, this is after being reported in June. They're just sending me this letter now, which is how many months later.

[00:23:17] Bob: That is dripping with irony and the very definition of closing the barn door after the horses have run out.

[00:23:22] David: I think the barn burnt down.

[00:23:23] Bob: The barn burnt down.

[00:23:25] Bob: And so David is in a pretty bad spot right now.

[00:23:30] Bob: How, I know this is all still pretty, pretty fresh to you. How do you feel about it today?

[00:23:34] David: Making me physically sick.

[00:23:36] Bob: I'm sorry.

[00:23:37] David: Because, number one, I don't know where the money is gonna come from to pay it back, if the bank gets to the point of making me pay it back. And number two, I don't wanna be in a situation as a senior of losing my home. Prior to this, because of my involvement with community and volunteering, I knew that before the two hurricanes came in last year, there were over a hundred seniors living in their cars just here in Pasco County, Florida.

[00:24:09] Bob: Oh God.

[00:24:10] David: And now, since the hurricanes, there's over 300 seniors living in their vehicles here in Pasco County, just due to displacement from the hurricanes.

[00:24:21] Bob: But you're afraid you might become one of them?

[00:24:23] David: Yes.

[00:24:24] Bob: David is worried about being another senior citizen living in their car as if his life had been destroyed by a hurricane. That's not a bad metaphor, really.

[00:24:35] Bob: There's another part of this that we haven't talked about yet, but you thought you were going to have a life with this person. And suddenly, not only is she a criminal, but now you're alone again. How does that feel?

[00:24:46] David: It feels pretty bad. I've outlived my whole family.

[00:24:49] Bob: So now you feel like you're all alone. I'm really sorry. That's really sad.

[00:24:53] David: Yeah.

[00:24:53] Bob: But obviously that's part of what happened here, right? You were looking for companionship, like everyone is. And that made you vulnerable, right?

[00:25:00] David: Yeah. I mean, you would like to have someone be part of your life that's somewhat of a match, so that we have something to look forward to for whatever amount of years that you have here on the earth. The unfortunate thing is, people don't need a gun anymore to rob you. All they need is a computer.

[00:25:24] Bob: All they need to rob you is a computer, and often a story. But now there are so many tools available to make those stories far more realistic. We have no way of knowing if the videos that David received were AI-generated deepfakes or something else, but we wanted to talk about the possibility with an expert. So we have Ricardo Amper here today. He is the founder and CEO of Incode Technologies, which builds software to detect AI fakes. I asked him first about David's situation.

[00:25:59] Ricardo Amper: It is very unfortunate, but this is actually very common and online romance scams are happening more and more. It's a $1 billion industry right now, and we think it's gonna be a lot more going forward.

[00:26:14] Bob: One thing to note about David's story is that after that initial video call, most of the other videos that she sent were one-way clips. In other words, not interactions. That's a common tactic.

[00:26:27] Bob: Something I've heard from other people who are victims of romance scams using fake videos is this deal where the criminals generate a few moments of video, enough to be persuasive, and then, oh, the internet's going out, let's switch to chat.

[00:26:39] Ricardo Amper: Yeah.

[00:26:40] Bob: Tell me why that happens.

[00:26:41] Ricardo Amper: That happened two or three years ago, because the technologies were not good enough to be able to convincingly sync your voice to the lips. A deepfake is just AI-generated video: it has to match your lips, it has to change your voice, and it has to look like the identity they're trying to portray. And two years ago, you could pull that off for just a little bit of time, and then there would be very clear tell signs that your voice was not matching with your lips.

[00:27:13] Bob: So if someone tries to send you a series of short video clips and offers up excuses to refuse a video call, well, that's a sign of trouble. But unfortunately, warnings like that are becoming outdated, Ricardo says.

[00:27:28] Ricardo Amper: But the scary thing is that now you can sustain an hour-long conversation and you cannot tell. So that tell sign, which happened to us, which is that it's not gonna work for a long time, that is now a solved technical problem. Now you can have an hour conversation and not even know.

[00:27:46] Bob: There was a time when we speculated that there would be these deepfakes, that people would create very realistic video versions of lovers or executives or whatnot, but it was hard to really spot them out in the world. But when you guys contacted me, you referenced that there were 2,000 documented incidents of deepfakes being used in crimes, broadly speaking, like this. Has this kind of deepfake video crime now come of age? Are we really seeing it out there in the wild?

[00:28:13] Ricardo Amper: Deepfakes in general are becoming the number one attack vector in many industries. And what we're seeing is that, at least right now, one third of all the attacks that we're defending against are deepfakes, where people, and now not even just people, AI agents too, are portraying themselves to be someone they're not. They're trying to fool banks and these online sites into thinking that they're someone else, and they're opening accounts in other people's names, or they're taking over a bank account that they might not own.

[00:28:54] Bob: The most alarming part of this story is that criminals are using our very human nature against us.

[00:29:02] Ricardo Amper: And it's scary, because biology taught us to believe the person we are looking at and the voice that we're hearing. And this is not really about your instinct. It is more about how technology is evolving in a way that humans cannot even detect it.

[00:29:19] Bob: Years of evolution have told us, here's how you know that someone's real: it's by seeing them, seeing their lips move. It's by hearing their voice. This is a really big change for people to absorb in a very short span of time, that we're going to have to change the way we verify who someone is, right?

[00:29:36] Ricardo Amper: And that's the problem, right? Because it is not about how sophisticated you are; it's more about making sure that the companies or the sites that you're dealing with adopt this AI technology to verify your identity. And the problem with this is, we work with Homeland Security as well, and even they have issues when, for example, pilots are verifying their identity to be able to fly private, and all of this is done in a digital way. There's very little they can do unless they adopt some of these new AI technologies.

[00:30:12] Bob: Deep fake attacks are becoming so common that, well, Ricardo himself was a victim.

[00:30:18] Ricardo Amper: One of the first ones that we saw was actually a deepfake attack directed at our company. Our head of treasury, the person that moved the money in the company, got a WhatsApp message from a fake me saying, Andreas, we need to do an acquisition and I need you to send a million dollars to fund an account so that we can secure this initial acquisition. And of course the person said, let's go to a video, and they went to a Zoom call, and looking at an actual live video, there was a fake me talking about how we needed to do this very discreet acquisition. And it was hyperrealistic. There was no tell sign. What happened was, three minutes into the call, it started getting choppy, as if you had bad wifi, and then the person said, let's go into a different video. So they went into a Teams meeting, and same thing: a very realistic person, portraying me, trying to explain what we were trying to do. Eventually it went to a phone call, and our treasurer was convinced that it was actually me asking him to wire money. At the very end, the reason why he didn't was because I never call him for that; it's really the CFO. So he calls the CFO: hey, look, out of an abundance of caution, I'm about to send a million dollars, is that okay? And the CFO said, I didn't know about that. And they ended up sending someone to my house, as I was working from home. And that's how we saw some of the first deepfakes.

[00:31:57] Bob: It was very brazen for someone to attack your company this way, wasn't it?

[00:32:00] Ricardo Amper: It was, yes. That required a lot of guts to do, and I tell you that, but the truth is that it's extremely common.

[00:32:07] Bob: Just to give you an idea of how realistic these things can sound, we asked Incode and Ricardo if they could send us an example of a deepfake phone call, and they did. What you are about to hear is a recording of Jorge Braniff, VP of Fraud and Operations at Incode, talking with a bot impersonating Ricardo. As you listen, think of this: Jorge told us this is a simple agent trained on a single, short voice clip. The agent was provided with a general prompt to ask for a bank transfer. Nothing was scripted; the agent is making up the words as it goes along.

[CLIP]

[00:32:43] AI Bot: I’m calling today regarding and a rather urgent matter that requires your team’s immediate attention.

[00:32:49] Jorge Braniff: What do you need?

[00:32:52] AI Bot: I need to request a money transfer for confidential purchase. This is a strategic project that requires immediate action. Could you please let me know what information you need from me to process this transfer?

[00:33:05] Jorge Braniff: How much do you need for this transfer?

[00:33:08] AI Bot: I need a transfer of $500,000. This is for a critical time sensitive acquisition that will significantly benefit the company.

[00:33:17] Jorge Braniff: Okay, thank you. I’ll let you know when it’s done.

 

[00:33:21] Bob: Ricardo also told us about a scary incident involving a set of companies in Las Vegas, and the key to that attack was a realistic-looking deepfake video.

[00:33:32] Ricardo Amper: One of the things that happened about a year ago that was really scary was in Las Vegas, when someone was successful at taking an IT manager's identity. So the person who's managing the system for one of the biggest hotel chains: they were able to use a live deepfake, meaning the same type of attack that we had, where you can see a person you're interacting with live through a Zoom video. So the person was saying, look, I lost my phone, I lost my computer, I need to get access to this. They got into a Zoom call with one of their colleagues. They gave them access, and that person was actually a highly sophisticated hacker who then took over that system in Las Vegas, and for two weeks, essentially half of Las Vegas stopped. They blocked the system until they got ransom, and eventually they got paid more than $30 million in ransom. The loss that they had was in the hundreds of millions.

[00:34:36] Bob: Wow. That's crazy.

[00:34:37] Ricardo Amper: Yeah. But like this, we're seeing attacks where, like, we work with eight of the top 10 banks, and in a lot of them they're trying to reset the password. They're trying to fool customer success people into thinking they're the real customer. And some of the biggest scams are when someone takes over your account and starts moving money.

[00:35:03] Bob: If all this isn't alarming enough, I think there's something even more dramatic that's going to happen in the scam world with AI. All those banks of employees we've talked about, who serve as frontline callers and texters, who make the initial contact in many scams, well, they're gonna be replaced by AI agents, and that will really allow criminal gangs to dramatically scale up their attacks at virtually no cost.

[00:35:29] Ricardo Amper: And the scary thing is, usually scams were made by humans. We think by next year, half of all the scams in the world in identity, not just in romance but in every part of identity, are gonna be created by some of these gen AI tools that, by the way, cost almost zero money to make. And what is also changing is that it used to be you needed a live human doing this deepfake, tricking you. But now, with the advances of gen AI, this is starting to be done by AI agents who are responding to you in real time, who can reason, like you can reason with a ChatGPT or a Claude. So you're talking to that AI agent, the agent is reasoning, is answering back to you. And what it means is that before, there was some limit to the scale of the attack, because you needed real human beings to do that. And now there's not gonna be a limit. There can be a million agents deployed to try to trick people, and essentially that costs almost nothing.

[00:36:38] Bob: Yeah, I do wanna dwell on that point, because for a long time, we've done many episodes about these call centers where there might be hundreds of people making phone calls day after day, trying to begin the process of making someone into a victim of a crime. But what you're saying is, instead of buildings full of people making phone calls, this is just gonna be a computer doing this.

[00:36:59] Ricardo Amper: This is just a computer doing that. And it's interesting, because romance scams are particularly hurtful: it's all about creating connection, and these are just agents who are trained at responding very effectively to that connection. But another one that we're seeing a lot is when people are interviewing for jobs. There are people who are holding multiple jobs at the same time, collecting multiple paychecks, and it's really just all fabricated.

[00:37:32] Bob: Okay, so what can you do about all this? Let's go back to David for a minute. We showed Ricardo the videos that David had been sent. Were they really AI-generated? He told us that you often can't tell just by looking anymore. Some AI videos, like the ones in this example, can be extremely convincing to the general public, and even trained professionals can get it wrong on the first try. So he doesn't recommend relying on your eyes to spot the fake.

[00:38:01] Bob: Two years ago, as you're saying, we would be telling people, here's how you spot a deepfake: you look for bad fingers, or something like that. I don't think advice like that is really helpful anymore, because the technology is advancing so quickly. So instead, I just wanna leave people with big impressions, one of which is: this is real, and you, listener, might encounter this someday soon. Right?

[00:38:19] Ricardo Amper: Exactly. Right.

[00:38:20] Bob: What do you want people to know about this today?

[00:38:22] Ricardo Amper: The first thing is to recognize, first of all, that the reason why there's a lot of misinformation is because there's a lot of shame involved. This happens very frequently, and we protect a bunch of companies, the dating sites and similar types of companies. People are very ashamed of it, and the first thing I would tell your listeners, your audience, is that you shouldn't be ashamed. This is happening all around, and this is where technology has evolved far beyond what human intuition can detect. So what I would tell those people is: make sure that when you're engaging on some of these websites, you have to be able to verify your identity first. And it's not that you don't trust. As long as that's the modus operandi of the site, then I think there's a much lower likelihood of these types of scams, because what they do is they take your picture, and they make sure that it matches the picture that you have on your driver's license. If they're good sites, they hired a vendor like us, where we would in real time be detecting the deepfakes. And so it becomes a much safer environment. And this is happening not just for romance; it's happening for Airbnb and car sharing, and every community where you have to be able to trust the other person. And so that's the most important thing.
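(EDITOR'S NOTE: To make that verification step concrete, here is a minimal sketch of the kind of selfie-to-ID face match Ricardo describes, using the open-source Python face_recognition library. The image file names are hypothetical, and this shows only the matching step; real identity-verification vendors layer liveness and deepfake detection on top of a simple embedding comparison like this.)

```python
# Minimal sketch: does a selfie match the photo on an ID document?
# File names are hypothetical placeholders.
import face_recognition

# Load the ID photo and the freshly captured selfie
id_image = face_recognition.load_image_file("drivers_license.jpg")
selfie_image = face_recognition.load_image_file("selfie.jpg")

# Compute 128-dimensional face embeddings for each image
id_encodings = face_recognition.face_encodings(id_image)
selfie_encodings = face_recognition.face_encodings(selfie_image)

if not id_encodings or not selfie_encodings:
    raise ValueError("No face found in one of the images")

# Lower distance means the two faces are more likely the same person.
# 0.6 is the library's conventional default tolerance, not a
# production-grade threshold.
distance = face_recognition.face_distance(id_encodings, selfie_encodings[0])[0]
print(f"Face distance: {distance:.3f}")
print("Match" if distance < 0.6 else "No match")
```

(On its own, a match like this proves only that two photos show the same face. That is why the deepfake-detection layer Ricardo describes next matters: a fraudster could present a generated face to both images.)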

[00:39:47] Bob: Can you, I realize I'm asking a question that's a years-long answer, but I'm wondering if you could boil it down for us, just to give us a short version: how does technology like yours work? How do you detect deepfakes? If we can't do it as humans, how does your computer tell the difference between what's real and what's a computer?

[00:40:04] Ricardo Amper: We are at an age where it's really AI; it's called defensive AI. It's AI that is protecting against AI scams. What we do is we have a pretty extensive fraud lab where we have about 120 tools, some of which we develop, some of which we get from the dark web, and even just the normal video generation sites that now exist. And we create millions of deepfakes. Then we train AI models to be able to detect them, and they detect very little pixel-by-pixel tell signs that a human eye cannot see. If you train them with enough data, and you have enough samples, because it's not just the ones that you generate but also the ones that we catch live with customers, eventually you create a really good model. So when someone is trying to verify their identity at these sites, once you take your selfie, those models are looking at those little pixels in a very microscopic way, with very complex algorithms that the AI invented, to be able to tell. And so that's how it works.
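(EDITOR'S NOTE: As a heavily simplified illustration of the "defensive AI" approach Ricardo outlines, generating fakes and training a model to spot them, here is a sketch of a binary real-vs-fake frame classifier in PyTorch. The random tensors are placeholders for labeled real and generated frames; a production detector trains on millions of samples with far richer architectures.)

```python
# Simplified sketch of "defensive AI": train a classifier to separate
# real video frames from generated (deepfake) frames.
# Random tensors stand in for a labeled dataset of frames.
import torch
import torch.nn as nn

class FrameClassifier(nn.Module):
    """Tiny CNN that outputs one logit: real (0) vs. fake (1)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # pool to one value per channel
            nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):
        return self.net(x)

model = FrameClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(100):
    # Placeholder batch: 16 RGB frames, half labeled real, half fake.
    # In a real pipeline these would come from captured video and from
    # the generated deepfakes the fraud lab produces.
    frames = torch.randn(16, 3, 64, 64)
    labels = torch.cat([torch.zeros(8), torch.ones(8)]).unsqueeze(1)

    logits = model(frames)
    loss = loss_fn(logits, labels)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

(The pixel-level "tell signs" Ricardo mentions are what the convolutional filters learn to pick out; the model's edge over the human eye comes from the volume and freshness of the fake samples it trains on, not from any single rule.)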

[00:41:21] Bob: So you make deepfakes that you then detect.

[00:41:24] Ricardo Amper: Exactly right.

[00:41:25] Bob: Yeah. That's fascinating.

[00:41:26] Ricardo Amper: It is, it is interesting. We have to advance the art of how you create some of these, and also be very active on the dark web, understanding what the newest tools are that are coming out. And then, with that capability, you can train models to protect against it, which is interesting.

[00:41:43] Bob: And you must have to keep the deepfakes that you create very well protected, also.

[00:41:48] Ricardo Amper: Extremely well protected, yes. And also, one of the other things that is helpful in these identity scams is that there are now ways where, without sharing data and by respecting human privacy, we can understand that someone's identity is being misused to do some of these attacks, and we can react in real time. So let's say that all of a sudden someone is trying to be Bob Sullivan and is trying to trick some of the main banks. We can detect that that identity is being misused. And so that means that the next time even you try to verify your identity, we're gonna be extra careful. The model is gonna throw up extra protections to make sure that you're really you, and not someone using your identity.

[00:42:41] Bob: And certainly for a bank, that's very critical, but it's also critical to someone like David. So I'd like to leave it here: for the next David, someone who is out there specifically looking for romance in any of the various ways we do that nowadays, what kind of suggestions would you have to help avoid encountering a situation like this?

[00:43:00] Ricardo Amper: The first one is: meet people on sites where one of the first things that you do is verify your identity, and make sure that the person that you're interacting with has also verified their identity. And I think, as you said, there are romance sites and other types of dating sites where you have the choice of interacting with someone that did or someone that didn't. My advice is: make sure that you did. The second thing is, if very quickly they haven't done so and they want to go into video, you have to try and think: look, this is my biology tricking me into thinking this is real. Please verify first, and then we can engage. It could be a little bit awkward, particularly if you're new to these sites or you've never interacted with these technologies, but it is having a lot of impact. It is protecting these romance sites from fraudsters, but it's also, for example, protecting even adult entertainment sites by making sure there are no underage people being used. So actually, the more you encourage others to use it, the safer the world is.

[00:44:09] Bob: That makes sense. It's certainly not romantic to say, Hey, I don't wanna chat with you until I see your driver's license or something like that. But if we normalize that, everyone will be safer.

[00:44:16] Ricardo Amper: Everybody's safer. And I think it's also important that you don't use websites that are not encouraging that. It's another way to make sure that there's an economic interest for these companies to do that, and the ones that don't get marginalized.

[00:44:33] Bob: Normalize asking for ID when you start dating. That's certainly something to think about.

[00:44:39] Bob: Okay. Again, say I'm David. A typical scenario we hear about here all the time is that someone's not necessarily on dating site.com, but they're using Words with Friends, or just using Facebook, or something like that. And a person approaches them somehow. And sometimes these can even be people they've played Words with Friends with for four months. So you're not in a situation where there's a sign-up, a verification process, whatnot. What would you suggest in that situation? What could somebody like David say to someone who comes into their Facebook Messenger and says, “Hey, let's start a conversation”? What would you suggest as a verification procedure?

[00:45:11] Ricardo Amper: It is very tricky, because when you have a more freeform way of meeting someone, the social media companies haven't necessarily adopted that. The first thing I would say is, some of these social media websites already have what they call verified profiles. Make sure it's a verified profile. The second thing I would say is, if they're from the area, trying to meet them, trying to encourage a meeting, is important, because the more you get involved, the more connection, the more likelihood you get scammed. And unfortunately, out of all the fraud that we see, the one that has the highest ticket, the one that costs the most per person, is romance scams. But look, the industry's changing. Most of the video conferencing companies are adopting technologies like this. So, for example, as we're talking right now, Bob, it wouldn't be rare, maybe in a year, that you would have a check mark that says you're the real Bob Sullivan, I'm the real Ricardo Amper. That is gonna make it a lot easier and a lot less awkward, because every conversation is just gonna be verified like that. Right? You verify once, you show your ID once, and then the next time you're in the Zoom, it shows that you did, and it's a safer world.

[00:46:27] Bob: Despite all these scary implications, Ricardo actually remains optimistic about the future of AI. But it's going to be a bumpy ride.

[00:46:39] Ricardo Amper: I think it's difficult to understand. Gen AI, I think, is a wonderful revolution that is gonna make all of us more productive. But in the short term, there's gonna be a lot of fraud. I think people don't understand the scale at which the combination of impersonating someone perfectly in video and audio, in combination with AI agents doing the conversation, is gonna become a big mess. And I think there are three things that people have to be aware of. Number one, as we said: make sure that the website that you're interacting with has, hopefully, implemented this technology. Number two: we are working with government at the local and state levels, where we partner, for example, with the DMVs, and we can use that picture that the DMV has of you. In the future, you won't have to necessarily show your ID; it's just gonna be first name, last name, date of birth, and just take a selfie, and we'll be able to verify you. And so it's a lot better than having to pull out your driver's license. And the third thing is, the more you encourage companies to use these types of technologies, the more of a trust graph, we call it, there is, where all these companies are connected so that together we can fight fraud. So it's gonna get very bad in the next two or three years, but I think in 10 years it's gonna be a much safer and very productive world.

[00:48:03] Bob: Then we'll have a lot to talk about in the next 10 years, won't we?

[00:48:06] Ricardo Amper: Absolutely. Absolutely.

[00:48:09] Bob: For The Perfect Scam, I'm Bob Sullivan.

(MUSIC SEGUE)

[00:48:15] Bob: If you have been targeted by a scam or fraud, you're not alone. Call the AARP Fraud Watch Network Helpline at 877-908-3360. Their trained fraud specialists can provide you with free support and guidance on what to do next. To learn more about the Fraud Watch Network volunteers and the fraud survivors they've helped, check out the new video series, Fraud Wars, on AARP's YouTube channel. Our email address at The Perfect Scam is: theperfectscampodcast@aarp.org, and we want to hear from you. If you've been the victim of a scam or you know someone who has, and you'd like us to tell their story, write to us. That address again is: theperfectscampodcast@aarp.org. Thank you to our team of scambusters: Associate Producer, Annalea Embree; Researcher, Becky Dodson; Executive Producer, Julie Getz; and our Audio Engineer and Sound Designer, Julio Gonzalez. Be sure to find us on Apple Podcasts, Spotify, or wherever you listen to podcasts. For AARP's The Perfect Scam, I'm Bob Sullivan.

(MUSIC OUTRO)

END OF TRANSCRIPT

The Perfect Scam℠ is a project of the AARP Fraud Watch Network, which equips consumers like you with the knowledge to give you power over scams.

 
