‘National Geographic’ Photographer Paul Nicklen Warns About Social Media Impostors, Part 2

Both the famous photographer and a Navy veteran reach out to Meta to stop fake Facebook accounts being opened in their names

Subscribe: Apple Podcasts | Amazon Music | Audible | Spotify | TuneIn

In part 2, Kelly and her husband Ryan, who is retired from the Navy, are reeling from the theft of his Facebook account and fear it may be used for military-related scams. World-famous wildlife photographer Paul Nicklen is fighting his own battle against scammers impersonating him on social media. His impostors have stolen thousands of dollars from fans who think they are donating to his causes or carrying on a romantic relationship with him. To make matters worse, both Kelly and Paul have reported impostor accounts to Meta and have been told those fake accounts aren’t violating the terms of service.

Full Transcript

[00:00:01] Bob: This week on The Perfect Scam.

[00:00:03] Paul Nicklen: Why are these companies that are drowning in billions and billions of dollars, the biggest, best tech companies in the world, with the biggest and brightest minds, not able to tackle something so simple?

(MUSIC INTRO)

[00:00:18] Bob: Welcome back to The Perfect Scam. I’m your host, Bob Sullivan. It's easy to pretend you're something or someone you're not online. Social media has made becoming an impostor even easier still, and impostor accounts are often used for crime. When we left our story last week, Kelly Anderson and her husband Ryan, a retired Navy corpsman, are reeling from the theft of his Facebook account. A criminal has hacked in, changed Ryan's profile picture, and Kelly is worried a series of military-related scams might soon follow. Meanwhile, impostor scams can take many forms, and one of the most insidious is when criminals impersonate celebrities. Paul Nicklen is a world-famous wildlife photographer with a massive Instagram following, thanks to his many incredible stories for National Geographic. He faces a near constant stream of impostors and he just can't seem to get social media companies interested in fixing the problem. In Paul's case, his impostors have stolen thousands of dollars from fans who think they are donating to his causes or carrying on a romantic relationship with the famous photographer. To make things worse, both Kelly and Paul have reported impostor accounts to Facebook and have been told those fake accounts aren't violating the terms of service. So before we find out what happens to Kelly and Ryan, let's go back to Paul and see just how frustrated he is.

[00:01:48] Paul Nicklen: It's still so cathartic to block and report these people, but it seems like you're, you're blocking and reporting into a void. They run it through some algorithm and you know, unless there's proof or whatever, they, they don't shut it down and uh it, it's just going to get worse and worse and worse for everybody if, if it's not dealt with, but yeah that's, I'm glad, I'm glad you're digging deeper into this, and I appreciate you guys taking this on. We need, we need to start peeling back the layers of this problem, and I don't want to give up on social media, but I'm getting pretty close.

[00:02:18] Bob: Why is the problem of impostors so hard to solve? It's certainly not new.

[00:02:25] Bob: The New York Times did a big expose on this problem 6+ years ago, and it seems like it's only gotten worse. How does that make you feel?

[00:02:34] Paul Nicklen: Yeah, it's, I just don't know. I mean it makes me feel terrible obviously, but why are these companies that are drowning in billions and billions of dollars, the biggest, best tech companies in the world, with the biggest and brightest minds, not able to tackle something so simple? You know why can they not put some code on every account that there's only one and true you. I mean the checkmark was supposed to help, but obviously it's not. I mean these scammers should get flagged the second that something comes up. Instagram does seem to be floundering a little bit. It's not what it used to be, but still, it, it still makes no sense that the greatest minds in the world can't write a bit of code that, that makes these impostors uh traceable and held accountable, you know. It's just but I'm, I'm not a tech guy. I'm an artist. I'm a photographer. I don't know any of this stuff, but I'm just shocked that it can't be dealt with more, more simply.

[00:03:27] Bob: Back in 2018 when the New York Times ran that story, here's what Facebook told the newspaper, "We take the reports really seriously. We're not going to get it 100% right every time, obviously." When we asked Meta, Facebook’s parent company, about our story, the firm sent us a statement that read, in part, quote: “Scams, fraud, and abuse aren’t new challenges. They happen online and offline and across the entire technology industry. But given the reach and impact of Meta’s technologies, we know we have a big role to play in combating it.” They called the scam Paul Nicklen is enduring “Celeb-Bait” – a mashup of the words celebrity and bait – and told us that from April through June 2023 – that’s just three months – quote, “we removed 676 million fake accounts on Facebook, 98.8% of these were actioned before they could be reported.” I told Paul about Kelly and Ryan and what they're going through, and to some extent, misery loves company.

[00:04:33] Paul Nicklen: Yeah, okay, well that's, that's, makes me feel a tiny bit better, but a whole lot worse, right? It's just like if that's happening to him and a good man who's doing service and you know that's being used to hurt people. I mean it's just, come on. Like why can the tech giants not get a handle on it? I don't know.

[00:04:53] Bob: When we left Kelly's story, she had just found out her husband's Facebook account had been hacked. After spending hours trying to get back control of the account, she ends up going late to work that day for her job as a photographer for Disneyland. But that is only the beginning of the problem.

[00:05:10] Kelly Anderson: We noticed the next morning where the person who hacked in uh started sending out hundreds of Facebook messages. Um, my husband woke up to message upon message upon message and phone call upon phone call from friends and family saying, "Hey, I think you got hacked." And we go, "Yeah, we already know. Uh but please start reporting." So that's when we started putting out the mass message asking everyone to report. Um, and I noticed that I also got a message from this person, and the message said something to the effect of, "Hey, Babe, can you do me a favor?" Well, already knowing he's been hacked, the clue would have been that he never calls me, Babe. That's not a pet name we use.

[00:05:56] Bob: So yes, the hacker writes to Ryan's wife to say hello, but at some point, realizes that it's probably not a good idea.

[00:06:04] Kelly Anderson: So, uh before I could even wake up and respond, 'cause I'm on the West Coast and I work nights, so I wake up later than other people. Um, I had already been blocked, before I even woke up, before I could respond, this person had realized that I am close to the hacked account, um, probably because Facebook said I was his wife. And I was blocked, and I am still blocked. I can't see anything going on which makes this even harder. I can't monitor what's going on. So I am now relying on friends and family who regularly and diligently update me anytime they see anything weird.

[00:06:45] Bob: So Kelly can't even see what the criminal is doing with her husband's account now. But she hears about it from plenty of friends. The criminal has reached out to almost all of Ryan's friends within a day. Why?

[00:07:00] Kelly Anderson: Um, everybody had said some form of they were being asked for money, yes. Uh, it was, "Hey, can you do me a favor? I need $100, um, into, like, just until my paycheck." But I was getting sent screenshots, and they all were very, very predictable script. Um, it, luckily the, the hacker was not doing anything unusual as far as hacking. It was something that people were able to recognize fairly quickly, um, so everybody got some form of, "Hey, can you do me a favor? I need 100 bucks. Can you spot me until payday?" Uh, and then anybody who actually engaged, 'cause I have a lot of friends who like to mess with people, uh, anybody who actually engaged was then given some alternate route of, "Oh, uh, can you send it through Meta Pay? Can you send me Apple gift cards? Can you send me..." so it, that's when it really started to get even more obvious. Um, and they, my friends would string this person along as long as they could to try and gain any information. Uh, but it definitely got back to, "Hey, I need 100 bucks." So it wasn't even asking for a large amount of money, uh but they definitely went the, the wide route of just contacting everybody.

[00:08:32] Bob: And when the initial asks for money don't work, the criminal moves on to other scams. Not direct messages, public postings.

[00:08:42] Kelly Anderson: Then there was a post trying to sell a car, and then shortly after that the next day, there was a post trying to sell French bulldog puppies.

[00:08:53] Bob: Fortunately, most of Kelly's friends pick up on what's happening right away and report the scam messages and posts to Facebook, but that doesn't have the intended effect.

[00:09:04] Kelly Anderson: Um, and so those two uh posts were also individually reported by several people, and I got a response from one person saying that those specific posts had been taken down, but still nothing had been done about the account itself. Um, so it looks like there is some response to individual posts, but they're still not, not responsive at all, and there's nowhere to reach out to about the account as a whole.

[00:09:33] Bob: And then the hack becomes even more personal.

[00:09:39] Kelly Anderson: And at one point they actually contacted my son through his Messenger Kids account. Uh, so that was a whole new level of uh scary, because we, at that point looking back at it, I'm glad it was somebody who was trying to scam people out of money and not a child predator, uh but my son came running down the stairs the second day and said, "Dad, didn't you say you got hacked? Your account is, is texting me." So that was really scary. That was a whole new blow, a new like close to home kind of violation of, okay, now I have to worry about my son's account and everything too. And luckily all he had was the Messenger account just to keep in contact with me because I work so far away, um, and with his grandparents. So he doesn't actually have a full Facebook account. It's kind of an offshoot from mine. But yeah, that was, that was not a comfortable feeling.

[00:10:39] Bob: I, I can't imagine that it wouldn't be just incredibly disturbing.

[00:10:42] Kelly Anderson: It was. It, it was very, very disturbing.

[00:10:45] Bob: So while Kelly is digesting that, she's also communicating with dozens of friends, making sure none of them send the criminal any money.

[00:10:53] Bob: It's one thing to have your account hacked and be hassled, but all your friends are getting asked for money. What does that feel like?

[00:10:59] Kelly Anderson: Luckily, most of my friends are, are fairly internet savvy these days. Um, so other than the, the massive annoyance, um, I was, it, I was grateful that everybody was giving me a, a more informed approach of hey, you got hacked. Um, we were a little bit worried that some of the less tech savvy friends and family might get targeted. So we actually put out a kind of a, a special reach out to the grandparents, to the older aunts and uncles, um, and anybody we could think of uh, so I was telling my parents, like "Hey, let Grandma know that Ryan got hacked. Let her know not to, you know, not to engage if she sees anything from the account." And my husband did the same thing with his side of the family um, but that was, that was very much something on our mind of hey, this is, this is a thing that we need to be uh cautious of now. Um, and I am still giving updates regularly any time somebody sends me, "Hey, they're doing this now." Uh I put a, a post on my account so I can say like, "Hey, still continue to report. Continue to not engage with my husband's account because it, it's still taken over." It's been probably a month and a half at this point.

[00:12:29] Bob: And, you know, but here, that's very smart that you tried to proactively notify as many family and friends as you could think of, but you're sort of in a race against the criminal at this point, right?

[00:12:39] Kelly Anderson: Oh yeah, yeah.

[00:12:41] Bob: Meanwhile, after the generic posts about selling cars and puppies, the impostor criminal moves on to an even narrower form of the crime.

[00:12:50] Kelly Anderson: We have seen that he has joined a few Facebook groups, uh buy/sell pages, things that my husband would never, would never be joining. Things like selling old cars. One of them that stood out to me was vintage Singer sewing machines. Very, very strange choices for pages to join, but um, from what I've read on forums like Reddit, a lot of times these, these people will go in and kind of do a multipronged approach and join a lot of groups and try and uh, try and scam people that way. Luckily, a lot of group admins will either ban them or report them, but that doesn't seem to do anything for the, the base account itself.

[00:13:42] Bob: So the criminal, pretending to be a Naval corpsman, has joined small, highly specialized groups like fans of a certain kind of sewing machine and posted fraudulent for sale ads trading of course on Ryan's credibility, on the credibility of his uniform, the one he sacrificed so much to wear. Seeing screenshots of her husband posting in these highly specialized narrow groups has actually given Kelly some food for thought about other scam activities she has spotted in the past on social media.

[00:14:15] Kelly Anderson: Seeing the prevalence of how many times that happens that I see and now knowing the, the flip side of it, um, it's, it's weird because in my head something has shifted that oh it's, it's not that person that's doing this. This person is, in all likelihood, a, a victim of hacking. And their account has been hijacked, and they're, they're going around doing this. So I, I've always had this kind of ick factor when I see people, you know, trying to sell animals and things online, and um, there has definitely been a shift of like, oh that person, that literal name that I'm seeing is very much likely not that person. They are, they are probably another victim. So it's, yeah, just seeing the reach of this kind of thing is even more maddening, because it's, okay, obviously Facebook knows what's, what's going on. They're very well aware. They have a dedicated URL to Facebook.com/hacked. Um, but they don't actually offer any help for the number one most crippling thing that happens on their website.

[00:15:30] Bob: Well you know, and you bring up another sort of victim in this situation which I hadn't really considered. Um, so there are people in groups probably right now who think ill of your husband because they see these posts and they say, well I know it's a scam. He's a jerk, right and, and maybe, and you'll never know, and you'll never be able to defend yourself.

[00:15:48] Kelly Anderson: Exactly. Yeah, and it's, it's more maddening because that current profile picture that he has, the most recent one that they updated, which is not a recent photo, it's a 10-year-old photo, is him in uniform, um, with his, with his dad and his younger siblings who were very much children in that photo. They were like 4 and 8 years old. So um, it, it very much looks like you know, it puts a bad name on, on his face, which to the internet at large, he's a faceless stranger. He, he doesn't mean anything to most people uh, but who have that uniform on there is another layer of, um, you know people who already think ill of the military or the people in the military, that's giving them another excuse of you know, negative, negative things to say, negative things to think, and yeah, it, there is that aspect of for the longest time, I didn't even consciously realize that I was, that I was doing it, but when I did see scammers on uh Facebook groups and things like that, in my head, they weren't hacking victims. They were the scammers themselves. I was going, oh, these people are, they don't even care. They're trying to scam people, blah, blah... and I never actually sat and thought about it, and went oh, that's probably not even them. But now I'm, now I'm aware.

[00:17:20] Bob: The incident, of course, only adds to the paranoia that Kelly feels and she spends the next several weeks changing every password in every account in her life and trying to figure out how she would recover other important accounts if she's hacked.

[00:17:36] Kelly Anderson: Yeah, I mean it, it was definitely one of those things that was, yeah, I've been meaning to do all of this for months. Now it's, now I'm being forced to, and admittedly some of the force was just coming from myself. It's from internal anxiety about it, but...

[00:17:58] Bob: But you say that like it's not real. You just went through it, it's very real.

[00:18:02] Kelly Anderson: It is a very real threat. So um, and then knowing that an account so close to mine was, was hacked, that automatically makes me a target, um, that makes me more likely to be a target because there is a successful hacked account in my house, um, or in my household I should say. So yeah, there is definitely this sense of somebody breathing down your neck, like just waiting to target you instead. And the um, the kind of faceless crime that you know internet hacking is, has become tangible.

[00:18:50] Bob: She feels like someone is breathing down her neck. Anyone would. But she also feels like she isn't getting the help she should with this impersonation nightmare.

[00:19:01] Kelly Anderson: Yeah, absolutely. I mean there's, there is no uh, you know, there's no business advantage to Facebook helping uh the everyday person. Um, it, it, which is kind of untrue. I don't know where the cost benefit analysis lies on that, but they, they should absolutely be doing something, even if it was an inbox that you could email and they got back to you in two months or three months or six months. There would at least be some hope and some reason to say, okay, I (inaudible), they must have a huge volume of this but they're trying. Right now, they're just not trying.

[00:19:42] Bob: Kelly feels like they're just not trying. Kevin Long, who you also heard from in the first part of this episode, agrees with Kelly. You might remember he runs a company named Social Impostor, which gets paid by celebrities who are trying to get impostor accounts shut down or trying to get help recovering a hacked account.

[00:20:03] Bob: Here's my question for you and your business. I mean your business exists because they're not doing enough to remove impostor accounts. Would you agree?

[00:20:10] Kevin Long: Yes, absolutely.

[00:20:13] Bob: And when I asked Kevin, who's been doing this for more than a decade, what Kelly could do, what hope she and Ryan might have to recover from this hacked account, he wasn't optimistic.

[00:20:26] Kevin Long: I, I, I don’t have a good answer for you because I, I could tell you what I would have done two years ago, but now they, they don't seem to have a way to resolve those type of issues that's effective. I mean she could spend a lot of money having me try to fix it, but in the end, my uh, my results on those cases like that have gone down as far as being successful in trying to remove those unless that account was verified to start with.

[00:21:01] Bob: If Ryan's account wasn't verified to start with, which for most people costs $11.89, the odds he can recover the account are basically zero. Especially, Kevin says, because of recent changes at Meta that involve the use of artificial intelligence.

[00:21:20] Kevin Long: The, the ability to find the right person at Facebook or Instagram to resolve that has significantly diminished as they have, they have reduced the number of people that are in the, the departments that handle those types of things, because they've replaced them with the AI that's supposed to be resolving these issues before they happen. And they don't. You know they just, it doesn't work. And very rarely, I should say. I can't say, I can't categorically say it doesn't work, but I can say I've seen very few cases where something like this is able to be resolved at all. I mean it's, it's a, you're very lucky if you find someone that will be able to resolve this issue without spending a ton of money right now. And it's unfortunate. And the more that they go to automating the reporting process, the less of eff--, effective it is. And it used to be that you would report it and a person would review it, and now it's re--, reviewed by um, their AI, their automated intelligence system and when you program AI, yeah, it learns over time, but initially if something's off just a little bit, it kicks those accounts and doesn't take them down. And so then you've got a list of things that are being, still being used to commit fraud that there's really very little recourse to be able to have, you know there's not, there's not a plan B. Once you go and the AI kicks it, there's not a plan B to have a human look at it, and they still need to have, in my opinion, they still need to have that human review option because you know in my case, I report hundreds of accounts a day for my clients, and inevitably, there's you know a dozen accounts that get by the AI that don't, don't, the AI for one reason or another, doesn't kick it. They don't like the picture and the profile. They can't see you know the, the name is spelled just a little bit off, so it doesn't meet their criteria on the AI. But in reality, if someone would look at it, common sense would kick in and be like, oh yeah, this is, this is a, a bad account. We should get rid of this. So I think as they rely more and more on AI and less on eyeballs to take care of the problem, it is not, uh, you know in theory you're getting, you're more effective because you're handling a larger volume, but in reality, they are missing a fair number of accounts because the AI hasn't learned yet to accept things that are just a little bit off, one way or another.

[00:24:07] Bob: One of the challenges I've mentioned for both Kelly and Paul, the wildlife photographer, is that social media firms sometimes don't recognize these impostor accounts as operating against the company's terms of service, or at least that's a gray area.

[00:24:23] Bob: Like it's against Twitter and Facebook's terms of service to impersonate someone, right?

[00:24:28] Kevin Long: It is; however, their definitions of impersonation are very narrowly defined, they're not that broad. Um, for instance, the accounts have to have a profile and/or banner photo of the person who is being impersonated. And if you have a, an account that's an avatar for example, just a plain old, um, you know, the face back--, or the outline of the, of the standard account where there's no profile picture, and that person has copied the bio, the content, the name of the person they're impersonating, technically that does not violate their terms of service. And those are the types of accounts that they're using. Now most of the time they use accounts that do have photos on them, and they've taken the photos from somewhere on the internet, or even straight from, from Facebook or Twitter themselves, they do that. So it does, yes, it does violate the terms under most parameters, but if they don't um, if they do something that's a little bit off, like they'll misspell the name intentionally by one letter, or they'll use a 1 instead of an L or a 3 instead of an E. If you're not paying close attention and you just kind of glance at it, you don't notice maybe that the account is off just a little bit. Um, then, then it doesn't violate, or they can say they're a fan page, or they're a parody page, or they're a unofficial page. And they put that somewhere in the about section that no one ever reads. And therefore, it doesn't violate the terms of service anymore, the community guidelines of the networks. Um, and they're very strict about enforcing that as a way to determine whether they take down an account or not.
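
Kevin's point about one-character swaps is easy to make concrete. What follows is a minimal, purely illustrative Python sketch, not anything Meta or Kevin Long's company actually runs; the handles and the substitution table are hypothetical. It simply undoes a few common digit-for-letter swaps so that a lookalike handle collapses onto the real one, which is exactly the comparison a quick glance, or an exact-match rule, fails to make.

# Illustrative only: why "Pau1_Nick1en" reads like "paul_nicklen" to a human
# but not to an exact-match check. The handles and the map below are hypothetical.
SUBSTITUTIONS = {"1": "l", "3": "e", "0": "o", "5": "s"}   # common digit-for-letter swaps

def normalize(handle):
    """Lowercase a handle and undo common digit-for-letter swaps."""
    return "".join(SUBSTITUTIONS.get(ch, ch) for ch in handle.lower())

real = "paul_nicklen"
lookalike = "Pau1_Nick1en"                        # digit "1" where the letter "l" belongs
print(lookalike == real)                          # False: an exact-match rule misses it
print(normalize(lookalike) == normalize(real))    # True: the swap disappears once normalized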

[00:26:25] Bob: So you're telling me that there's a nudge, a bias, whatever word you want to use, towards keeping these accounts up.

[00:26:33] Kevin Long: In my opinion, there is a resistance to wanting to take accounts down because if they take accounts down, that reduces their number of users and they base all of their revenue stream on like number of accounts, eyeballs on accounts, time spent on accounts; that's the data that they use to go to advertisers that pay money to have the ads run on those accounts. So the fewer accounts they have, the less money they bring in, the less they can charge for ads, and I think it's all revenue driven and it's not necessarily driven by what's the right thing to do, in my, in my opinion.

[00:27:16] Bob: Hmm, so okay, maybe there's some vagary around what's a parody account or whatnot, but surely then asking someone for money in a fraudulent scenario is against their terms of service. But do you feel like the social networks don't do enough to stop accounts, even after there's evidence of a crime that's been attempted?

[00:27:35] Kevin Long: Absolutely they do not do enough. And they make it so difficult to report, and they've automated the reporting system to the level where if there's something that's off on your report, just the slightest bit, it gets kicked, and it doesn't get removed. And if once you've reported it one time, then you report it again, and you report it again, and you report it again, they don't take it down. And it's, you know they, their system is set up so that once they've looked at it once and determined it's not fraudulent for whatever reason, mistake or not, they don't want to review it again, and it just automatically doesn't get removed. It's not easy for the average person to report an account because, and do it properly because even if you have evidence of let's say you have a, a screenshot of a direct message that they've sent you that says, "We're building a school in Guatemala. We need $500. Can you help us out?" There's no place for you to send that screenshot to show them that the messenger, you've been messaged by that. And that this is why you think this account is fraudulent. And so they make it difficult to report in a way that's going to have the outcome that everyone desires which is to get the account taken down.

[00:28:54] Bob: Kevin's company used to maintain a list of the most popular accounts for criminals to impersonate, but they've stopped doing that.

[00:29:04] Kevin Long: Yeah, we kind of took that down just because, you know and again, it takes extra time to go out and find that, and I, instead of showing the number of people, or the number of different celebrities that are being impersonated and searching them and finding those accounts and spending the time doing it, because we don't have a way to do it on an automatic fashion, it just became a time suck, and it wasn't worth the time to shoot that. But um, yeah, even back then, I think Taylor Swift was one of the top ones, even though she was not nearly as famous as she is now, um, she has a lot. I mean there, there are a lot of high-profile entertainers that have this issue that either don't know that they have a problem, their social media team is, doesn't care, or there hasn't been enough of an issue, enough of a pain point for them to want to try and reach out and resolve the issue.

[00:29:56] Bob: So celebrities are obvious targets for this kind of impersonation. As Paul told us, impostor criminals send out direct messages pretending they are falling in love with fans or asking fans to donate money, or in some cases, telling fans they endorse a product or a point of view. But Kevin told me about another kind of client that's a target, and you might be surprised to learn about this.

[00:30:20] Kevin Long: Um, I work with a lot of folks in the ministry for example. Um, and so a lot of folks who are the televangelists or the high-profile ministers at the mega churches down to smaller ministries that are um, reliant on their followers to be financially supportive for their organizations.

[00:30:41] Bob: You know, I hadn't thought about the, the church congregation issue, uh but that's interesting to me as you've described it because in a lot of these situations, like in Paul Nicklen's situation, he just says, please know, I would never ask you to send me money. But in a, a church or charity situation, they, they do ask you sometimes to send money, right? So this, the crime scenario is a little bit more true to life.

[00:31:03] Kevin Long: There are a number of um, of my clients who go out and tell people, "We do not solicit funds over social media. Please don't send it in." You know, they'll make those posts to try and preempt that, but with the way that the social media companies govern the um, who gets to see the post, it's not like just because one of these folks makes a post like that, that all their followers are going to see it. It doesn't necessarily show up in their feeds. And they, they have to go to that person's page and scroll through everything that person has posted in order to see that.

[00:31:39] Bob: Those warnings are not very viral.

[00:31:41] Kevin Long: Yeah.

[00:31:42] Bob: Yeah, yeah, in fact, they have an incentive to, to, to make them not viral.

[00:31:46] Kevin Long: Yes, there's, well they ... unless you pay money to advertise, your posts get seen by fewer and fewer people. And then they want to get you on a plan where you spend more and more money to advertise more and more in order for more of your posts to be seen. So it's just, it's all about revenue generation for the networks. They weren't, they weren't getting revenue from ads.

[00:32:13] Bob: So, uh yeah, so a person is being impersonated which hurts their reputation, they, they're having money stolen from their followers, and then they're expected to pay to advertise to stop that, or to pay you to pull it down? That seems unfair. In your work, are the social media companies allies or foes or, or both?

[00:32:35] Kevin Long: That's a good question, and it's a tough question to answer because they outwardly they, you know, they've got it in their policy that you can't do it. Um, the enforcement of that policy is um, uh sketchy at best, let's put it ... or inconsistent maybe is a better word. Inconsistent at best. You know in my mind, if I were the networks, I would be bending over backwards to help me do what I'm doing, 'cause it doesn't cost them any money, right? My clients are the ones paying. The networks aren't paying me. They should be helping me because I'm helping them enforce their community standards, but it kind of goes back to, do they really want to remove accounts? And...

[00:33:19] Bob: Yeah. My head is spinning on the fact like well the first thing I had written down which we've already run past was, why should people pay you? Shouldn't, shouldn't they just take your bill and hand it to Facebook and make Facebook pay if they, if they could?

[00:33:32] Kevin Long: They it's uh, yeah. I mean...

[00:33:36] Bob: (chuckles) You don't have to answer that question.

[00:33:39] Kevin Long: Yeah, but they uh, you know it's not, it's a valid, a valid question as why is, why are the networks not doing more?

[00:33:47] Bob: Especially because clearly the criminals are doing more.

[00:33:53] Bob: I, I think a lot of people might have encountered this and not even realized it. On, on Facebook, for example, I will occasionally get a friend request for someone and I'll think, wait a minute. Aren't we already friends? But I'm not really sure. It's kind of a lot of trouble to figure that out. Um, so maybe you just hit accept and then things, things go from there. Um, why is it so easy to, to do this? Why is it easy for the criminals to do this?

[00:34:21] Kevin Long: It's uh, yeah, it's very easy to set up an account which um, they've got that down to a, a science. They've, they've even got programs that autogenerate accounts now, um, they go through and they find people who they think are uh probably not going to pay attention to it or don't care about it, and they generate hundreds of accounts using those persons' names. And they then send, you know it's a coordinated effort. They, they spend, they've got programs where it's automated. They just go out and they look at the real followers on that person's real account, and they just start contacting them uh directly through direct message or messenger or uh depending on the, the network, their, whatever their uh messaging feature is, they, they start sending out auto responses and it's a, it's a game of numbers.

[00:35:18] Bob: Can you tell me how you do what you do?

[00:35:20] Kevin Long: Yeah, a lot of it's proprietary, but you know and the, the simple way that I do it is I identify it, and then my clients tell me what their real accounts are. I white list those, and then any new accounts that come up, on, on those networks that are not on my white list, I find and I report to the networks through my various channels that I have that are um, allowing me to report accounts en masse instead of one-offs. And you know they, I do that every day. Seven days a week.

[00:35:57] Bob: So you have some way of monitoring for, for new accounts, and then you report them as soon as you can.

[00:36:03] Kevin Long: Yep, that's exactly right. And I, I don't get all of them because they're clever with how they spell their names or um, you know if they're one letter off, it doesn't necessarily show up. It also depends on your location in the country as to what accounts show up. So you know I, I always tells my clients, I'm not 100%. I'm about 99.9% on finding and removing them. And they're very understanding of that. And they're more likely to find them, the ones that I miss in particular because someone of the 26 million followers they have is going to be interacting with that person, and they may report it to the, the individual being impersonated who then in turn sends it to me and I get it taken care of. So it's a, you know it's a, a daily game of whack-a-mole, and no sooner do you take them down ... I mean I could search 24/7 and I would find, I could take down everything that I found and five minutes later there could be 10 more accounts up.
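
Kevin describes his process only at a high level, so the following is a minimal sketch of that whitelist-and-report idea under stated assumptions, not his proprietary tooling. The client name, the whitelist, and the discovered_handles list are hypothetical stand-ins for whatever a manual search of each network turns up; normalize() undoes the same digit-for-letter swaps shown earlier.

# Minimal, hypothetical sketch of a whitelist-based impostor sweep.
SUBSTITUTIONS = {"1": "l", "3": "e", "0": "o", "5": "s"}

def normalize(handle):
    """Lowercase, strip separators, and undo common digit-for-letter swaps."""
    cleaned = handle.lower().replace("_", "").replace(".", "").replace(" ", "")
    return "".join(SUBSTITUTIONS.get(ch, ch) for ch in cleaned)

client_name = "Paul Nicklen"
whitelist = {"paul_nicklen", "paulnicklenphoto"}   # the client's real, known-good handles

# Stand-in for handles turned up by manually searching each network.
discovered_handles = ["paul_nicklen", "Pau1.Nicklen", "pau1_nicklen_official", "unrelated_user"]

target = normalize(client_name)
to_report = [
    h for h in discovered_handles
    if target in normalize(h)       # reads like the client once lookalike characters are undone...
    and h not in whitelist          # ...but is not an account the client actually owns
]
print(to_report)                    # ['Pau1.Nicklen', 'pau1_nicklen_official']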

[00:37:04] Bob: That sounds a little bit on the maddening side, I must say.

[00:37:07] Kevin Long: Yeah, it is.

[00:37:09] Bob: And that game of numbers became tilted in the criminal's favor ironically after the big Cambridge Analytica Scandal a few years ago. If you don't remember that, Facebook had said it wasn't sharing data with outside companies but sold access to Cambridge Analytica anyway; data that was ultimately used to target voters in the 2016 elections. Facebook ended up paying a record $5 billion fine to the Federal Trade Commission after that.

[00:37:37] Kevin Long: Yes. So during the 2016 elections and the Cambridge Analytica Scandal hit, at that point I had a program that was using the APIs that the networks allowed me to um, do automated searches. So the searches that we had with the software that we built would contact Facebook, would contact Twitter, would contact YouTube, contact all of the networks and, and say, okay, I need a list of everybody named Kevin Long. And it would return all of the accounts that came back under that username. When Cambridge Analytica hit, they had misused their access um, that they were granted through the APIs, and so Facebook, probably rightly so, did a review of everybody who had access to their APIs and shut it down during the review. So what do you do now? Now you're not able to automat--, automatically search, um, now you’ve got to employ people to go in and do physical searches, manual searches, typing in names of your clients, and so it went from being a, a job that was relatively easy to being one that was very time consuming. And it has just been, you know, it's increased our workload tenfold because we have to do the searches manually now versus using a computer to go in and do it because once they, eventually they did reopen their APIs, but they changed the um, things you could ask for. So you could no longer ask for, "Bring me back everybody under the name Kevin Long," which was all I needed. I didn't need the stuff that you can ask for now. I didn't need that stuff. It was irrelevant to what I needed to be able to perform my task. So you know I spent several years trying to convince Facebook, "Hey, you've got to let me do this, because this, I'm helping you. I'm helping your community; I'm helping the victims in your community. This is not being used for nefarious purpose, this is being used for a very specific targeted purpose to help eliminate fraud." And I just, it was spinning my wheels. I got nowhere with them. They weren't going to make an exception for it. So um, and none of the other networks really seemed to care either or wanted to adjust either to uh allow me to do that. So now everything's done manually.

[00:40:03] Bob: So I'm sorry, but I'm picturing that a human being is sitting at a desk right now typing "Bob Sullivan," and occasionally substituting in a "1" for an "L" or whatever, trying to imagine, and they're just brute force doing this one at a time?

[00:40:15] Kevin Long: Yep, that's how it's done now. And that's why it takes longer.

[00:40:19] Bob: Oh my God. (laugh) And this is the age of artificial intelligence.

[00:40:24] Kevin Long: Yeah, where I could, my job could be done in a nanosecond with a computer program that we had in place before, and just because of the way that they've changed their rules after the Cambridge Analytica Scandal, it has, you know all the networks readjusted their access point for the data. And it um, you know it made, it made the job a lot harder. It didn't make it impossible; it just made it a lot harder and a lot less effective. But I mean effective from our perspective in that from a time perspective, from a time usage.

[00:41:02] Bob: Yeah, sure. Well, the criminals have automated tools, but you don't.

[00:41:05] Kevin Long: Right.

[00:41:07] Bob: Why does Kevin take on this job which feels basically impossible?

[00:41:12] Kevin Long: Yeah, I was uh bullied a lot as a child, and I don't like people that take advantage of other people and that pick on the less fortunate. And so it's kind of a passion of mine to help folks who are taken advantage of in some way or another. And so it's just been, I've worked for people in the past that have that same mentality and it just kind of is ingrained in my head to kind of stick up for the little guy whenever I get the opportunity to.

[00:41:44] Bob: So this isn't just a company for you.

[00:41:47] Kevin Long: It's a passion. Uh, I enjoy doing what I do and uh I enjoy helping people and, and it is a, a, it has been rewarding in many ways and, and uh fulfilling both mentally and, and uh, from a career standpoint.

[00:42:04] Bob: And there is another reason Kevin cares so much about fake accounts and you should, too.

[00:42:11] Kevin Long: Um, I think a lot of it's coordinated by that sort of activity as well. Activists create these fake accounts to just cause problems for people. And, you know they've got somebody on their team that knows how to do it, and, and, you know, they understand that it's going to take resources away from being able to concentrate on the mission to um, react to a barrage of fake accounts if they're doing it internally. So that's why a lot of them outsource this to me is they just, they don't have the manpower uh or the staff available to spend the time that I spend taking care of this problem for them. And it's a, you know, for them it's worthwhile to pay my fee to um, to have me take care of the issue for them so it's no longer their headache. And um...

[00:42:57] Bob: I, I didn't get on this call thinking we were going to head down the misinformation route, but, but clearly there's a tremendous potential for a problem there too, right?

[00:43:06] Kevin Long: Oh yeah, I mean they, they create accounts and they'll say, they'll, they'll make accounts, they'll make statements that are clearly something that someone wouldn't say.

[00:43:16] Bob: And so Kevin doesn't see a lot of relief coming for people like Kelly or Paul anytime soon.

[00:43:22] Kevin Long: It sucks because you know those one-offs, those are the people that are really hurting, and it really impacts them, and you can hear the pain in their voice when they're reading their emails to me. And I know that for them it's frustrating beyond belief. So it's definitely a um, it's definitely an ongoing problem and, and as the scammers find more victims, both potential victims and people that they can scam and potential victims that they can prey on, uh as far as uh, uh, famous or semi-famous people, um, then it, it's just going to continue to go and, and continue to be a problem.

[00:44:07] Bob: There's some piece of me if I stretch that can understand why it's hard for Facebook to say pick between you know you are the real person, you're the impostor. But if, if I send Facebook, "Look, look, here's a crime happening right now." It's shocking to me that they don't stop a crime in progress.

[00:44:24] Kevin Long: They probably would tell you to contact law enforcement and then (chuckling)... law--, law enforcement is going to say, you know if it doesn't involve millions of dollars or child porn, they, you know, it's not a priority for them. And it's unfortunate because people are, it's happening to more and more people, and you know in the end I think it's probably going to drive people to stop using social media, which is kind of self-defeating on the social media networks part, because they will have created that problem for themselves because they refuse to try and resolve the issues, or dedicate resources to resolve those issues, because they feel like it's, you know, it's not important to them. The little guy is not important to them. Even though the little guy is the one that makes up the bulk of their users that are actually you know, seeing ads and clicking on their ads and paying their bills. Uh, but the system's not set up to help the little guy.

[00:45:26] Bob: You've been doing this now for a long time. Has this problem gotten worse or better?

[00:45:30] Kevin Long: It's gotten worse as um, people have been able to create um, bots to generate these accounts, whereas in the old days you used to have to physically create an account and do it. Now they've got computer programs that do it for them. So it's, you know, again, it's like, I'm going into this battle with both arms and legs tied behind my back and squirming to try and fix it whereas I used to be able to fight them, you know, hand to hand on a little bit more level playing field. It's not level anymore and not only are the criminals becoming more sophisticated, the networks are becoming less receptive to trying to resolve the issue. So it's, it's an uphill battle, and it's become more challenging to do it.

[00:46:17] Bob: An uphill battle for people like Kelly and Ryan who, when we last spoke, still hadn't regained control of his account.

[00:46:28] Bob: Okay, so what do you want Facebook to know about this?

[00:46:31] Kelly Anderson: I don't know that uh they don't know anything about it all already, but I, it's, it's such a feeling of helplessness that you know so much of our lives are actually tied onto this platform. Um, and to, it's one thing to be a huge corporation where there are just physical limitations with how much you can do at any one point. But I think it's a, there's a sense of duty to your customers when your customer base is that big and that is your product to, again, connect to people, that not only are they doing their customers a disservice which is a pretty widely held belief, but they're actually cutting, cutting people off from, from friends and family, from their main relationships. Especially with, you know the pandemic in recent memory, there are hundreds, thousands, maybe millions of people where all of their interaction happens online. And ... making it so easy to cut somebody off from all of that is inexcusable really. That is your product, is to connect people, and to not have any way to fix a situation like this; there's no excuse.

[00:48:14] Bob: But there is real consequence.

[00:48:18] Kelly Anderson: The, I think the biggest um, impact to me, even though it wasn't my account, was the sheer amount of mental exhaustion that uh, that happened because of it. I was thoroughly distracted at work for the first couple of days. Luckily, I am not, you know, crunching numbers too much, but uh when your job is interacting with the general public and keeping a smile on your face, that's very, very hard to do when something big is going on. So that was definitely, uh, that was definitely difficult.

[00:49:03] Bob: I told Kelly that Kevin warned me there was little hope of a happy ending for her and Ryan.

[00:49:09] Kelly Anderson: Yeah, and that's, that's kind of the conclusion that we've gotten to as well. Um, at this point, my husband and I both agree that it, it would just be better for the account to be shut down entirely. Um, we've kind of given up on any sort of recovery options.

[00:49:28] Bob: And for celebrities like Paul Nicklen, well, he feels like the situation is pretty hopeless too. And he also feels like impostor incidents undo all the good that social media has done for him through the years.

[00:49:41] Paul Nicklen: Oh jeez, this is and to the point I almost stopped social media, but that's how also I make my living. You know so it's, it's tricky, and that's how we do our conservation work. So I don't want to give it up. There's a lot of beautiful things and a lot of benefits to social media, but this is one nasty downside for sure. Thirteen years ago, I'm like, I'm never going to get into Instagram. And then I also now I had access to the National Geographic feed, and then I think one day I got, I put up a narwhal on the Nat Geo post, and I said, "If you want to see how many, you know, what narwhals look like underwater, go to my feed." Well I got 48,000 new followers in an hour. And that's when I was like, oh my goodness. And then when I launched my gallery in New York in 2017, I'm like, "Come on down, and uh meet me in my gallery at New York." Well 3,000 people showed up. You know we were in violations of fire code and people were out in the street in the rain. I'm like, these are real people at the end of this. These are not just numbers. And just like a, a social media number or anything. These are real people who really care. People came in crying. People who wanted to give a hug. People who had questions. People who wanted to relate. And I'm like, so that's who I feel is at the other end of social media, and those are the people who are being predated upon by, by, by the scamming community, and um, it's, I'm, I really hope that you keep digging deeper and you keep uh pushing this along. But thank you.

[00:50:59] Bob: And the harsh truth is, people have to look out for impostor scams for themselves right now.

[00:51:05] Paul Nicklen: I just, I guess my message to the world is, in this time of these big tech companies not doing their due diligence, and doing, you know, and taking, and protecting, I guess, you know, protecting their base and their users, then it's up to us to really do the due diligence and, and really protect ourselves. So if something even remotely sniffs or smells a little bit odd, know that it's a scam. You know, so I, I put up there in my masthead, I say it all the time, um, I put it in my newsletters, I put it in emails that I will never write you a romantic or a, a, an email that solicits you for funding or for support or for love or, you know, it's just we have to constantly beat that drum. But scams are getting worse and on every platform, on the phone, on email, on text. You know we're, it's, it's become obviously a massive industry. We just all have to, unfortunately, go through life with our, our cautious blinkers on and, and just be very cognitive and cautious about, about the crap that's going on out there.

[00:52:16] Bob: For The Perfect Scam, I'm Bob Sullivan.

(MUSIC SEGUE)

[00:52:29] Bob: If you have been targeted by a scam or fraud, you are not alone. Call the AARP Fraud Watch Network Helpline at 877-908-3360. Their trained fraud specialists can provide you with free support and guidance on what to do next. Our email address at The Perfect Scam is: theperfectscampodcast@aarp.org, and we want to hear from you. If you've been the victim of a scam or you know someone who has, and you'd like us to tell their story, write to us or just send us some feedback. That address again is: theperfectscampodcast@aarp.org. Thank you to our team of scambusters: Associate Producer, Annalea Embree; Researcher, Sarah Binney; Executive Producer, Julie Getz; and our Audio Engineer and Sound Designer, Julio Gonzalez. Be sure to find us on Apple Podcasts, Spotify, or wherever you listen to podcasts. For AARP's The Perfect Scam, I'm Bob Sullivan.

(MUSIC OUTRO)

END OF TRANSCRIPT

The Perfect Scam℠ is a project of the AARP Fraud Watch Network, which equips consumers like you with the knowledge to give you power over scams.

 
