
A new scam is using AI technology to target the families of missing people

Why advocates say the problem is only getting worse
SAN ANTONIO — Advocates who work with families of missing people are warning about a new and disturbing scam that uses artificial intelligence to extort victims already in crisis.
Alfonso Solis, a search and rescue advocate in North Texas, said families of missing people have long been targets of scammers, but he recently encountered a case that took the deception to a new level — a fabricated video generated using AI.
“Anytime you put your personal information out there for contact, there’s a good chance you’re going to get calls regarding a scam,” Solis said.
Solis has been involved in search and rescue operations across the Dallas-Fort Worth area for the past three years. He said he was assisting a family searching for a missing loved one when someone contacted them claiming to have seen the person near the U.S.-Mexico border.
Solis said the family was later sent a video allegedly showing the missing person being held on a ranch. Concerned, the family shared the video with him. The individual in the video has since been found, and the video has been blurred to protect their identity.
“I saw the video and I said, ‘No, this is too clean,’” Solis said. “It’s his face, but clothes do not have wrinkles. There are going to be folds, there are going to be details in the background that tell me this is not right. This is AI.”
Solis said another red flag was the absence of audio.
“They can replicate your face from a picture, but they can’t synthesize your voice — at least not yet,” he said. “When you ask to speak with that person for proof of life, they’ll give every excuse not to put them on the phone, like, ‘No, you need to give us money.’”
The scammers did demand payment, Solis said, exploiting the emotional vulnerability of families desperate for information.
“They’re going through the worst day of their lives,” he said. “They’re scared, they miss their loved one, and there are so many emotions at play. The scammers are hoping to capitalize on that.”
While the video in this case did not include audio, Solis warned that voice synthesis is likely coming next as AI technology advances.
“They’re going to take video from your social media and synthesize your voice from it,” he said.
Solis urged families to remain cautious and take steps to protect themselves, including documenting any videos or messages they receive and demanding proof of life. He recommends asking to speak directly with the missing person, or requesting answers to specific questions that only the family and their loved one would know.
As technology evolves, Solis said, awareness may be the strongest defense.