When Savannah Guthrie made a heart-wrenching plea to the kidnapper of her 84-year-old mother to send "proof of life," she addressed the possibility of people creating deepfakes.
"We live in a world where voices and images are easily manipulated," she said.
Before artificial intelligence tools proliferated — making it possible to realistically impersonate someone in photos, audio and video — "proof of life" could simply mean sending a grainy image of the person who had been abducted.
That's no longer true.
"With AI these days you can make videos that appear to be very real. So we can't just take a video and trust that that's proof of life because of advancements in AI," Heith Janke, the FBI chief in Phoenix, said at a news conference Thursday.
Hoaxes — whether high- or low-tech — have long challenged law enforcement, especially in high-profile cases such as Nancy Guthrie's disappearance last weekend from her home in the Tucson area.
As technology has advanced, criminals have grown savvier, using it to confuse police and the public and to mask their identities. The FBI warned in December that people posing as kidnappers can provide what appears to be a real photo or video of a loved one, along with demands for money.
Police have not said whether they have received any deepfake images of Nancy Guthrie. At least three news organizations have reported receiving purported ransom notes, which they turned over to investigators, who said they are taking them seriously.