Dr. Robert H. Lustig is an endocrinologist, a professor emeritus of pediatrics at the University of California, San Francisco, and an author of bestselling books on obesity.
He is absolutely not — despite what you might see and hear on Facebook — hawking “liquid pearls” with dubious claims about weight loss. “No injections, no surgery, just results,” he appears to say in one post.
Instead, someone has used artificial intelligence to make a video that imitates him and his voice — all without his knowledge, let alone consent.
The posts are part of a global surge of frauds hijacking the online personas of prominent medical professionals to sell unproven health products or simply to swindle gullible customers, according to the doctors, government officials and researchers who have tracked the problem.
While health care has long attracted quackery, AI tools developed by Big Tech are enabling the people behind these impersonations to reach millions online — and to profit from them. The effect is to seed disinformation, undermine trust in the profession and potentially endanger patients.
Even if the products are not dangerous, selling useless supplements can raise false hopes among people who should be getting the medical treatment they need.
“There are so many things wrong with this,” Lustig — the real one — said in an interview when contacted about the impersonations. It was the first he had heard of them.
The Food and Drug Administration and other government agencies, as well as advocacy groups and private watchdogs, have stepped up warnings about counterfeit or fraudulent health products online, but those warnings appear to have done little to stem the tide.
Advances in AI tools have made it easier to generate convincing content and to spread it on social media platforms and e-commerce sites that often fail to enforce their own policies against fraud.
There are now hundreds of tools designed to re-create someone’s image and voice, said Vijay Balasubramaniyan, CEO of Pindrop, a company that tracks deceptive uses of AI. The technology has become so sophisticated that swindlers can create convincing impostors from just a few clips or photos.
“I can actually create an AI bot that looks like you and has full-blown conversations just from your LinkedIn profile,” he said.
Dr. Gemma Newman, a family physician in Britain and the author of two books about nutrition and health, took to Instagram in April to warn her followers about a video on TikTok that had been altered to make it seem like she was promoting capsules of vitamin B12 and 9,000 milligrams of “pure nutrient rich beetroot.”
Newman was horrified: Her likeness was pushing a supplement, one that could be harmful in high doses, by playing on women’s insecurities — implying the pills could make them “feel desirable, admired and confident.”
The video was so realistic that her own mother believed it was her.
“It’s a double betrayal because my image is there, supporting something that I don’t believe in,” she said.
The impersonation of medical professionals extends beyond unproven supplements.
Dr. Eric Topol, a cardiologist and the founder of the Scripps Research Translational Institute in San Diego, discovered there were dozens of apparent AI spinoffs of his newest book on Amazon. One of his patients unknowingly bought a fake memoir, complete with an AI-generated portrait of Topol on the cover.
Christopher Gardner, a nutrition scientist at Stanford, recently found himself the unwitting face of at least six YouTube channels, including one called “Nutrient Nerd.”
Together, the channels have hundreds of videos, many narrated by an AI-generated version of Gardner’s voice. Most target older adults and dole out advice he does not endorse on issues like arthritis pain and muscle loss. Impersonations like these may be an effort to build followings large enough to qualify for the platform’s programs that pay creators a share of ad revenue.
The spread of these fakes has made standard advice about how to find good health information online suddenly feel outdated, said Dr. Eleonora Teplinsky, an oncologist who has found impostors on Facebook, Instagram and TikTok.
“This undermines all the things we tell people about how to spot misinformation online: Are they real people? Do they have a hospital page?” she said. “How would people know it’s not me?”
Gardner said he was concerned about the amount of nutrition misinformation online. He has been active on social media and has appeared on podcasts to set the record straight, he said.
Now, Gardner wonders if those efforts have backfired on him — providing a library of recordings that can be used to impersonate him. Experiences like his may discourage other experts from venturing into online conversations, he said. “Then the credible voices will be drowned out even more.”
Gardner and a Stanford representative spent hours reporting the videos to YouTube and to federal authorities. They also posted comments warning that the videos falsely impersonated Gardner, but most of those comments were deleted within a minute.
A YouTube spokesperson, Jack Malon, told The New York Times that the platform had removed “several channels” for violating its policies against spam. TikTok said in a statement that it did not allow most impersonations but did not address the video of Newman.
The supplement brand behind the fake Lustig posts on Facebook also created fraudulent posts in a number of other countries, including some featuring real doctors in Australia and Ireland.
The geographic range showed it was the type of large, sophisticated operation that poses a growing threat to brands across the globe, said Yoav Keren, CEO of BrandShield, the cybersecurity company in Israel that uncovered the campaign.
The campaign, which appeared to begin late last year, capitalized on the popularity of a class of drugs known as GLP-1s, which have transformed the treatment of diabetes, obesity and related diseases. It pitched a product called Peaka, which appeared to be liquid capsules. (No approved GLP-1 drug comes in liquid-capsule form.)
The product is sold on here-today-gone-tomorrow websites registered in Hong Kong, Keren said, but exactly who is behind them remains unknown.
Despite its uncertain provenance and false marketing, the product was until recently available for purchase on major e-commerce platforms, including Amazon and Walmart, and appeared in Google searches as a sponsored product. (Amazon began removing it after being contacted by the Times. Walmart said the product was not sold in its stores, but third-party vendors were selling it on the company’s e-commerce site in violation of its policies; those vendors have since been removed.)
In addition to impersonating doctors, the marketing campaign featured logos of regulatory agencies and advocacy groups in several countries, including Mexico, Norway, Britain, Canada and New Zealand, falsely implying that the product had received official approval, according to BrandShield. In the United States, the groups included the Obesity Society, whose website now has a pop-up warning about what it called an “e-commerce scam.”
Meta, the company that owns Facebook, forbids impersonations on its platforms but indicated that it had not been aware of the fake accounts until the Times contacted it. It recently began removing the accounts impersonating several of the doctors involved, the company said.
“We know that there will be examples of things we may miss,” the company said in a statement, noting that it was rolling out new efforts to detect impersonations of public figures and celebrities.