
By Marc Frankson
Jamaica is still reeling from Hurricane Melissa’s passage. Across the island, families are clearing mud from their homes, searching for missing documents, repairing leaking roofs, and trying to restore some sense of normalcy. Phones are dead or lost, communication is unstable, and many people are physically separated from their loved ones. It is in moments like these, when stress is high and clarity is low, that another threat quietly emerges: scams.
History shows that after major disasters, scams almost always increase. A 2025 AICPA/Harris Poll in the US found that 37 per cent of Americans experienced financial fraud following a natural disaster, and the numbers climb when fear and confusion are highest. Jamaica is no different. After a hurricane, when people are scared, displaced, and desperate for help, criminals rush in to take advantage. And today, many of these criminals are using AI to make their tricks more believable than ever.
But while the threats are new, the steps to protect yourself are simple. With awareness and a few straightforward habits, you can safeguard your family, your savings, and your peace of mind. Below are the three common AI-driven scams you are likely to encounter after this and other disasters, and the clear steps you can take to protect yourself.
When a “Loved One” calls asking for money: STOP, SLOW DOWN, CALL BACK

Imagine your phone ringing at 11:30 pm. The voice sounds exactly like your son or granddaughter in the country: the same tone, same accent, same panic. But it’s not real; it’s a clone, and it is a dangerous AI scam. With just a few seconds of someone’s voice from a TikTok video, a WhatsApp status, or a voicemail, scammers can create a replica of that voice, and they target older parents and grandparents first because their instinct is to help.
The solution is simple but powerful: stop, slow down, and call back to verify. If you receive an emergency call requesting money, I suggest you hang up and call the person back on their real number. If they don’t answer, call someone who is likely with them: a partner, sibling, or coworker. Before acting, confirm it is a real emergency through a second channel.
Before another disaster strikes, sit with your family, especially elders, and explain that you will never request money through a surprise phone call or WhatsApp message. You could also create a family code word that must always be used in a true emergency. If the caller does not say the code word, the call is fake. These simple actions could save thousands of dollars and prevent unimaginable stress.
Fake videos of public figures: IF IT’S NOT ON THE OFFICIAL PAGE, IT’S NOT REAL

If scammers can fake the voice of your child, imagine what they can do with the face of a government minister. AI tools can now create extremely realistic videos, known as “deepfakes”, of elected officials, business leaders, and other well-known Jamaicans delivering messages they never actually recorded. After a disaster, when people are desperate for updates, these videos spread rapidly, especially on WhatsApp. Some instruct viewers to send money to a special recovery account, while others promote fake investment opportunities disguised as government initiatives.
This is not theory; Jamaica has seen this already. In 2025, the Jamaica Constabulary Force warned about a sophisticated scam using a deepfake video of businessman Adam Stewart endorsing a fake investment fund. The video looked like a real newscast and created widespread confusion before the scam was shut down. To protect yourself, here is another simple rule to adopt: If it’s not posted on an official page, it’s likely not real.
Public agencies do not issue financial instructions through broadcast WhatsApp messages, and no public figure, whether a minister, businessman, or entertainer, will ask you to send money to a personal bank account for community recovery. The moment you see that, delete it and warn your relatives.
Fake images designed to trigger donations: VERIFY BEFORE YOU GIVE

The next wave of scams uses AI to fabricate images and videos of destruction. Scammers can use AI imaging tools to create hyper-realistic photos of collapsed houses, injured children, or entire communities underwater, all designed to tug at your emotions and push you to donate.
Jamaicans are some of the most generous people in the world, which makes us especially vulnerable. After major flooding in Texas in July 2025, hundreds of AI-generated images circulated showing devastation that never happened. Many well-meaning people donated, and many were scammed.
This last rule is simple: verify before you give. If a dramatic photo or video is not appearing on reputable news outlets, pause. Zoom in: AI often produces strange-looking hands, odd shadows, or blurred signs, and fake images fall apart under careful inspection. Only donate to someone you know directly or through established charities, churches, or community groups you can call, visit, or confirm online. Avoid payment links in WhatsApp messages or on unknown websites. When in doubt, don’t give until you confirm it is legitimate.
Scammers target the emotionally vulnerable, and after major disasters, elders and children are especially exposed. Sensitising your family is one of the most powerful defences you have. For older relatives, keep it simple. Tell them you will never ask for money through a surprise call or WhatsApp message. Remind them that government agencies, banks, and utility companies will never request PINs, passwords, or transfers to “secure accounts”. Encourage them to call you immediately whenever something feels suspicious. Teach children and teens that videos and images online can now be faked just like filters on TikTok, only far more advanced. They should never click payment links, share banking information, or forward emotional pleas without checking with an adult.
Hurricane Melissa may have tested Jamaica, but it did not bend our spirit. Yes, criminals are turning to AI to exploit vulnerability, but we are a nation that turns challenges into catalysts. Artificial intelligence itself is not the enemy; the same technology used to deceive can be transformed into a tool for clarity, safety, and progress.
It is helping us assess damage faster, coordinate recovery with precision, and keep communities informed and connected. This moment is not just about repairing structures. It is about strengthening our systems, sharpening our digital awareness, and preparing our people for a future where resilience is not only physical but also technological.
Marc Frankson is lead consultant at Transcend AI Consulting. He can be contacted at [email protected]