A client reported today that when hackers compromised her Instagram account, they began cloning her voice to create new ads and scam more people. This is not a theoretical threat — it is happening to real people right now, and the technology to clone a voice from a short audio sample is widely available.
The immediate action item is simple: turn on Multi-Factor Authentication (MFA) for Facebook, Instagram, LinkedIn, your email, QuickBooks, and any crypto or banking sites that support it.
Which MFA Method to Use
Unless you are actively being hacked at this moment, use the text (SMS) option for MFA. An authenticator app is more secure in theory, but it causes significant problems when you change phones — the app's data does not transfer automatically, and if you lose access to the app without a backup of its recovery codes, getting back into your accounts becomes very difficult.
For most people, SMS-based MFA provides a substantial security improvement over no MFA at all, with far less risk of locking yourself out.
Why This Matters Beyond the Obvious
When an attacker owns your social media account, they do not just post spam. They have access to your message history, your contact list, your photos, and — increasingly — audio and video of you. AI voice cloning tools can generate convincing audio from as little as a few seconds of your voice. That audio can then be used to impersonate you in phone calls, voicemails, or new scam content.
Enabling MFA does not guarantee your accounts will never be compromised, but it blocks the vast majority of credential-stuffing and phishing attacks, which are the most common entry points. Do it today.
— Robert
