• A Ferrari executive was targeted by a deepfake scam earlier this month.
  • The fraudster used AI to impersonate the Ferrari CEO and request transaction help.
  • Fortunately, the executive didn’t fall for it, and the scammers ended the call abruptly.

The rapid advancement of artificial intelligence is equipping scammers with new tools to deceive even high-profile targets. Earlier this month, a deepfake scam targeted a Ferrari executive, with WhatsApp messages and a phone call appearing to come from CEO Benedetto Vigna. Fortunately, the executive outwitted the fraudster by posing a personal question as verification.

It all began with a series of WhatsApp messages from someone posing as Ferrari’s CEO. The messages, seeking urgent help with a supposed classified acquisition, came from a different number but featured a profile picture of Vigna standing in front of the Ferrari emblem.


As reported by Bloomberg, one of the messages read: “Hey, did you hear about the big acquisition we’re planning? I could need your help.” The scammer continued, “Be ready to sign the Non-Disclosure Agreement our lawyer will send you ASAP.” The message concluded with a sense of urgency: “Italy’s market regulator and Milan stock exchange have already been informed. Maintain utmost discretion.”

Following the text messages, the executive received a phone call featuring a convincing impersonation of Vigna’s voice, complete with the CEO’s signature southern Italian accent. The caller claimed to be using a different number due to the sensitive nature of the matter and then asked the executive to carry out an “unspecified currency hedge transaction”.


The oddball money request, coupled with some “slight mechanical intonations” during the call, raised red flags for the Ferrari executive. He retorted, “Sorry, Benedetto, but I need to verify your identity,” and quizzed the caller about a book Vigna had recommended to him days earlier. Unsurprisingly, the impersonator flubbed the answer and ended the call in a hurry.

Ferrari representatives declined to comment on the incident, which Bloomberg learned of from unnamed sources. The company is now investigating the attempted scam, which took place earlier this month.

Needless to say, this isn’t the first time AI has been weaponized by fraudsters to siphon off cash. Rachel Tobac, CEO of cybersecurity firm SocialProof Security, warns, “This year we’re seeing an increase in criminals attempting to voice clone using AI.”

Stefano Zanero, a cybersecurity professor at Politecnico di Milano, grimly predicts that AI-powered deepfakes will only get scarier, eventually becoming “incredibly accurate”. Until companies can equip their staff with reliable detection tools, the onus is on individuals to be hypervigilant. Double-check, triple-check, even quadruple-check before transferring a dime, no matter who is asking, even if the request appears to come from your boss.


The real Benedetto Vigna, Ferrari’s CEO