    Financial adviser who deepfaked himself says clients ‘love it’

Lucy Dean, Wealth reporter

    James Gerrard can’t really clone himself, but if he could, he probably would.

    The Sydney-based financial adviser with FinancialAdvisor.com.au says financial advice is now just as much a communications game as it is a finance game.

    James Gerrard and his deepfake.  Louie Douvis

He tells The Australian Financial Review he had been looking for ways to make his business more efficient on both fronts when advances in generative artificial intelligence since late last year made it possible to create his digital clone and palm off some of his work to “himself”.

    “When the Reserve Bank of Australia puts out their rate rise decision every month … I’d love to always do a 60-second video saying, ‘This is what it means for markets,’ but I don’t always have the time,” Gerrard says. “So, I’ve deepfaked myself.”

A deepfake is essentially a computer-generated version of a person that is startlingly lifelike. Deepfakes are a form of “synthetic media”, and have been used to create fake images, videos and recordings of celebrities, politicians and even the pope.


    While some examples can seem harmless, like Harrison Ford being made younger again in the opening scenes of the recently released Indiana Jones and the Dial of Destiny, or Elvis Presley being raised from the dead to perform on America’s Got Talent, the technology is often associated with more harmful applications.

Many female celebrities have become victims of deepfake porn, in which their likeness is inserted into videos without consent, while scammers recently created a deepfake of CBA chief executive Matt Comyn urging viewers to get in touch and make hefty returns on crypto investments.

In May, a deepfake image of an explosion near the Pentagon went viral. Although it was swiftly debunked, the image saw the Dow Jones Industrial Average drop 85 points in four minutes.

The software that creates the deepfake is taught, or trained, to identify how subjects such as human faces and expressions (or, in the case of the Pentagon, explosions) work, and then uses that knowledge to create new, realistic versions of those subjects.

The potential uses of deepfakes – and society’s inability to detect them – have alarmed market watchers, politicians and technology experts. To Gerrard, however, the technology also represents a commercial opportunity.


    To build his deepfake, he had to record a high-resolution video of himself reading out a 50-minute script of dialogue (originally presented by David Attenborough), so the technology could learn his voice and his facial movements.

He then used tech from his platform provider, Dash Technology Group, to make it happen.

    “Now, my staff can type in a script about the RBA rate decision, hit ‘generate’, and then five minutes later there’s a video of me talking,” Gerrard says. “You can’t tell it’s not me talking on the video. It looks really realistic.”

    The upshot, he says, is being able to send out more client communications, more quickly and with less work.

    Other AI use

Aside from the general videos, Gerrard says the technology can also be used to streamline portfolio reviews for clients.


    The platform provider is hooked up to FinancialAdvisor.com.au’s customer relationship management platform, which in turn is linked to the AI video tool.

    “We can do a portfolio review for a client and then do a deepfake video of me talking about some of the shares that went up or down over that particular period, and send that out with the review report,” he says.

    Gerrard says he has only tried it out with a “handful” of clients as a pilot test so far, and that those who received the deepfake videos were told it was AI-generated.

    “The feedback’s been fantastic. They love it,” he says.

    No AI stock-picking

    While he’s happy for AI to create a clone of himself, Gerrard says he won’t let it pick his stocks. He is sceptical of its ability to replace the human mind, and a financial adviser’s ability to empathise and understand their clients’ unique personal situations.


    But he thinks there could be a role for AI at the early stages of a person’s financial journey.

For example, a financial advice website with ChatGPT-style software running in the background could steer a potential client to the best adviser for their specific needs, and pass along all the relevant details for that adviser to then work with the client.

Or, it could potentially offer basic portfolio advice. Once the client has filled out a risk questionnaire, the AI could technically direct them towards an appropriate allocation of bonds, shares and property.

“But a lot of the time, it’s not things that any AI engine could give advice on. We’re a sounding board, we’re helping people make decisions around divorce or selling a business – things that are very personal and need someone … to give bespoke advice,” says Gerrard.

    Lucy Dean writes about wealth management, personal finance, lifestyle and leisure, based in The Australian Financial Review's Sydney newsroom. Connect with Lucy on Twitter. Email Lucy at l.dean@afr.com
