Why robots won’t replace financial planners

A few months ago I paid £150 to see Abba in concert. Sort of.

Agnetha, Bjorn, Benny and Anni-Frid could’ve spent the evening on their sofas, for all I know. In their place were ‘ABBAtars’ — digital versions of the Swedish pop stars appearing on a 65 million pixel screen.

It got me thinking: with the rise of AI tools such as ChatGPT, a large language model, will my financial planning peers and I become redundant in the future?

Instead of speaking to an adviser with a brain and a pulse, will clients turn to robots and holograms? 

And would this be a good thing for the profession, or would we feel as though we'd been plunged into an episode of the sci-fi anthology series Black Mirror? There's a lot to think about…

Goodbye human error

Usually when you go to a concert, you have no idea what will happen. The lead singer could fall and break her ankle. A rockstar could forget his words, too drunk to put on a good performance. Some artists are known for poor timekeeping, turning up an hour or two late for their own shows.

This doesn’t happen with AI. With the right technology, human error and fallibility can be removed from the equation. In theory, with robots taking the lead, everything will be alright on the night. Unless there’s a blackout or some other technical issue. 

Financial planners would be foolish to dismiss a tool that could reduce human error — we're only human after all. That's why technology already plays a big part in what we do at Smarter Financial Planning. We use cashflow modelling software and comprehensive reporting tools, and we have our very own app, Smarter Money, which enables us to contact you securely. These tools help us to streamline tasks, manage data and provide a high-quality, multi-dimensional service. What's not to love?

An important caveat, however, is that it works best when humans are involved too. 

It’s not just about the numbers

There’s far more to financial planning than managing clients’ portfolios and telling them which accounts to open. Most of what I do is rooted in psychology and human behaviour. The way I work with clients can’t be automated or mimicked by AI. Not yet, anyway.

If you manage your pensions and investments directly online, for example through 'robo-advice', you might be asked to describe your risk tolerance before any recommendations are made.

You'll usually only have three options here: low, medium or high. If you're a new investor, have a scarcity mindset, or your parents taught you that investing = gambling, your judgement may be clouded. The robot making recommendations on your behalf doesn't know that, though. These are the types of issues that only a human can unpick, and that's where problems arise.

When you work with a financial planner, they’ll know a lot more about you before getting to these questions. They’ll know who you are, what you want to achieve and what’s holding you back. 

So whether you hit the low-, medium- or high-risk investment button, we'll discuss your decision. For some clients, low-risk investments are absolutely the right move. We don't want you having sleepless nights. But if you've got 30 years left until you retire, we might encourage you to step out of your comfort zone.

An online solution might make a similar recommendation, but how good will it be at helping you consider the other options? I often speak to clients who are so scared of running out of money that they refuse to spend it. Not only do I point to their pension, savings accounts, mortgage-free home and robust estate plan, I talk to them about why they feel the way they do. 

It's so satisfying when I inspire them to think differently and they thank me six months later after renovating the house and enjoying a two-week cruise. I don't just help people transform their finances, I help them transform their relationship with money too.

The human touch

When you use the chat option on a hotel website or banking app, you can usually tell when you're speaking to a robot. Alarm bells ring when you send a series of paragraphs laced with subtleties and nuance, only to receive a reply that doesn't solve your problem at all.

When you voice your frustration, the robot knows to issue an apology. You close the app and pick up the phone. There’s no point arguing with an algorithm. An apology from a human (or indeed an argument with one) is much more compelling. 

Sometimes, what we really need is a human touch — original anecdotes, personal stories and tailored advice, delivered by someone who truly understands your history, hopes, dreams and fears.

There might even be times when you need someone to shake a bit of sense into you. 

Let’s imagine you’re being unnecessarily frugal in a way that’s impacting your marriage. Or your spending is a little out of control. You’re completely oblivious to this, until your adviser tells you exactly what you need to hear, rather than what you want to hear. 

The truth can be uncomfortable, but these human and nuanced conversations play an important role in financial planning. They can prevent you from making catastrophic decisions about your wealth by yourself. If you’ve built a relationship with your adviser, you know they have your best interests at heart. You trust them completely. Yet if you received this information from a robot, you might not take it so well. 

I’ve been playing around with ChatGPT myself. When I asked ‘Can I afford to retire?’ it told me to consider my life expectancy, future expenses and how long I’ll need my savings to last. Its answer wasn’t terrible, aside from some jargon. What concerned me most, though, was that it couldn’t tell whether I was in pain or just checking in. It didn’t offer sympathy or reassurance. I closed the window without the answers I needed.

Don’t get me wrong. I do see the value in AI

It's able to analyse vast amounts of data in ways that humans can't. We can use it to our advantage to make informed decisions and real-time adjustments to financial plans. With technology on our side, we can boost efficiency, reduce the risk of errors and improve outcomes for our clients.

I don't see robots as a threat. Not yet, anyway. If my financial planning skills are any good, there will always be people who come to me because they know what I can provide: someone who can read between the lines, note the tone of their voice as well as what they're saying, and marry that with my in-depth knowledge of them to give guidance that's right for them, and them only.

 

Your adviser,
Jon Elkins