MIT researchers say that AI is ‘inherently sociopathic’ — but that it can be trained to give ethical financial advice

  • Researchers at MIT say it’s possible to train AI to be a reliable financial advisor.
  • But it would need to be able to understand emotions to build trust with users.
  • This article is part of “Build IT,” a series about digital-tech trends disrupting industries.

Arynton Hardy, a wealth manager in Los Angeles, uses artificial intelligence daily.

“Almost every single meeting I have with a client, I utilize an AI summarizer that gives me notes, gives me follow-ups, gives me to-dos,” Hardy said. His employer, Savvy Wealth, is a tech startup focused on providing AI innovations to its growing roster of financial advisors and high-net-worth clients.

Hardy said AI tools regularly save his team hours on data entry, portfolio monitoring, and other back-office tasks, giving him more time to meet with clients.

He’s not alone: A report that the data-analytics firm Escalent shared with Business Insider said that nearly 40% of financial advisors used generative-AI tools on the job, primarily to boost productivity, generate content, and market to or prospect for new clients.


Arynton Hardy, a principal wealth manager at Savvy Wealth. (Photo: Savvy Wealth)



Soon generative AI may have the power to fulfill a financial advisor’s most important role: giving people trustworthy money advice.

While the uptake of ChatGPT and other large language models has been swift across industries including fashion and marketing, financial services have hit regulatory roadblocks.

But MIT researchers believe there’s a clear path to training AI models as subject-matter experts that ethically tailor financial advice to an individual’s circumstances. Instead of responding to “How should I invest?” with generic advice and a push to seek professional help, an AI chatbot could become the financial advisor itself.

“We’re on our way to that Holy Grail,” said Andrew Lo, a professor of finance at the MIT Sloan School of Management and the director of the Laboratory for Financial Engineering. “We think we’re about two or three years away before we can demonstrate a piece of software that by SEC regulatory guidelines will satisfy fiduciary duty.”

Fiduciary duty refers to the set of legal responsibilities that require financial and investment advisors to act with the highest degree of care when handling a client’s money. It’s the gold standard in financial planning, and it has so far proved difficult to replicate in direct-to-consumer AI tools.

AI is ‘inherently sociopathic’

Financial advisors often develop client recommendations through a behavioral-finance lens, as research suggests that people don’t always make rational or unbiased financial decisions but are error-prone and emotionally driven.

An average investor, for instance, might panic when the stock market plummets and decide to sell their stake in a mutual fund rather than wait for the market to recover, as it almost always does.

Knee-jerk reactions like this can often be avoided or corrected under the guidance of a skilled financial advisor whom the investor trusts. Lo said that trust can be developed in part through empathy and small talk.


Andrew Lo, a professor of finance at the MIT Sloan School of Management and the director of the Laboratory for Financial Engineering. (Photo: Erica Ferrone)



“When you start talking to somebody, almost immediately you develop feelings for that person,” Lo said. “That’s the kind of process that needs to happen with large language models. We need to develop an ability to interact with humans not just on an intellectual level but on an emotional one.”

But the glaring problem with publicly available AI tools is that they’re “inherently sociopathic,” Lo and his coauthor wrote in a research report exploring the challenges of widespread adoption of AI-powered financial advice.

“This sociopathy seems to cause the characteristic glibness of LLM output; an LLM can easily argue both sides of an argument because neither side has weight to it,” they wrote. It may be able to role-play as a financial advisor by relying on its training data, but the AI needs to have a deeper understanding of a client’s state of mind to build trust.

One way to accomplish this is by requiring the machine to ask the user simple questions, starting with “How are you doing?” before dispensing personalized financial advice.

In addition to analyzing the text, advanced AI may be able to solicit audio or video from the user to identify emotional cues, such as stress or fear, in their voice or facial expressions, Lo said. Think of it like a doctor displaying the kind of bedside manner that’s lauded in the medical field.
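For illustration only, here is a minimal sketch of that "check in before advising" flow. A simple keyword check stands in for the richer text, audio, and video analysis Lo describes, and every name and threshold below is hypothetical rather than drawn from any real product.

```python
# Minimal sketch of a "check in before advising" flow. The keyword-based
# stress check is a crude stand-in for the multimodal emotional-cue analysis
# described above; all names here are illustrative.

STRESS_WORDS = {"worried", "scared", "panicking", "anxious", "stressed", "nervous"}

def detect_stress(reply: str) -> bool:
    """Very naive emotional-cue detection from a free-text reply."""
    words = {w.strip(".,!?").lower() for w in reply.split()}
    return bool(words & STRESS_WORDS)

def advise(question: str, checkin_reply: str) -> str:
    """Adjust the framing of the advice to the user's apparent state of mind."""
    if detect_stress(checkin_reply):
        preamble = ("It sounds like this is a stressful moment. Before making any "
                    "changes, remember that reacting to short-term swings often "
                    "locks in losses.")
    else:
        preamble = "Thanks for sharing. Here's how I'd think about your question."
    return f"{preamble}\n\n(Personalized advice for {question!r} would follow here.)"

if __name__ == "__main__":
    print(advise("Should I sell my mutual fund?",
                 "Honestly, I'm panicking about the market."))
```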

“Trust is not something that will automatically be given to a generative AI,” Lo told BI. “It has to be earned.”

Planning for optimization

Conquest Planning, a Canadian startup, has developed financial-planning software that uses an AI architecture known as a blackboard system.

Ken Lotocki, its chief product officer, said the “blackboard” stores essential information about tax rules, cash-flow mechanics, retirement-account structures, fiduciary rules, and more.

Clients input their personal and financial data so the system can learn their preferences, parameters, and goals. Using this information, Lotocki said, the system “looks at all of the permutations and options that are available” and then recommends a course of action.
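Conquest’s implementation is proprietary, but the general blackboard pattern it describes is straightforward: independent "knowledge sources" read a shared data store and post recommendations to it, and a controller decides which ones run. The sketch below is a bare-bones, hypothetical illustration of that pattern; the rules and thresholds are invented for the example and are not Conquest Planning’s actual logic.

```python
# Illustrative blackboard pattern: a shared store that independent knowledge
# sources (here, simple rule functions) read from and write recommendations to.
# The rules and numbers are made up for the example.

from dataclasses import dataclass, field

@dataclass
class Blackboard:
    client: dict                                  # goals, ages, balances, etc.
    recommendations: list = field(default_factory=list)

def emergency_fund_rule(bb: Blackboard) -> None:
    c = bb.client
    if c["cash"] < 3 * c["monthly_expenses"]:
        bb.recommendations.append("Build cash reserves to at least three months of expenses.")

def retirement_timing_rule(bb: Blackboard) -> None:
    c = bb.client
    if c["age"] >= 60 and not c["claimed_benefits"]:
        bb.recommendations.append("Compare claiming retirement benefits now versus deferring.")

def run_planner(client: dict) -> list:
    """Controller: let each knowledge source inspect the blackboard once."""
    bb = Blackboard(client=client)
    for rule in (emergency_fund_rule, retirement_timing_rule):
        rule(bb)
    return bb.recommendations

if __name__ == "__main__":
    print(run_planner({"age": 62, "cash": 4000, "monthly_expenses": 3000,
                       "claimed_benefits": False}))
```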


Ken Lotocki, the chief product officer at Conquest Planning. (Photo: Conquest Planning)



Advisors can toggle between a list of strategies to measure the effects of various decisions on the client’s estate and goals, such as claiming retirement benefits at different ages.

Mark McGrath, a financial planner and portfolio manager at PWL Capital in Canada, said the software saved him time and provided clients with better financial plans than he could create manually.

“I’ve been doing this for a long time — I assure you, no financial planner is going into that level of depth and spending that amount of time on optimization,” McGrath said. “That means clients are not always provided with an optimal financial plan.”

He added, “The analogy I’ve used in the past is that with Conquest, I’m driving a professionally tuned Ferrari, while my competition is getting around on a pair of roller skates.”


Mark McGrath, a financial planner and portfolio manager. (Photo: PWL Capital)



Lotocki estimated that Conquest’s software is used by 50% to 60% of the financial-advisor market in Canada through partnerships with banks and independent advisory firms. The software is available in the US on Pershing X’s wealth-management platform, Wove.

While it’s sophisticated, the software isn’t a direct-to-consumer solution. Lotocki and his associates hope to help make financial-advisory AI accessible to people across the wealth spectrum, perhaps free of charge.

Financial planning 3.0

Many financial advisors are eager to use generative AI as an assistant, but few are ready for it to replace them.

Lo said he believes that a world in which people rely on AI advisors rather than human advisors is within view. But he said a smooth transition would require retraining advisors for new careers, possibly with government support.

“What I worry about, and what I think policymakers need to be really focused on, is if a large body of human employees become displaced in a very short period of time. That could cause tremendous social unrest and dislocation,” Lo said. “So the speed of the displacement is something we need to pay attention to.”




