Generative AI tools like ChatGPT and Claude are household names now, but not just because they write emails and perform in-depth web searches. More and more commonly, artificial intelligence is being used to offer financial advice.
According to Experian, about half of Americans say they’ve used generative AI tools to manage or understand finances. Among those individuals, 96% reported positive experiences, and 77% stated they use generative AI for personal financial tasks on a weekly basis.
The appeal is clear: instant answers, tailored suggestions, and a sense of empowerment.
But given industry concerns about AI and the high stakes of investment decision-making, should you be wary of relying on it for financial advice? How savvy (and safe) is generative AI when it comes to your money and investments?
A recent analysis from the Massachusetts Institute of Technology’s Sloan School of Management found that while generative AI can realistically simulate financial logic, the models often behave in opaque and unpredictable ways.
Even when the advice seems sound, generative AI can produce guidance that lacks regulatory oversight, an ethical framework, and, perhaps most critically, any connection to users’ personal and financial lives.
“With AI, you could run a simple table or do a complex Monte Carlo analysis,” says Professor James Mallory at the Rochester Institute of Technology. “It’s not a replacement for an adviser, clearly, but it takes a lot of the heavy lifting and tedious financial calculations away.”
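To give a concrete sense of the heavy lifting Mallory describes, here is a minimal sketch of the kind of Monte Carlo retirement projection an AI tool might run behind the scenes. Every number in it (starting balance, contributions, average return, volatility) is an illustrative assumption, not a recommendation or a forecast.

```python
import numpy as np

# Illustrative assumptions only -- not a forecast or financial advice.
np.random.seed(42)

years = 30                   # investment horizon
simulations = 10_000         # number of Monte Carlo paths
starting_balance = 50_000    # hypothetical starting portfolio
annual_contribution = 10_000 # hypothetical yearly contribution
mean_return = 0.07           # assumed average annual return
volatility = 0.15            # assumed annual standard deviation

# Draw a random yearly return for every simulated path.
returns = np.random.normal(mean_return, volatility, size=(simulations, years))

# Grow each path year by year, adding the contribution at year end.
balances = np.full(simulations, starting_balance, dtype=float)
for year in range(years):
    balances = balances * (1 + returns[:, year]) + annual_contribution

# Summarize the spread of outcomes across all paths.
print(f"Median ending balance: ${np.median(balances):,.0f}")
print(f"10th percentile:       ${np.percentile(balances, 10):,.0f}")
print(f"90th percentile:       ${np.percentile(balances, 90):,.0f}")
```

The point of a simulation like this is the spread, not a single answer: the gap between the 10th and 90th percentile endings is what an adviser (or an AI tool) would use to talk through best-case and worst-case retirement scenarios.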
For this reason, financial institutions are building out their own in-house large language models (LLMs).
For example, JP Morgan is testing its AI-powered solution, Quest IndexGPT; Wealthfront is building out its planning platform, Path; and Amplify offers an AI-driven equity ETF, AIEQ.
Proponents of such tools stress that they are built with robust financial datasets, guardrails, and regulatory compliance, distinguishing them from your run-of-the-mill generative AI models.
However, that still doesn’t mean most people are equipped to tell good AI advice from bad. A Pew Research Center study found that among Americans who consider themselves financially literate, only about 33% gained that knowledge from the internet.
So, where does this leave AI-curious investors?
For now, consider using artificial intelligence to inform your investment planning rather than to make decisions for you. The key is to write smart, targeted prompts that help you explore ideas.
For instance, you can use prompts like the following to understand investment vehicles or to clarify basic strategies:

“What is the difference between an ETF and a mutual fund?”
“How does dollar-cost averaging work, and when does it make sense?”
“What are the trade-offs between a traditional IRA and a Roth IRA?”
You can also use AI to gather context, decode jargon, and evaluate trade-offs. For example:

“Explain what an expense ratio is and why it matters for long-term returns.”
“What are the pros and cons of target-date funds versus building my own portfolio?”
These prompts won’t create a custom strategy, but they’ll prepare you to ask better questions, and make more informed decisions, when working with a qualified financial advisor. AI can even help you find a financial advisor who’s right for you. Try asking something like:

“What questions should I ask when choosing a fee-only, fiduciary financial advisor?”
“What do credentials like CFP and CFA mean, and which should I look for?”
AI can be a helpful assistant, but not a full-blown financial advisor. So do use it to gather ideas, define terms, or compare investment frameworks. Don’t use it to execute a financial plan or retirement investment strategy.
Investors shouldn’t trust AI with their money and financial futures—at least not yet. So when using it for financial advice, seek guidance from a human fiduciary who understands your financial circumstances and can review what you’ve learned using AI.