
How to write a good AI prompt for personal finance

Nurfoto | Getty Images

Many Americans are turning to AI for financial advice.

But whether they get good or bad advice largely depends on how well they write their instructions, or prompts, for AI platforms.

“I think there’s a real art and science to prompt engineering,” Andrew Lo, director of MIT’s Financial Engineering Laboratory and principal investigator at the Computer Science and Artificial Intelligence Laboratory, said during a recent web presentation for Harvard University’s Griffin Institute of Arts and Sciences.

Limitations of AI for personal finance

First, experts said, it is important to remember that artificial intelligence has limitations when it comes to financial planning.

AI is generally good at providing high-level overviews of financial issues, for example, why it’s important to diversify investments or why exchange-traded funds may be better than mutual funds in some cases but not others, Lo told CNBC in an interview.

However, it struggles in other areas. Tax planning is a good example, Lo said.

Perhaps counterintuitively, he said that AI is not very good at crunching numbers and making precise financial calculations. While AI can provide general guidance about the types of tax deductions or tax rules people might consider, asking AI to do a numerical analysis of their own taxes is risky, he said.

“When it comes to very, very specific calculations regarding your personal situation, that’s where you have to be very, very careful,” Lo said.

Lo said AI can sometimes give wrong answers due to a so-called “hallucination” of the algorithm.

“One of the things about [large language models] that I find particularly worrying is that no matter what you ask it, it will always come back with an answer that sounds authoritative, even if it isn’t,” Lo said.


This doesn’t mean people should avoid AI completely.

And indeed, many appear to be taking advantage of the technology: 66% of Americans who use generative AI say they use it for financial advice, according to an Intuit Credit Karma survey of 1,019 adults published in September; for millennials and Gen Z, that share exceeds 80%.

According to the survey, approximately 85% of respondents who used GenAI in this way acted on the advice given.

“[People] should use AI for financial planning, but it’s how they use it that matters,” Lo said.

How to write a good AI prompt for personal finance

This is where writing strong prompts can be helpful.

“Even the best model in the world can only do so much if it’s fed bad direction,” said Brenton Harrison, a certified financial planner and founder of New Money New Problems, a virtual financial advisory firm.

Lo said a strong prompt isn’t too broad: It includes enough detail so that the AI can present relevant information to the user.

Consider this example he gave regarding retirement planning.

A bad prompt in this context might be: “How should I retire?” Lo said during the Harvard webinar.

“This is very general,” he said. “Garbage in, garbage out.”

Lo said a better prompt would be: “Assume you are a fee-only fiduciary [financial] advisor. Here are my goals, constraints, tax bracket, situation, assets, risk tolerance and timeline. Give me: one, a base-case strategy; two, the key assumptions; three, the risks; four, what could invalidate this plan; and five, what information you are missing and what you are particularly unsure about.”
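The structure of Lo’s example prompt, a fiduciary framing, the user’s context, and five numbered asks, can be sketched programmatically. Below is a minimal Python sketch that assembles such a prompt; the function name, the field names and the sample values are hypothetical illustrations, not anything from the article.

```python
# Minimal sketch of assembling a structured retirement prompt in the
# style Lo describes. Field names and values are hypothetical examples.

def build_retirement_prompt(details: dict) -> str:
    """Combine a fiduciary framing, the user's context and five asks."""
    context = "\n".join(f"- {key}: {value}" for key, value in details.items())
    asks = [
        "1. A base-case strategy.",
        "2. The key assumptions behind it.",
        "3. The risks.",
        "4. What could invalidate this plan.",
        "5. What information you are missing and what you are unsure about.",
    ]
    return (
        "Assume you are a fee-only fiduciary financial advisor.\n"
        "Here is my situation:\n"
        f"{context}\n"
        "Give me:\n" + "\n".join(asks)
    )

prompt = build_retirement_prompt({
    "Goal": "retire at 65 with $60,000/year income",
    "Tax bracket": "24%",
    "Assets": "$400,000 in a 401(k), $50,000 in savings",
    "Risk tolerance": "moderate",
    "Timeline": "25 years",
})
print(prompt)
```

The same string could then be pasted into any chat interface; the point is simply that the detail lives in one reusable template rather than being retyped each time.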

In this case, the user tells the generative AI program, whether OpenAI’s ChatGPT, Anthropic’s Claude or Google’s Gemini, to frame its recommendation as a fiduciary would. A fiduciary is held to a legal standard that requires a financial advisor to make recommendations in the client’s best interest.

Ultimately, it’s a process of trial and error, akin to a conversation with multiple prompts, perhaps more than 20, until the user gets a satisfactory response, Lo told CNBC.

He said it’s important to double- and triple-check the output, especially when it comes to financial matters.

How to ‘reverse engineer’ a prompt

After going through this series of prompts, users can “shorten” the process for future queries by asking one additional question: “What prompt should I have given you to generate the answer I was looking for?” Lo told CNBC.

Essentially, the user asks the AI how to generate the “right” prompt faster, Lo said.

“Once you have that answer, you can set it aside and use it in the future for questions similar to the one you just asked,” Lo said. “This is one way to make your prompt engineering more efficient: reverse engineering the prompt by asking the AI to tell you what you should do differently.”
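The reverse-engineering step described above amounts to appending one fixed follow-up question to an existing conversation. The sketch below assumes the role/content message format common to ChatGPT-style chat APIs; the conversation contents and function name are hypothetical.

```python
# Sketch: after a long back-and-forth, append Lo's reverse-engineering
# question so the model can summarize the prompt you should have started
# with. Message format mirrors common chat-API conventions; the
# conversation contents are hypothetical.

REVERSE_ENGINEER = (
    "What prompt should I have given you to generate "
    "the answer I was looking for?"
)

def add_reverse_engineering_step(conversation: list[dict]) -> list[dict]:
    """Return the conversation with the reverse-engineering follow-up added."""
    return conversation + [{"role": "user", "content": REVERSE_ENGINEER}]

history = [
    {"role": "user", "content": "How should I plan my retirement?"},
    {"role": "assistant", "content": "(...many exchanges later...)"},
]
history = add_reverse_engineering_step(history)
print(history[-1]["content"])
```

The model’s reply to that final message is the distilled prompt to save and reuse for similar questions later.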

Take an extra step

Lo told CNBC that he recommends taking a few additional steps for financial questions.

Once a user gets a good answer to their question, they should always follow up by asking the AI additional questions to probe its limitations, Lo said, like asking what is unclear and what information is missing.

For example: “What information did you not have in making this recommendation that could have led to unreliable results?”

Or along the same lines: “How confident are you that this is the correct answer? What uncertainties do you have about it, and what do you not know that you would need in order to give a definitive answer?”

This way, the user can reveal the range of uncertainty behind the AI’s answer, Lo said.


Along the same lines, Harrison said he suggests requiring the AI program to list its sources. Users can also instruct the AI to limit its sources to those that meet certain criteria.

“If you don’t ask for verification of sources, it will give you an idea, but that’s not what I’m looking for,” Harrison said.

After all, there is so much context and complexity surrounding each individual’s financial situation that even a human financial planner can extract only so much from a client, Harrison said. Someone using artificial intelligence wouldn’t necessarily know whether their prompts were uncovering all those subtleties, he said.

“Looking to [AI] for advice means assuming you’ve given it enough information to form an opinion and make a recommendation, and that’s a step further than I would go with AI,” he said.
