Generally speaking, the tools that advisors have at their disposal to help clients plan for retirement are much better than they were a few short years ago. With the advent of faster processors and better programming, PCs and Web-based software can now perform complex calculations that would have been possible only on mainframe computers ten years ago. Better yet, most of those calculations can be hidden behind the scenes, so clients do not have to see the raw data unless they want to. This allows the advisor and the client to focus on what is important: the plan itself.
In spite of the advances we've made, retirement planning software is still evolving. Surprisingly, there is no universal standard for performing retirement planning calculations in our industry. As a result, the advisory firm, or in some cases the individual advisor, must decide which methodologies and tools are the most appropriate to use when counseling clients.
Deterministic Models
Ten to 15 years ago, the deterministic or "straight line" model was the norm in the industry. This approach assumes a constant, steady pretax rate of return. In its simplest form, it also assumes a constant tax rate (or constant tax schedule) and a constant inflation rate. For example, you might create a model that shows a 65-year-old client retiring at the end of this year with $1 million in an IRA. If you assume that he will withdraw $50,000 per year beginning at age 66, that he will earn a 6% annual return, that inflation will run at 3% and that his tax rate will be 28%, the client will have exhausted the account by age 84, at least according to a simple calculator on the FINRA Web site.
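To make the straight-line arithmetic concrete, here is a minimal sketch of that projection in Python. The ordering of the steps, and the assumption that the $50,000 is an after-tax need in today's dollars that must be inflated and grossed up for the 28% tax, are modeling choices of this sketch rather than a description of the FINRA calculator, which may handle those details differently.

```python
def depletion_age(balance, annual_need, pretax_return, inflation, tax_rate,
                  start_age=66):
    """Straight-line projection with a constant return, inflation and tax rate.

    Assumes annual_need is an after-tax amount in today's dollars, so each
    year's gross IRA withdrawal is inflated and grossed up for taxes before
    the remaining balance grows at the pretax return. These ordering choices
    are assumptions of this sketch, not a description of any one calculator.
    """
    age = start_age - 1
    while balance > 0:
        age += 1
        years_retired = age - start_age
        gross = annual_need * (1 + inflation) ** years_retired / (1 - tax_rate)
        balance = (balance - gross) * (1 + pretax_return)
    return age

# Inputs from the example in the text; this sketch puts depletion at age 84.
print(depletion_age(1_000_000, 50_000, 0.06, 0.03, 0.28))
```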
The beauty of the deterministic model is that it is simple to explain and simple to use. That makes it a good place to start a retirement planning discussion with clients, and a reasonable starting point for a do-it-yourself investor trying to figure out how much to save for retirement. There are, however, some severe limitations to this approach. The first is that advisors' projections for the average rate of return, inflation and taxes are likely to be wrong. The odds of a do-it-yourselfer getting the projections right are much lower.
Even if you do get all the projections right, however, the averages can be misleading. After all, if you put one arm in the freezer and the other in an oven, your average body temperature may be "normal," but you are not going to feel very comfortable. The same holds true for investment portfolios. Assuming this hypothetical IRA is invested in a diversified portfolio, the returns in some years will be higher than average and in others they will be lower. If the IRA holder experiences the worst annual returns in the first year or two of retirement, he'll see his account depleted well before age 84. Or, to take a more extreme example, if the portfolio declines 50% in year one, the account will still be depleted by the time the account holder reaches age 75, even if it averages 6% over the next ten years with the best returns coming in the final years.
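As a rough illustration of this sequence-of-returns effect, the sketch below applies the same ten hypothetical annual returns, which average 6%, to the same withdrawal plan in two different orders. The return figures, and the inflation and gross-up conventions carried over from the earlier sketch, are assumptions chosen for illustration, not forecasts.

```python
def simulate(balance, annual_need, returns, inflation=0.03, tax_rate=0.28):
    """Apply a specific sequence of annual returns to the withdrawal plan;
    returns the ending balance, floored at zero once the money runs out."""
    for year, r in enumerate(returns):
        if balance <= 0:
            return 0.0
        gross = annual_need * (1 + inflation) ** year / (1 - tax_rate)
        balance = max(balance - gross, 0.0) * (1 + r)
    return balance

# Hypothetical returns averaging 6% a year; same numbers, opposite order.
seq = [-0.20, -0.10, 0.02, 0.06, 0.08, 0.10, 0.12, 0.14, 0.18, 0.20]
print(f"losses first:  ${simulate(1_000_000, 50_000, seq):,.0f}")
print(f"gains first:   ${simulate(1_000_000, 50_000, seq[::-1]):,.0f}")
```

Running the losses first leaves the portfolio with only a fraction of the ending balance it would have if the identical returns arrived in the opposite order, even though the average return is the same in both cases.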
Monte Carlo Simulations
In an effort to improve upon the deterministic model, many advisors now use Monte Carlo simulations. The underlying idea is to have the computer simulate a sequence of portfolio returns over the retirement period and then repeat that simulation hundreds or thousands of times.
In a basic Monte Carlo simulation, you might start with a portfolio value (again, let's say $1 million), a tax rate, an inflation rate, an average return, a withdrawal amount and a life expectancy (the entire period over which you'll be taking annual withdrawals). If you are looking at a 30-year retirement period, you'd have the computer select 30 different annual returns, withdraw $50,000 from the portfolio each year, and then see how long the portfolio lasts. If the portfolio retains some value at the end of the simulated 30 years, that series of returns is deemed a success. If not, it is a failure. If you run this scenario 1,000 times, you can then determine what percentage of the trials succeeded.
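A minimal version of that loop might look like the Python sketch below. Every default here, the 12% standard deviation and the normal-distribution assumption in particular, is an illustrative assumption rather than a recommendation or a description of any commercial package.

```python
import numpy as np

def monte_carlo_success_rate(balance=1_000_000, annual_need=50_000,
                             mean_return=0.06, std_dev=0.12,
                             inflation=0.03, tax_rate=0.28,
                             years=30, trials=1_000, seed=0):
    """Estimate the share of trials in which the portfolio survives `years`
    of inflation-adjusted, tax-grossed-up withdrawals, drawing each year's
    return from a normal distribution (an assumption of this sketch)."""
    rng = np.random.default_rng(seed)
    successes = 0
    for _ in range(trials):
        bal = balance
        for year, r in enumerate(rng.normal(mean_return, std_dev, years)):
            gross = annual_need * (1 + inflation) ** year / (1 - tax_rate)
            bal = (bal - gross) * (1 + r)
            if bal <= 0:
                break
        if bal > 0:
            successes += 1
    return successes / trials

print(f"{monte_carlo_success_rate():.1%} of trials succeeded")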
Advisors are often puzzled when they enter the same data into two different Monte Carlo retirement programs and arrive at very different answers. The reason is that different software packages use different assumptions. One popular assumption is that returns will follow a "normal" pattern; that is, the distribution of returns will resemble a bell-shaped curve. So if one program assumes a normal distribution and another assumes some variation of it, the results will differ. In fact, even if both programs use a normal distribution, the results could differ because they may be using different standard deviations of returns. All other things being equal, the higher the standard deviation, the less likely it is that the plan will succeed.
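To see how much the dispersion assumption alone matters, you can re-run the sketch above with two different, again purely illustrative, standard deviations and compare the success rates:

```python
# Same plan, same 6% average return; only the assumed dispersion changes.
for sigma in (0.10, 0.18):
    print(f"std dev {sigma:.0%}: "
          f"{monte_carlo_success_rate(std_dev=sigma):.1%} of trials succeed")
```

With the higher standard deviation, more of the simulated paths suffer a deep loss early in retirement, so noticeably fewer trials survive the full 30 years.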
Monte Carlo And Risk
One of the criticisms leveled at advisors' Monte Carlo applications, particularly since the recent market meltdown, is that they underweight the likelihood of a significant market decline. In a 2009 article entitled "Déjà vu All Over Again," Paul Kaplan, Morningstar's vice president of quantitative research, looked at "standard models." These models use standard deviation as a measure of risk and they assume returns follow a normal, bell-shaped distribution. According to Kaplan, "If returns follow a normal distribution, the chance that a return would be more than three standard deviations below average would be a trivial 0.135%. Since January 1926, we have 996 months of stock market data; 0.135% of 996 is 1.34-that is, there should be only one or two occurrences of such an event."
But the actual monthly returns of the S&P 500 dating back to January 1926 told a different story. Kaplan found that the index's monthly returns have fallen more than three standard deviations below the average a remarkable ten times in the period under consideration. So the evidence suggests that a significant monthly decline is about eight times more likely than the standard models would lead you to believe.
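The arithmetic behind that comparison is easy to check: under a normal distribution, the chance of a month more than three standard deviations below the mean is about 0.135%, which over 996 months implies between one and two such episodes, against the ten Kaplan observed, roughly seven and a half times the expected count. A quick sketch:

```python
from statistics import NormalDist

# Probability, under a normal distribution, of a monthly return more than
# three standard deviations below the mean, and the expected count over the
# 996 months of data Kaplan cites (January 1926 onward).
p = NormalDist().cdf(-3)              # ~0.00135, i.e., about 0.135%
expected = p * 996                    # ~1.3 expected months
observed = 10                         # Kaplan's count for the S&P 500
print(f"expected {expected:.2f} months, observed {observed} "
      f"({observed / expected:.1f}x the expected count)")
```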