Friday, February 25, 2011

Do Target Date Funds Have it Bass Ackwards?

Target date funds, or TDFs as they are sometimes known, are all the rage in defined contribution (DC) plans. Perhaps you have your money invested in one. Before I try to confront conventional wisdom, let me explain how they typically work.

Your plan account is set up on what is known as a glide path. What this means, oversimplified somewhat, is that your funds are invested in a diversified portfolio that is fairly aggressive at younger ages and gets more conservative as it moves along the glide path toward older ages.

I'm not saying that this is wrong, but I'm going to challenge conventional thinking. I did a really simple simulation for 25 years. I assumed that a person started with annual compensation of $50,000, got annual pay increases of 4% per year, deferred 6% of pay into his 401(k) plan each year and got various rates of return on his money for each year. In the case where our hero earned a 10% rate of return in year 1 and 0% in year 25 with intermediate years sloping down smoothly, his account balance at retirement (end of Year 25) was about $182,500. When I assumed an annual rate of return of 5% (never varying), his account balance at retirement was nearly $223,500, and when I reversed the first scenario by grading upward from 0% to 10%, his account balance at retirement was just shy of $272,000.
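The arithmetic behind these three scenarios can be sketched in a few lines of Python. The post doesn't spell out a deposit-timing convention, so the beginning-of-year deferral assumed here is my choice and the exact dollar figures will differ slightly from those above; what matters is the ordering of the three outcomes.

```python
def ending_balance(returns, start_pay=50_000.0, raise_rate=0.04, defer_rate=0.06):
    """Accumulate 401(k) deferrals over len(returns) years.

    Deferrals are assumed deposited at the start of each year and earn
    that year's return -- a timing convention not specified in the post,
    so exact figures will differ slightly from the ones quoted.
    """
    balance, pay = 0.0, start_pay
    for r in returns:
        balance = (balance + pay * defer_rate) * (1 + r)
        pay *= 1 + raise_rate
    return balance

years = 25
declining = [0.10 - 0.10 * t / (years - 1) for t in range(years)]  # 10% down to 0%
flat = [0.05] * years
rising = declining[::-1]                                           # 0% up to 10%

# Same average return in each case; only the sequence differs.
for label, rets in [("declining", declining), ("flat 5%", flat), ("rising", rising)]:
    print(f"{label:>9}: ${ending_balance(rets):,.0f}")
```

The rising sequence wins because the highest returns land in the years when the account balance is largest.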

But, the whole philosophy behind TDFs is to sacrifice potential reward for decreased risk as a participant approaches retirement. These illustrations imply that this is all wrong. Why? Well, the closer you get to retirement, the more money you have in your account. So, in Year 1, it probably doesn't make much difference what your rate of return is, as there is a limit on what can happen to a $3,000 deferral. But as deferrals and earnings add up, the annual rate of return becomes much more important.

This is too simplified for you, isn't it? Well, I made it more complicated. I developed multiple scenarios using a random number generator where the rates of return in Years 1 through 5 averaged 6%, in Years 6 through 10, they averaged 5.5%, in Years 11 through 15, they averaged 5.0%, in Years 16 through 20, they averaged 4.5%, and in Years 21 through 25, the average return was 4.0%. The mean ending account balance of all the simulations was about $192,000. Then, I reversed the process so that the 5-year average returns graded upward from 4% in the first 5 years to 6% in the last 5 years. Now, the simulated mean ending account balance was about $225,000.
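A minimal sketch of that random exercise, assuming normally distributed annual returns. The 8% standard deviation, 5,000 trials, and deposit-timing convention are illustrative choices of mine, not details from the original runs, so the means will land near but not exactly on the figures quoted above.

```python
import random

def ending_balance(returns, start_pay=50_000.0, raise_rate=0.04, defer_rate=0.06):
    """Accumulate beginning-of-year deferrals over len(returns) years."""
    balance, pay = 0.0, start_pay
    for r in returns:
        balance = (balance + pay * defer_rate) * (1 + r)
        pay *= 1 + raise_rate
    return balance

def simulate(mean_by_block, sims=5000, sd=0.08, seed=1):
    """Average ending balances over random 25-year return paths whose
    expected return steps through mean_by_block, one mean per 5-year block."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(sims):
        rets = [rng.gauss(mu, sd) for mu in mean_by_block for _ in range(5)]
        total += ending_balance(rets)
    return total / sims

declining_means = [0.060, 0.055, 0.050, 0.045, 0.040]
rising_means = declining_means[::-1]

print(f"declining means: ${simulate(declining_means):,.0f}")
print(f"rising means:    ${simulate(rising_means):,.0f}")
```

Even with random noise, the paths whose expected returns grade upward end higher on average, because the better returns fall in the years when the balance is largest.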

Come on now. Don't do this to us. Don't mess with conventional wisdom. It hurts the brain on a Friday. What is going on here?

Let's step back and think about the underlying TDF premise. I'm going to point out a huge flaw in that premise (I'd love to be besieged with comments telling me why I am wrong). TDFs are designed so that the participant takes more risk and presumably gets more reward early in his career, and that is gradually reversed to the point where the participant takes less risk and gets less reward later in his career. This makes a big assumption which is usually false.

Read on, MacDuff!

The assumption made in TDF design is that a participant nearing retirement actually has enough money in his account that he no longer needs the upside potential that he needed when he was younger. In other words, it presumes that our participant has nearly enough money in his account to retire on as he nears retirement age, and that only a significant downturn could disturb that.

We've all seen the data. It's just not so. Our participants rarely have enough in their accounts these days to retire on, and they don't understand that. So, this de-risking strategy, while it doesn't make our participants significantly less able to retire when they plan to, doesn't make them able to retire either.

What is the answer? If I could snap my fingers and tell you, I wouldn't need to be sitting here blogging. I'd be out in northern California in my rocking chair looking out the window at my vineyard being tended to by others. But, I'd like to posit that DC plans need to learn something from defined benefit (DB) plan funding methods. I'm not talking about the current farce where all plans with pay-related benefits are forced to use a non-pay related cost method to determine contribution requirements. I'm talking about an old friend, Individual Aggregate.

If I've lost you now, I'm going to bring you back to the story. A long time ago in a faraway land (shortly after the passage of ERISA, in the ivory towers of the Treasury Department), the IRS and Treasury were good enough to tell us in regulations and other guidance what constituted a "reasonable actuarial cost method." Individual aggregate was a favorite for funding one-person plans. It was constructed to produce a cost that was a level percentage of pay throughout a participant's career, with the cost of deviations from assumptions spread over the participant's remaining future working lifetime, so that the retirement benefit would be fully funded at retirement. That's a really neat concept, isn't it?

What does it have to do with DC plans? Well, part of funding a DB plan is (or at least used to be, before Congress started meddling with funding rules) that the actuary makes a bunch of actuarial assumptions related to compensation, retirement date, mortality patterns (life expectancy), rates of return, and other related factors. Couldn't a participant in a DC plan do that?

So, a participant makes assumptions. He decides what percentage of pay he can defer to his plan, and through this fancy concept known as individual aggregate, he gets a necessary rate of return. If that necessary rate of return is too high, he had better defer more or retire later. In any event, this would amount to sound planning.
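Here is a rough sketch of what an individual-aggregate-style calculation might look like for a DC participant: given a current balance, planned deferrals, and a target account balance at retirement, solve for the level annual return that gets there. The participant's numbers in the example are hypothetical, and level deferrals with end-of-year deposits are a simplifying assumption.

```python
def required_return(target, current_balance, annual_deferral, years, tol=1e-6):
    """Bisection solve for the level annual return that grows
    current_balance plus end-of-year deferrals to target."""
    def ending(r):
        bal = current_balance
        for _ in range(years):
            bal = bal * (1 + r) + annual_deferral
        return bal

    # Assume the target is reachable within this bracket of returns.
    lo, hi = -0.5, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if ending(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical participant: $100,000 saved, $6,000/yr deferrals,
# 20 years to retirement, aiming for $500,000.
r = required_return(500_000, 100_000, 6_000, 20)
print(f"necessary rate of return: {r:.2%}")
```

If the solved rate comes back implausibly high, the sound-planning answer is exactly the one above: defer more or retire later, and re-solve.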

Yes, for those who start early, defer a lot, and get a nice inheritance, TDFs will work well. Those participants will be well prepared for retirement by their pre-retirement years, and for them, downside risk avoidance will be paramount. But data shows that most participants are not in that situation. For them, I think we need to re-think TDFs.

Tell me why I am wrong. Tell me why I am right. Tell me something.


  1. Hi John - Richard Faw here - hope you're well. First, I enjoy reading your posts. Always thought-provoking so thanks for putting these out.

    On to my comments here. I've spent a fair amount of time analyzing TDFs, including running deterministic and stochastic projections. I actually think TDFs are a great improvement over the previous "state-of-the-art" in 401(k) planning, which was essentially the balanced fund (if you exclude managed accounts, which in my opinion really are by far the best approach). The reason for this is that participants rarely rebalance their portfolios through time, so portfolios are never managed to changing risk tolerance. I think your point, though, is that it's fallacious to assume a declining risk tolerance, either because participants may not save enough up front and/or because subsequent returns may be below expectations.

    The challenge I see with this reasoning though is that, first, although you don't have a homogeneous population, TDFs essentially require you to make that assumption because there's only one TDF for all investors (in a particular age cohort). So you therefore have to assume the fund's investors in aggregate defer the appropriate amounts or they don't. If you assume they DO defer the appropriate amounts through time, then a declining risk tolerance is appropriate - the risk there is that some investors in the TDF won't be prepared for retirement because they didn't defer enough. If you assume they DON'T save enough over time, then the fund will need to assume greater risk to hope to get better returns to cover the shortfall in deferrals - the risk there is that you expose the participants who planned appropriately to great risk of losing more money right before retirement. So, neither scenario is perfect, but I think the first risk is the better one to take.

    I think there's an added dimension here also. The closer you get to retirement, the shorter the investment time horizon becomes. With a shorter time horizon, the lower the confidence that the expected rewards from assuming investment risk will be realized. To me, taking risk with too short of a time horizon is not investment, it's speculation - it may pay off or it may not, but it's not a good bet to make. That leads me again to taking less risk as the investing time horizon shortens near retirement.

    Thoughts on this?

  2. Richard, good to hear from you, and I'm glad that you enjoy my posts.

    I don't disagree with you. You've said it differently than I have, but I think we would agree that for those who have saved appropriately, a GOOD target date fund is perhaps the closest to the state of the art that the retirement community has had. On the other hand, for those who have not saved appropriately, TDFs are not likely to close the gap as an individual nears retirement.

    I agree that betting the farm on risky investments in hope of the huge payoff as retirement nears is speculation as compared to investment. At the end of the day, my point, and I hope that I have expressed it appropriately, is that workers need to save more and save earlier, and that defaulting ignorant workers into TDFs, while it may be better than any other default generally available, is certainly not nirvana.

    Again, thanks for reading and thanks for commenting.