Rethinking Optimization
We must take a coarse view, both in how we construct the portfolio now and in how we adjust for the expected path ahead. A coarse view is one that takes a lesson from the cockroach. This insect has a very simple defense mechanism. It doesn’t hear, see or smell. All it does is move in the opposite direction of wind hitting little hairs on its legs. It will never win the best-designed-insect-of-the-year award. But it has done well enough to survive even as jungles have turned into deserts and deserts into cities. It survives because its behavior is coarse.

Markets change, and so do clients’ objectives. The lesson of the cockroach rings clear: coarse, robust behavior outlasts fine tuning to conditions that won’t persist.

So, how can a mechanistic optimization engine be rejiggered to accommodate estimation errors, uncertainty about the future, and the dynamics of both the market performance and the client’s objectives?

One approach is to think of optimization in terms of risk factors rather than assets. By risk factors, we specifically mean a security’s exposure to certain sectors and industries, to countries, to market characteristics and to styles like value and growth.
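To make “factor space” concrete, here is a minimal sketch in Python. Each security’s exposures to a few factors sit in a matrix, and the portfolio’s overall exposure is the weighted sum of its holdings’ exposures. The factor names and all the numbers are hypothetical, invented for illustration rather than drawn from any particular model:

```python
import numpy as np

# Hypothetical exposures of three securities to three illustrative
# factors (say, value, momentum and energy). Rows are securities,
# columns are factors; every number here is made up.
B = np.array([
    [0.8, -0.1, 0.0],   # security 1
    [0.2,  0.6, 0.0],   # security 2
    [0.1,  0.0, 0.9],   # security 3
])

w = np.array([0.5, 0.3, 0.2])  # portfolio weights

# The portfolio's factor exposure is the weight-blended sum of its
# holdings' exposures: one number per factor, not one per asset.
portfolio_exposure = B.T @ w
print(portfolio_exposure)
```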

Viewing a portfolio in factor space rather than in asset space has a number of advantages. Risk factors are more stable than assets, so the correlation issues, while still there, are lessened. The factor influences thread through the assets, so the relationships between them are tethered to something more than their historical behavior. And most of the risk can usually be found in a handful of factors as opposed to possibly hundreds of assets. When you can home in on these fewer dimensions, you make the problem cleaner, with a reduced chance that some company like General Mills, behaving just so, takes on an unreasonably dominant role in your portfolio.
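Why can a handful of factors carry most of the risk? Under a standard linear factor model, portfolio variance splits into a factor piece and an asset-specific piece, and the factor piece typically dominates. The sketch below shows that decomposition in its generic textbook form; it is not the MSCI model mentioned later, and the covariance numbers are invented:

```python
import numpy as np

# Reusing the hypothetical exposures B and weights w from above.
B = np.array([[0.8, -0.1, 0.0],
              [0.2,  0.6, 0.0],
              [0.1,  0.0, 0.9]])
w = np.array([0.5, 0.3, 0.2])

F = np.diag([0.04, 0.02, 0.03])  # hypothetical factor covariance
D = np.diag([0.01, 0.01, 0.02])  # hypothetical specific variances

# Linear factor model: asset covariance = B F B' + D, so portfolio
# variance is factor risk plus specific risk.
factor_var = w @ B @ F @ B.T @ w
specific_var = w @ D @ w
share = factor_var / (factor_var + specific_var)
print(share)  # the fraction of total risk the three factors explain
```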

Consider the client’s portfolio and its asset allocations described in Table 1. When compared to the target portfolio, the client’s current portfolio is overweight equities and underweight alternatives. In a naive rebalancing, we might end up substantially reducing the equity holdings and substantially increasing the alternatives. However, if we look at the underlying factors and corresponding risk contributions in the figure, and then rebalance for those, we end up making smaller changes. Because factors thread across assets, one can reach the target risk attribution through these smaller moves.

This suggests another important advantage of rebalancing across risk factors: It reduces the need for large trades. Furthermore, if the risk exposures are already aligned between the target and the client portfolio, we might not need to rebalance at all, avoiding unnecessary trading costs. Note that the original portfolio has no commodity exposure while the target does, and that is reflected in the rebalancing. (In this example, we are using Fabric’s platform, which employs MSCI’s Multi-Asset Class Factor Model Tier 4.)
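To make the “smaller moves” concrete, here is one way such a rebalance could be framed: penalize trading away from the current weights while pulling the portfolio’s factor exposures toward the target’s. This is our generic sketch of the idea using scipy, not the actual Fabric or MSCI implementation, and every input is hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

B = np.array([[0.8, -0.1, 0.0],          # hypothetical factor exposures
              [0.2,  0.6, 0.0],
              [0.1,  0.0, 0.9]])
w0 = np.array([0.5, 0.3, 0.2])           # current weights
b_target = np.array([0.45, 0.15, 0.25])  # target factor exposures

lam = 10.0  # how hard we pull toward the target exposures

def objective(w):
    trade = np.sum((w - w0) ** 2)   # penalize trading away from today
    gap = B.T @ w - b_target        # factor misalignment vs. the target
    return trade + lam * np.sum(gap ** 2)

# Long-only and fully invested.
cons = {"type": "eq", "fun": lambda w: np.sum(w) - 1.0}
res = minimize(objective, w0, bounds=[(0, 1)] * 3, constraints=cons)
print(res.x)        # new weights: small trades, closer factor profile
print(B.T @ res.x)  # resulting factor exposures
```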

Human Plus Machine
Finally, we bring in the human aspect of rebalancing: our experience and common sense.

There are three main aims of rebalancing: One is to build the target portfolio. A second is to keep the portfolio within an acceptable range of the target. And a third is to make adjustments as the market changes or as views of the market change. Whenever we rebalance, we have to stay sensitive to the client’s needs and to the various constraints involved.
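The second aim, staying within an acceptable range of the target, is often implemented with tolerance bands: no trade fires until drift breaches a threshold. A minimal sketch, with hypothetical weights and a hypothetical band:

```python
import numpy as np

target  = np.array([0.60, 0.30, 0.10])  # hypothetical target weights
current = np.array([0.66, 0.26, 0.08])  # hypothetical current weights
band = 0.05                              # acceptable drift per sleeve

drift = current - target
needs_rebalance = np.any(np.abs(drift) > band)
print(drift, needs_rebalance)  # only the first sleeve breaches the band
```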

Let’s move back to the mathematical world, to optimization tools and computers, but now add human judgment, which creates what can be called, in statistical terms, a Bayesian approach. (Though it’s not essential to cast things in a Bayesian framework, it’s comforting to know that bringing in the human element is fair game from a statistical purist’s standpoint.) Here we create a starting point based on our experience and judgment, which in the Bayesian world is called the prior, and then hand it to the computer to make adjustments, resulting in what is called the posterior. One way to do this is to set the target without an optimization machine. If we keep on top of the portfolio, we’ll stray only marginally from the target. And if we start rebalancing for factors rather than assets, we can gain an intuitive understanding of material bias as we look at variations from the target.
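One simple way to write down the prior-to-posterior step is as a shrinkage blend: start from the human-set target (the prior) and move toward the machine’s proposal only as far as our confidence in it warrants. The weights and the confidence parameter below are invented for illustration; this is a textbook form of the idea, not a prescription:

```python
import numpy as np

prior   = np.array([0.55, 0.30, 0.15])  # human-set target (the prior)
machine = np.array([0.40, 0.38, 0.22])  # optimizer's data-driven proposal
tau = 0.3  # confidence in the machine: 0 ignores it, 1 accepts it fully

# Posterior target: shrink the machine's view back toward the prior.
posterior = (1 - tau) * prior + tau * machine
posterior /= posterior.sum()  # renormalize to stay fully invested
print(posterior)
```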

People in the past have taken an abstract view of portfolio optimization, seeing it as a mathematical operation on a collection of assets. But that’s inadequate when we look at how client portfolios are actually set up. An investment portfolio is effectively a set of sub-portfolios, each with a particular mandate, possibly with underlying accounts that have differing objectives. Clients might hold legacy positions or place constraints on the buying or selling of certain assets. Any portfolio construction or rebalancing exercise should at least be aware of these things.
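To illustrate, here is a hypothetical, minimal data model of that structure: sleeves with their own mandates, and legacy positions marked as untradable. Any rebalancing engine should at least be able to see this much:

```python
from dataclasses import dataclass, field

@dataclass
class Sleeve:
    mandate: str                # e.g., "growth" or "income"
    weights: dict[str, float]   # asset -> weight within the sleeve
    locked: set[str] = field(default_factory=set)  # legacy, do-not-trade

@dataclass
class ClientPortfolio:
    sleeves: list[Sleeve]

    def tradable_assets(self) -> set[str]:
        # Rebalancing should respect locked (legacy) positions.
        return {a for s in self.sleeves
                for a in s.weights if a not in s.locked}
```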

Our takeaway: Mean-variance optimization is too blunt a tool. It ignores the essentials of the client’s objectives, and it is difficult to customize to address the multi-dimensional needs of a heterogeneous set of clients. The tools that advisors use need to be flexible enough to account for all these variations.

Today’s computing power, combined with advances in mathematical techniques, can help advisors move from brute force, machine-driven optimization to what we call guided rebalancing. By that we mean it’s guided by the human sense of the baseline and acceptable variations, and by optimization methods that respect the realities of the market and the needs and objectives of individuals.

Rick Bookstaber is the co-founder of Fabric RQ, a technology firm for risk-aware portfolio design. Dhruv Sharma is the head of portfolio intelligence at Fabric RQ.
