Q3&4 2016
IQ
ACTIVE
QUANT GOLDEN AGE
Smart Data to Big Alpha?
HIGH-DEFINITION
RETURNS
Turning up the
Factor Signal
MARCH OF
THE MACHINES
Deep Learning for
Better Alpha
IQ
Q3&4 2016
IQ magazine provides the most relevant thought
leadership from State Street Global Advisors.
ACTIVE
The combination of big data
and artificial intelligence
is transforming how active
managers identify new
factor signals to build
better alpha models.
PUNCTUATED EQUILIBRIUM
Like shifts in evolutionary biology, the combination of factor investing and new technology is transforming active management.

WHEN LESS IS MORE
When it comes to active factor investing, the quality of factors is far more important than the number of factors in alpha models.

BACK IN THE SPOTLIGHT
Olivia Engel makes history as the first quant and first woman to win Australia's Blue Ribbon Award.

FACTOR TIMING
While timing factors is notoriously difficult, we look at promising results when timing is applied to multifactor Smart Beta strategies.

RISE OF ARTIFICIAL INTELLIGENCE
Dramatic breakthroughs in machine learning offer promising potential for building better alpha models.

APPLYING THE FACTOR LENS
By readjusting unintended factor bets in equity allocations, investors can target better risk-adjusted returns.
Q3&4 2016 | The New Active
PUNCTUATED
EQUILIBRIUM
And the Golden Age of
Active Quant Management
In the last issue of IQ,1 we discussed
how factor investing is disrupting traditional
active management and raising the bar on
managers to show how much of their return
is true, skill-based alpha.
RICK LACAILLE
Chief Investment Officer
State Street Global Advisors (SSGA)
SSGA’s
First Active
Quant Strategy
SSGA’s
First Smart
Beta Strategy
Launch of the
S&P 500 Index
1976 1984
1993
Punctuated Equilibrium
I
n this issue we focus on the ways
in which the combination of
unprecedented amounts of data
and advances in artificial intelligence
(AI) in our increasingly connected
world is helping active managers
pursue new sources of uncorrelated
alpha in rigorous, systematic ways.
We feature the perspectives from
our Active Quantitative Equity team
on their approach to specifying and
testing factors to improve the models
they use to generate alpha as well as
how our Investment Solutions Group
is using the factor lens to build more
resilient portfolios.
Given the recent performance
difficulties of many active managers,
including hedge fund managers, it
might seem strange to talk about a
“golden age” of active management.
But we do think we are at an important
inflection point in the industry as factor
investing contributes to the extinction
of certain kinds of active managers
whose factor exposures can be captured
more cost-efficiently through Smart
Beta strategies. At the same time, we
expect the advent of new data, tools and
technology will give rise to a new species
of active managers, increasingly looking
at investment opportunities and risks
through a factor lens.
We compare this transformative change
in the industry to a somewhat esoteric
phenomenon in evolutionary biology
called punctuated equilibrium.

[Timeline: SSGA's Global Defensive Equity Strategies2]

For those of you unfamiliar with the concept,
it caused quite a stir in 1972 when the
late, great American paleontologist and
evolutionary biologist Stephen Jay Gould
of Harvard University, along with his
colleague Niles Eldredge from Columbia
University, challenged Darwin’s
traditional view of gradual evolutionary
development. The two scientists argued
that the fossil record showed virtually
no evidence of evolutionary gradualism.
Instead, it seemed that long periods
of stasis or evolutionary equilibrium
dominated the history of most fossil
species until there were quite dramatic
shifts or “punctuations” when step
changes occurred that transformed
our world.
History of Investing
We believe it is not too far-fetched to
apply a similar theory of development
to our own investment management
industry today. For many years,
investing was based on an eclectic
combination of analysis, forecasting
and beliefs that was hard to reconcile
with evolving academic thinking.
Markets experienced periods of
irrational exuberance and corrections,
but the investment process remained
fairly static. One could argue that the
first moment of punctuated equilibrium
in investment management came with
the creation of market-cap weighted
indexes in the 1970s.
For most of the 20th century, active
investors regarded the entirety of their
return as having been achieved through
the application of skill. The advent of
the Efficient Market Hypothesis (EMH)
and the Capital Asset Pricing Model
(CAPM) suggested that the return
should properly be split between that
part due to exposure to the overall
equity market, and a residual that we
would expect to be on average zero. This
reframing of active management raised
the bar considerably for managers and
gave rise to the trillions-large passive
investment phenomenon.
Active managers had to adapt and
provide excess return beyond those
benchmarks. Thus ensued the era of
traditional active management where
managers tried to beat the index
primarily by fundamental processes
of company-focused security selection.
Over time, however, quantitative-based
managers joined the industry, forging
the path toward more systematic ways
of understanding the drivers of risk
and return with the first generation
of computer-aided analysis and
behavioral finance insights.
[Timeline 2008 / 2010 / 2016: Active Quant AUM $27B3; Smart Beta AUM $88B3; SSGA's First Smart Beta Fixed Income Strategy]
Age of Factor Investing
More than 40 years on from the creation
of market-cap weighted indexes, we
think we are now on the cusp of another
punctuation that could have far more
dramatic implications for investing.
The beginnings of this shift have been
driven by the insights factor investing
has brought to light, even though
the initial research into factors goes
back nearly as far as the first cap-weighted indexes.
Factor investing provides a powerful
lens for understanding the drivers of
risk and return beyond traditional
asset class categories. As the tools for
disaggregating investment returns
have become more widespread, we have
seen how Smart Beta strategies have
challenged traditional active managers
and, increasingly, alternatives managers.
Smart Data, Big Alpha
That factor-based process of natural
selection will likely weed out many
traditional active managers as well as
some high-priced hedge fund managers
at the same time that the combined
forces of big data and technology will
favor data-driven active managers
discovering new anomalies and
dislocations. In this new age, the lines
between fundamental and quantitative
active managers will become
increasingly blurred as both come to
embrace the new tools and technology.
We believe the successful active
managers of the future will be able
to incorporate the best elements of
both approaches. While it has become
commonplace to marvel at the amount
of data our new “Internet of Everything”
age is throwing off, it is worth reminding
ourselves of the order of magnitude by
which the information we can apply to
our investment process is growing.
The purported 12 exabytes4 of data
the US intelligence community is
storing in their enormous center in the
desert of Utah might send shudders of
apprehension down the spine of
American citizens until they realize
that Google’s data centers reportedly
have up to 15 exabytes stored (or the
equivalent of around 30 million personal
computers).5
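A quick sanity check on those magnitudes (a back-of-the-envelope sketch; the 500 GB-per-PC figure is our assumption, not the article's):

```python
# Our arithmetic, not the article's: assume a typical personal computer
# holds a 500 GB drive.
EXABYTE = 10**18                   # one quintillion bytes

google_bytes = 15 * EXABYTE        # ~15 EB reportedly stored by Google
pc_bytes = 500 * 10**9             # assumed 500 GB per PC

equivalent_pcs = google_bytes // pc_bytes
print(f"{equivalent_pcs:,} PCs")   # 30,000,000 PCs, matching the article
```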
And the volumes of new data continue
to boggle the mind: every minute, 7.8
million videos are viewed; more than
3.3 million searches are entered; 151
million e-mail messages are sent; and
more than 436,000 tweets are posted!
Of course, all sorts of data sets have
been around for a while now. The
difference is the new assortment of
tools and artificial intelligence (AI)
technology for processing the data.
And indeed, this information overload
could actually create new scarcity in the
form of knowledge differentiation and
interpretation. This is the opportunity
facing active managers.
With real-time data literally pulled
from the ether, we may be able to assess
companies and markets far more quickly
and with more granularity than ever
before. All of this could be turned into
a new class of investable information,
heralding a new golden age of quant-driven active management.
Golden Age of Active
Quant Management
One way to understand how this
age of big data and AI will affect our
industry is to consider some of the
theoretical principles underpinning
active management. According to these,
active managers’ performance is thought
to be a function of three
Big Data Per Minute
Assessing companies and markets with real-time data presents an exciting opportunity for active managers: a new class of investable information.

7.8 Million Videos Viewed
3.3 Million Searches Entered
151 Million Emails Sent
436K Tweets Posted

Source: Internet Live Stats (cited by World Wide Web Consortium), September 2016.
basic drivers: the accuracy of their
forecasts; the number of independent
forecasts they can tap into; and the
degree to which they can transfer
those insights to their portfolios, given
the constraints of the markets they
operate in and imposed upon them
by client mandates.
Quant-driven processes will likely
come to play a larger role in active
management at the same time that
society at large will become more
comfortable with such processes
for everything from choosing a
restaurant and a movie to rebalancing
investment portfolios.
Skilled active managers can add
value on each of these dimensions by:
embedding proprietary refinements
into otherwise well-known return
drivers; becoming more effective
and consistent in processing publicly
available information at the specific
company level; transporting data from
one context to another to forecast
changes of direction in industry or
company dynamics; discovering
factors that are not in Smart Beta
strategies or otherwise in the public
domain; identifying better ways of
blending these elements; varying
exposures to return drivers over time
(recognizing that their effectiveness
will ebb and flow); and making
improvements to risk modelling and
portfolio construction.
But we still believe the most
successful firms will be able to
incorporate the best elements of
quant and fundamental approaches,
as human judgment will still play an
important role.
Steps like these can improve existing
forecasts and diversify them with
additional views — helping to ensure
that portfolios are driven by forces
over which managers have skill,
while mitigating the effects of forces
beyond their control. As the domain
of skill expands, areas that were once
considered unpredictable can shrink;
dimensions that had to be constrained
can become forecastable elements that
can be captured as additional sources
of alpha.
In short, expanding the manager’s
skill, the number of forecasts and
the transferability of those insights
into the portfolio bodes well for
active performance. Actionable
insights from big data and AI promise
to enhance all three drivers of active
managers’ performance.
The Active Quant
Shop of the Future
So what does this new golden age of
active quant management mean for
our industry? We think it has distinct
implications for what the preconditions
for success will be. Access to data and
the tools to harness that data will be
more important than ever.
Those asset management firms that
have already made the necessary
investments in data and technology
will have an edge.
DATA
Arms race for new and
better structured data sets
Of course there will also be new risks
in this new world in the form of data
highway crashes, which managers will
need to mitigate. The new golden age
of active quant management will also
require a new kind of talent. Graduates
in data science are likely to be relatively
more attractive to the industry than
graduates in economics or traditional
finance. Asset management will
become much more of a technology
industry than it is already, and it will be
competing with the Googles, Facebooks
and Amazons of the future for the same
kind of talent.
And what becomes of the golden age
when machine learning advances to the
stage where human portfolio managers
are no longer necessary, when we as an
industry reach the same point as the
joke about the fully automated factory
of the future with two employees: a man
and a dog. The man’s job is to feed the
dog. The dog’s job is to prevent the man
from touching any of the automated
equipment. Will we reach that “I, Robot”
state where artificial intelligence renders
humans in investment management
obsolete? Perhaps that day will
arrive, but we would argue that the
technological singularity will have likely
subsumed humans across all industries
and endeavors by that point anyway,
so the question will be moot — and a
truly dramatic example of punctuated
equilibrium will have ensued!
TECHNOLOGY
Rise of deep learning algorithms to harness new and existing data

TALENT
Competing with the Googles and Amazons of the future for tech-savvy talent

1 State Street Global Advisors, "The New Investment Reality," IQ, Q1&2 2016.
2 Strategy names effective as of October 1, 2016. Formerly named Managed Volatility Alpha Strategies.
3 State Street Global Advisors assets under management as of 3/31/2016.
4 An exabyte is a unit of digital information. One exabyte equals one quintillion (1,000 to the 6th power, or 10^18) bytes of information.
5 Richi Jennings, "NSA's Huge Utah Datacenter: How Much of Your Data Will It Store?" Forbes, July 26, 2013; Colin Carson, "How Much Data Does Google Store," Cirrus Insight, November 18, 2014.
WHEN LESS IS
MORE
INVESTMENT DISCUSSION
The Virtues of Rigorous
Factor Selection
With interest in factor investing as an alpha generator
growing, investors are keen to know how quant managers
specify and incorporate factor signals into their investment
models. This is especially important as the supply of data
explodes and managers need to have a rigorous process for
separating true signals from noise. SSGA’s Deputy CIO
Lori Heinel sat down with Vladimir Zdorovtsov, who heads
global research for our Active Quantitative Equity (AQE) team,
to discuss the research-based framework the team uses to
select factors for their models. When it comes to active factor
investing, Vlad says, the quality of the factors is far more
important than the number of factors used in a model.
LORI HEINEL, CFA
Chief Portfolio Strategist, SSGA
VLADIMIR ZDOROVTSOV, PhD
Managing Director, Active Quantitative Equity, SSGA
LORI
What's the investment philosophy behind the alpha models you build?

VLAD
Our philosophy is three-pronged. First, we believe markets are not efficient. This inefficiency stems from irrational behaviors and systematic biases, coupled with market frictions or speed bumps that distort the process of price discovery and may also lead to transient mispricings that we might be able to harness.

The second prong speaks to how we try to unearth these opportunities. We believe our alpha ideas need to have a strong economic rationale and that we need to rigorously vet these ideas with methodical and careful empirical testing.

Thirdly, we strongly believe that whichever opportunities we find, the best way to tap into them is to be systematic and process-driven and to apply them broadly.

LORI
How does that fit into the broader category of factor investing, and how does AQE's approach to factor investing compare with the way our Smart Beta team uses factors?

VLAD
I think of a factor more broadly as a systematic decision criterion, a way to compare multiple investment opportunities using the same rule. There are a number of dimensions along which you can differentiate factors. For example, you can think of them as being transparent versus non-transparent. So in the case of Smart Beta, they are fully
transparent. Anyone can replicate
those factors because they are all in
the public domain. In the case of active
managers, the factors are proprietary,
based on insights specific to a given
manager, which are closely guarded.
Those insights are meant to deliver
enhancements to what might otherwise
be well-known phenomena or harness
something completely new, and to
generate genuine alpha on top of
what commoditized factors might
be capturing.
Another dimension would be cross-sectional versus time series. A decision
rule may be systematically deployed
to compare multiple opportunities at a
specific point in time. So, for example,
I can compare stock A to stock B at a
given point in time. Alternatively, I can
use a systematic rule to make investment
decisions over time — for example, to
vary my allocation to value or to make
the portfolio more or less defensive. One
can think of these systematic decision
rules as time series factors. To the extent
such rules may be in the public domain
and/or are implemented transparently,
they are fair game for Smart Beta. We are
beginning to see Smart Beta providers
make inroads here. I think our approach
in AQE is materially different from
Smart Beta since we reflect proprietary
enhancements that go above and beyond
what a public domain version of a factor
would capture. Moreover, we may have
factors that don’t have any equivalents
in the public domain. Another important
dimension in factor investing is the
difference between explicit and implicit
factors. Smart Beta focuses on explicit
factors but in many instances, an active
manager may be doing something that
is implicitly a factor exposure. Say, for
example, I am an active value manager
selecting underpriced stocks, or a
growth manager looking for growth
stocks. Underneath these allegedly
stock-specific exposures, there is often
still a factor bet. I may kick the tires by interviewing a company's management or do some forensic accounting. But behind all of this there are systematic rules for comparing companies. So if I ask similar questions to multiple companies and management teams, and I determine that this particular company seems to be better run or its accounting seems to be cleaner, underneath my comparison there is still a factor. I'm still using some rule of thumb or decision rule, but it will be an implicit rather than an explicit factor bet.

How do you think about incorporating these kinds of factor signals into your alpha models, and why are there variations across quant managers in the number of factors they use?

To answer that, you need to understand why rigor is so important in building models. To identify a genuine factor, you first have to know what factors are capturing. If you delve deeply into why these factors exist in the first place, it becomes fairly obvious that there are not so many different anomalies, human misbehaviors, risk premia or market frictions. There is a limited number of these opportunities. They may manifest themselves differently, at different points in time or in different contexts and be measured or approximated in different ways. However, the number of drivers underlying the predictability of returns is relatively low.

You have to be extremely careful when trying to capture those drivers, because their explanatory power will be relatively weak even under the best of circumstances. You must adhere to a strict economic rationale in terms of the thoroughness of your vetting process. This starts at the ideation stage, which should involve a great deal of scrutiny and debate within the team, making sure that the intuition behind the idea is robust, all the way through a battery of structured in-sample testing and to the out-of-sample corroboration of the findings to ensure you haven't inadvertently overfitted the data.
In terms of the variation in how
parsimonious different managers’
models are, there are several reasons
for this. The first may be merely optical
or cosmetic. In some cases, one manager
may refer to a single factor, whereas
other managers might divide that into
several subcomponent factors, but
they are still capturing the same idea.
Secondly, managers might inadvertently
include redundant factors that are
already nested in or subsumed by
the other model components.
When you’re considering a new
candidate factor that shows promise
on its own, there are several possible
outcomes once it is plugged into the
existing process. Ideally you find that it
is truly orthogonal and merits inclusion
in the model. Secondly, you might find
that it is already subsumed or nested in
one of the model’s existing components
and thus redundant. A third possibility
is that the candidate factor is a better
version of what you already have and
it should displace the existing factor.
The point is that managers can end
up with more factors than they really
ought to have if they are not thorough
and methodical when looking for
improvements to their process. We
believe you need to have a properly
high bar for including a factor, as well
as a process for revisiting and refreshing
what is in the existing model. As you
add new signals, you may have to remove
others, which some managers may not
do. We believe that adversely affects
the efficiency of the model.
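One common way to check whether a candidate factor is orthogonal, subsumed, or a better version of an incumbent is a spanning regression of the candidate's returns on the existing factors; a significant intercept (alpha) suggests genuinely new content. A minimal sketch on synthetic data (our illustration, not SSGA's actual test):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy monthly factor returns: an incumbent momentum factor ("CONS") and
# a candidate ("DISP") constructed here to be largely nested in it.
T = 240
cons = 0.02 * rng.standard_normal(T)
disp = 0.8 * cons + 0.01 * rng.standard_normal(T)  # little independent content

# Spanning regression: disp_t = alpha + beta * cons_t + eps_t.
# A significant alpha would mean DISP adds return beyond CONS.
X = np.column_stack([np.ones(T), cons])
coef, *_ = np.linalg.lstsq(X, disp, rcond=None)
resid = disp - X @ coef
sigma2 = resid @ resid / (T - 2)
se_alpha = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[0, 0])
t_alpha = coef[0] / se_alpha
print(f"beta on CONS: {coef[1]:.2f}, alpha t-stat: {t_alpha:.2f}")
# By construction alpha is pure noise, so |t| should be small: the
# candidate is subsumed and would not enter the model.
```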
What about the possibility of spurious factor signals?

That is the most troublesome concern. Again, without sufficient rigor, a manager can include a spurious factor signal that is really just a fluke. If
you apply brute force to the data without
any modicum of economic intuition,
you will find many false positives.
Such blind torturing of data into
submission is probably fairly infrequent
among more careful managers. What
tends to happen more often is that you
may have some underlying economic
intuition but it is not prescriptive
enough. If you allow too much wiggle
room in letting the data speak for itself,
you may still find something that looks
efficacious, but just by chance.
The economic rationale needs to be
carefully analyzed in the context of the
data to ensure you have not arrived at
the results by happenstance. If your
economic rationale is pointing to some
underlying relationship, you should
test to validate that this economic
relationship is indeed driving the
behavior. For example, if you have some
intuition about investor disagreement
driving overpricing and subsequently
leading to lower returns, you should
be able to observe that across a
number of different ways of measuring
disagreement. If you look at 10 different
ways of measuring this and nine of
these support the hypothesis and one
doesn’t, most likely the last one is a
false negative. On the other hand, if
you look at 10 ways and only one points
in the right direction, then that is likely
a false positive.
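The 9-of-10 versus 1-of-10 intuition can be made concrete with a simple binomial calculation, under the simplifying (and admittedly unrealistic) assumption that the 10 measurement variants are independent and each has a 5% false-positive rate:

```python
from math import comb

def prob_at_least(k: int, n: int, p: float) -> float:
    """P(at least k of n independent tests look significant by chance),
    with per-test false-positive rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# If the idea were pure noise, each of 10 measurement variants would
# have roughly a 5% chance of spuriously "working".
p_nine_of_ten = prob_at_least(9, 10, 0.05)  # essentially impossible by luck
p_one_of_ten = prob_at_least(1, 10, 0.05)   # ~40%: a single hit proves little
print(f"9+ of 10 by chance: {p_nine_of_ten:.1e}")
print(f"1+ of 10 by chance: {p_one_of_ten:.2f}")
```

In practice the variants are correlated, so these numbers are only directional, but they show why broad corroboration matters more than a lone significant result.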
It is important to look at not just
whether a variable is predicting returns
but to drill into why it is predicting them.
If disagreement leads to overpricing
because of short-selling constraints,
this effect should be more salient among
stocks where — and at times when —
those constraints are more binding.
Similarly, if you conjecture that a given factor predicts returns because it predicts earnings, you should actually look to see if it is predicting returns only because it is predicting earnings. If it is predicting returns without predicting earnings, then it is probably a fluke.
So in our rigorous framework it is not
enough for a factor signal to appear
to be working, it has to be working for
the right reasons.
Can you give an example
of a factor signal you
thought was promising
but proved redundant
or spurious?
An interesting
example was when
we tested how a
new sentiment
factor might work
in our model.
We called this new sentiment factor
“disposition” after the famous
disposition effect rooted in the seminal
Prospect Theory work of the Nobel
prize-winning behavioralist Daniel
Kahneman and his colleague, the late
Amos Tversky. Essentially the
disposition effect says that investors
have a greater propensity to sell
winning stocks and an aversion to suffer
losses by selling declining stocks,
resulting in an underreaction to good
and bad news. So this factor had a great
deal of elegant theory and strong
conceptual appeal behind it, as it
seemed to get directly at the core
drivers of investor underreaction.
As we see in the accompanying table,
the disposition factor (DISP) did
work well when measured in isolation
(the green dots signify a statistically
significant positive effect).
Raising a High Bar for Factor Selection

The table below shows why it is important to test factors together with existing elements in an alpha model, across a broad universe of markets. When a new momentum factor (the DISP or "disposition" signal) was tested on its own ("univariate"), it showed statistically significant positive effects (low p-values) across a multitude of different stock universes, though not as many as the model's existing momentum factor (the CONS or "consistency" signal). However, when the disposition signal was tested together with all the other existing alpha model components ("multivariate") across the same stock universes, the effect was no longer statistically significant at conventional levels (high p-values in the last column), and therefore not included in the model.
| Stock Universe | Univariate CONS IC* % (p) | Univariate DISP IC* % (p) | Multivariate CONS IC* % (p) | Multivariate DISP IC* % (p) |
| --- | --- | --- | --- | --- |
| ACWI IMI | 4.1 (0.00) | 3.4 (0.00) | 2.0 (0.00) | 0.7 (0.16) |
| WLD IMI | 3.7 (0.00) | 2.8 (0.01) | 2.1 (0.00) | 0.4 (0.31) |
| WLD STD | 1.8 (0.05) | 0.8 (0.25) | 1.3 (0.02) | -0.8 (0.21) |
| WLD SMALL | 4.2 (0.00) | 3.5 (0.00) | 2.2 (0.00) | 0.9 (0.15) |
| EM IMI | 4.2 (0.00) | 3.7 (0.00) | 1.3 (0.01) | 0.8 (0.15) |
| NA IMI | 3.3 (0.00) | 1.5 (0.12) | 1.8 (0.00) | -0.7 (0.25) |
| EU WLD IMI | 3.2 (0.00) | 2.5 (0.03) | 1.4 (0.00) | -0.6 (0.24) |
| APEXJP WLD IMI | 6.3 (0.00) | 5.6 (0.00) | 3.7 (0.00) | 2.3 (0.01) |
| JP IMI | 3.1 (0.01) | 2.2 (0.09) | 1.8 (0.00) | -0.2 (0.43) |
| NA STD | 1.8 (0.09) | 0.4 (0.39) | 1.3 (0.04) | -1.5 (0.12) |
| EU WLD STD | 0.7 (0.29) | -0.1 (0.46) | 0.3 (0.30) | -1.2 (0.16) |
| APEXJP WLD STD | 2.5 (0.04) | 2.2 (0.08) | 1.5 (0.07) | 0.5 (0.35) |
| JP STD | 1.7 (0.11) | 1.4 (0.23) | 0.4 (0.36) | -1.2 (0.20) |
| RUSSELL2000 | 3.2 (0.00) | 1.3 (0.11) | 1.9 (0.00) | -0.9 (0.12) |
| EU WLD SMALL | 4.3 (0.00) | 3.8 (0.00) | 1.7 (0.00) | 0.0 (0.49) |
| APEXJP WLD SMALL | 8.1 (0.00) | 7.1 (0.00) | 5.1 (0.00) | 3.0 (0.00) |
| JP WLD SMALL | 3.3 (0.00) | 1.9 (0.11) | 2.2 (0.00) | -0.4 (0.36) |
| US | 3.3 (0.00) | 1.3 (0.13) | 1.9 (0.00) | -0.8 (0.23) |
| AU | 4.4 (0.00) | 5.9 (0.00) | -0.2 (0.43) | 1.9 (0.07) |
| CA | 5.8 (0.00) | 5.0 (0.00) | 2.0 (0.01) | 0.6 (0.31) |
| UK | 2.3 (0.03) | 1.5 (0.16) | 1.3 (0.04) | -0.9 (0.24) |
| EAFE | 1.8 (0.06) | 1.0 (0.23) | 1.2 (0.04) | -0.6 (0.28) |
| EAFE SMALL | 5.1 (0.00) | 4.8 (0.00) | 2.7 (0.00) | 1.5 (0.04) |

Source: SSGA Active Quantitative Equity Research Team. Data sample reflects the time period 1/1997-12/2013. For illustrative purposes only.
* Information Coefficient. In the original layout, each p-value was also flagged with a colored dot ranging from strongest to weakest significance.
However, when we combined it with
our existing momentum factor
(“consistency” or CONS), and controlled
for other components within our
process, we found that our existing
consistency factor remained strong,
while the disposition factor was no
longer significant. Those results were
broadly consistent across a variety of
testable stock universes, which is a
critical part of the vetting process. So
while it looked attractive in isolation,
it did not pass our bar for including
in the model.
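The IC* reported in the table is the Information Coefficient, conventionally computed as the cross-sectional rank (Spearman) correlation between factor scores and next-period returns, quoted in percent. A minimal sketch (our illustration of the conventional definition):

```python
import numpy as np

def information_coefficient(scores, fwd_returns):
    """Rank IC: cross-sectional Spearman correlation between factor
    scores and next-period returns (the table quotes this in percent)."""
    rs = np.argsort(np.argsort(scores))
    rr = np.argsort(np.argsort(fwd_returns))
    return np.corrcoef(rs, rr)[0, 1]

# A signal that orders stocks exactly like their subsequent returns
# earns the maximum IC of 1.0; the table's single-digit percentages
# correspond to values of roughly 0.007 to 0.081.
scores = np.array([0.1, 0.5, 0.3, 0.9, 0.7])
fwd = np.array([0.01, 0.03, 0.02, 0.05, 0.04])
ic = information_coefficient(scores, fwd)
print(ic)  # 1.0
```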
I should mention though that while
the disposition factor was not additive
in this case, we are actually looking
at an improved version of the signal
which addresses a possibly
questionable assumption that all shares of a stock's float are equally
likely to change hands on a given day.
So we may well add a revised version
of this factor to our process in
the future.
How does the onslaught of ever more data and tools in this age of big data and artificial intelligence affect the search for genuine alpha signals?

As active managers, what we are trying to predict is ultimately a function of information,
so the more relevant information we
have, the better. We already see multiple
new data sets capturing the underlying
behaviors of market participants and
economic agents in ever more granular
ways. These should enable us to better
capture future earnings and other
drivers of returns.
Doesn’t it increase the risk of
incorporating noise or spurious
signals into your model?
It does indeed, which is why we believe a strict economic rationale will become
even more important in this age of
big data. That rationale will guide the
important decisions around how the data
needs to be structured; otherwise you’ll
find there are multiple ways of coalescing
the data into some predictive signal. So
rigorous parsimony will become even
more important in selecting factors.
On a positive note, as more of this
data becomes available, it will allow
us to reflect the true fundamentals of
a company in a more comprehensive
and direct way. To that extent, prices
will become increasingly reflective
of intrinsic value and markets will
become progressively more efficient.
What will that mean for the search for alpha, and how do you see the future of factor investing evolving from here?
We already see an arms race for access to data,
technology and talent, as active managers become
better at gleaning insights from the data.
The unknowable part of what drives
variations in returns is likely to shrink.
But even in a world where a combination
of big data and deep learning tools
will lead to a better understanding of
a company’s fundamentals, the risk
embedded in those fundamentals,
and the market’s pricing of both of these,
there will always be room for active
management. That is because markets
can never be efficient without someone
making them so. This goes to the heart of
the Grossman-Stiglitz paradox whereby
for markets to be efficient there needs
to be some minimal level of inefficiency.
And the role of human judgment and
skill will continue to be key, even as
the data and tools evolve. The tools
might get “smarter,” but you’ll still need
human input to know which tools to
use where and how to apply them.
Whatever the future holds, I believe
the quality of people, tools and data
will continue to be the differentiating
element in active management. It is
these three areas we think clients
should focus on when evaluating
managers, rather than the narrow
and possibly misleading metrics of
supposed model complexity.
Back in the
SPOTLIGHT
ACTIVE QUANT
Olivia Engel makes history
as the first quant and first
woman to win Australia’s
Blue Ribbon Award.
MANAGER PROFILE
Olivia Engel Head of Active Quantitative Equity, Asia-Pacific, SSGA
Olivia Engel did not begin her
career in asset management as a quant,
but over time she came to favor taking
a precise and systematic approach
to constructing stock portfolios.
“I saw the benefits of taking emotion out
of stock-picking and instead focusing
on building a rigorous model that would
buy, hold or sell stocks according to
objective criteria.”
That discipline has worked well for her.
Engel was recently awarded the Blue
Ribbon Award1 for the best Australian
large cap strategy, recognition for quant
managers at a time when the broad
category has struggled. It was the first
time that a woman and a quant manager
had won the award. Engel, who heads
quantitative equities for SSGA in the
Asia-Pacific region, believes that active
managers need to show clients that they
are adding value for their fees. “We know
that Smart Beta strategies are raising the
bar on active managers to demonstrate
that they are providing more than what
an investor could achieve by tilting to
some well-known factor premia. The
bar should be raised, so I welcome it.”
Based in Sydney, Engel joined
SSGA in 2011 and is responsible for
investment management and research
for Australian and Pacific ex-Japan
active equity portfolios. The Australian
large cap strategy managed by her
team (including three fellow portfolio
managers and one research analyst) is a
low volatility defensive strategy focusing
on the dual objectives of delivering
strong returns and reducing drawdown
risk, without using the performance
benchmark as an anchor to portfolio
construction. “In this lower-for-longer
return environment, investors need
far more time to recover from portfolio
drawdowns, which is why we are
so focused on managing the volatility
of the strategy.” The Blue Ribbon Award
research partner Morningstar described
the strategy as a credible option for
investors seeking core equity exposure.
“While we manage the Australian
version of this strategy,” Engel says,
“the same philosophy, process and stock
selection models underpin all of our
active quant equity strategies around
the world. For all the benchmark-unaware strategies, the goal is the same:
target the best possible return and a
meaningful reduction in the volatility
of the investment.”
I saw the benefits of
taking emotion out of
stock-picking and
instead focusing on
building a rigorous
model that would buy,
hold or sell stocks
according to
objective criteria.
Engel agrees that quant managers seem
poised on the cusp of a promising period
as better data sets and tools should help
them harness new sources of alpha. She
says it is important for managers to be
intentional about the factor bets they
have in their portfolios. “This year in
particular,” she says, “we’ve seen cases
where making the wrong or unintended
factor bets have undermined an active
manager’s stock-picking talents.”
Reflecting on her distinction, Engel
is quick to credit her colleagues for
their contribution to all aspects of the
investment process. As the first woman
to receive the prestigious award, she
wonders why it took so long. She believes
successful investment teams need to
have a high degree of diversity to avoid
groupthink and ensure they are drawing
on a wide universe of ideas. She assigns
team diversity the same importance
she gives to portfolio diversification,
and sees no reason why the number
of female portfolio managers should
not continue to rise.
Prior to joining SSGA, the Australian
native spent eight years at GMO as
a Senior Portfolio Manager in the
Australian Equities group, responsible
for portfolio management across all
Australian strategies. She also worked
for Colonial First State Global Asset
Management and Commonwealth
Investment Management as a portfolio
manager and has been working in
the investment industry since 1997.
She received a Bachelor of Economics
and a Bachelor of Commerce from
the Australian National University in
Canberra and earned the Chartered
Financial Analyst designation, serving
as a past president of the CFA Society
of Sydney. In her spare time she likes
to sing with her chamber choir, make
music with her daughters and hear live
music as much as possible, which she
says provides a welcome counterpoint to
days spent examining data sets to find
that next alpha-generating gem.
1
The Smart Investor Blue Ribbon Award, assessed by
Morningstar, was awarded to the State Street Australian
Equity Fund on August 19, 2016. The award recognised
the fund as ‘the best strategy’ for end investors in the
prevailing market environment over 1- and 3-year return
and risk outcomes.
The Quest to Harness
Cyclicality for Better
Risk and Return
FACTOR
JENYA EMETS, CFA
Managing Director, Active Quantitative Equity, SSGA
KISHORE KARUNAKARAN
Portfolio Strategist, Active Quantitative Equity, SSGA
Factor Timing
One of the hallmarks of SSGA’s active quantitative investment process is maintaining consistent exposures to factors with durable, long-term value. At the same time, empirical evidence pointing to the inherent cyclicality of factors has fueled efforts to try to time them to improve both return potential as well as the ability to manage drawdown risk. The difficulty of timing factors has been well-documented, given the uncertainty of exogenous elements affecting their behavior and the complexity of the underlying relationships. However, we believe at the margin it is possible to time certain elements that can add value and improve outcomes. Aside from adding breadth to the investment process by including another dimension to an active manager’s views, we believe trying to forecast factor pay-offs is a critical element in helping to reflect changes in the macro environment and account for the time-varying performance of factors. The following discussion looks at the kinds of systematic elements we believe are needed to time factors effectively. We describe how we applied the timing model developed by our Active Quantitative Equity team to universally defined factors, the same as those used in our Smart Beta strategies, to improve risk-adjusted returns.
Building a Factor Timing Framework
Factor premia are composed of three main components:
• Compensation for exposure to risk
• Return potential from irrational market participant behavior
• Effects of systematic and structural market frictions, such as market circuit breakers or restrictions on short selling
Each of these may be affected by different drivers at different times. For example, as market risk appetite waxes and wanes, the compensation for bearing an exposure to a risk embedded within a factor will move accordingly. Similarly, the extent to which markets overreact, underreact, or display other irrational behavior leading to systematic mispricings will vary over time, as will the degree to which market frictions slow down or distort price discovery. Understanding the interactions of these dynamics is key to forming expectations of a factor’s future pay-off.
We believe an effective timing framework should try to specify some common drivers of factor returns, notably:
• Factor valuation
• Factor persistence
• Macroeconomic cycle phase
• Risk sentiment
Just as an investor would expect a cheaper security to outperform an expensive one, or a recent winning stock to continue to outperform a recent loser, the same applies to a factor, to the extent that its return is driven by securities based on that factor.
Figure 1
Timing Value With Valuation Spreads
[Chart: valuation spread (%, left axis) plotted against the information coefficient of value (right axis), May 1991 to November 2015.]
Source: Universe: MSCI World Standard Index. Past performance is not an indication of future results. Returns do not represent those of an index but were achieved by mathematically combining the actual performance data of index-member stocks arranged and re-weighted according to their value ranking. The performance assumes no transaction and rebalancing costs, so actual results will differ. Index returns reflect capital gains and losses, income, and the reinvestment of dividends.
To illustrate this kind of cyclicality,
we plotted the information coefficient
(IC) of the value factor against
valuation spreads in Figure 1 over
a 27-year period. The IC of value
demonstrates the average power of
the factor on a 12-month horizon.
Valuation spreads measure the
difference in book-to-price between
the cheapest and the most expensive
value basket and can indicate when
a factor becomes cheap compared
to its history.
For example, as cheaper stocks
get cheaper and more expensive
stocks continue rising, valuation
spreads get wider and the value
factor underperforms. At the same
time, when this theme starts to
look cheaper, the opportunity set
increases. When spreads widen and
cheap stocks fall well below their
fair value, market participants start
looking for value opportunities and
the factor begins to outperform again.
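For illustration only (this is not SSGA’s methodology), the two measures discussed above can be sketched in a few lines of Python on synthetic data; the quintile breakpoints, the return-generating model and the 12-month horizon are all assumptions of the example:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Synthetic cross-section: book-to-price for 500 stocks and noisy
# forward 12-month returns that mildly reward cheapness.
n = 500
book_to_price = rng.lognormal(mean=-0.5, sigma=0.6, size=n)
fwd_return = 0.2 * np.log(book_to_price) + rng.normal(0.0, 0.2, size=n)

# Valuation spread: difference in book-to-price between the cheapest
# (top quintile) and most expensive (bottom quintile) value baskets.
q20, q80 = np.quantile(book_to_price, [0.2, 0.8])
spread = (book_to_price[book_to_price >= q80].mean()
          - book_to_price[book_to_price <= q20].mean())

# Information coefficient: rank correlation of the factor scores
# with subsequent returns over the chosen horizon.
ic, _ = spearmanr(book_to_price, fwd_return)

print(f"valuation spread: {spread:.2f}, IC of value: {ic:.2f}")
```

On real data, both series would be recomputed each month to produce time series like those plotted in Figure 1.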
An important caveat to this
relationship is the risk of rotating
into value too early and withstanding
some underperformance ahead
of the factor’s recovery. While it
can be argued that it is better to be
in too early than too late, we also
recognize the importance of adding
other dimensions to factor timing to
strengthen predictive power.
Understanding where we are in the
macroeconomic cycle and the degree of
risk sentiment are two other important
parts of a robust timing framework.
These will create important headwinds or tailwinds for various factors, as market
regimes shift across recession, recovery,
boom periods and slowdowns. For
example, while value stocks might be
expected to fare better during economic
recoveries, quality defensive stocks
would be rewarded during recessionary
and “risk-off” conditions. Sentiment-linked trending factors might do better
during lower volatility regimes when
market-leading stocks are not changing
as much.
While these general factor timing
principles may seem reasonable, the
reality of factor performance is far more
nuanced because the underlying causal
links are time-varying. For example,
compare the performance of brick-and-mortar–oriented value stocks struggling
during the boom of the 1990s with
value’s excellent performance during
the recession that followed. Similarly,
value stocks with high credit exposures
suffered during the global financial crisis
but showed strong performance during
the economic recovery, as shown in
Figure 2.
Risk exposures embedded within
sentiment factors may also vary over
time in a more predictable manner,
with risk-on sentiment coinciding with
a prolonged market rally and becoming
defensive after a bear market. As risk
appetite changes, the implications for
Figure 2
Performance of Value Stocks Across Different Macro Regimes
[Chart: value factor performance (%) with recession periods shaded, February 1995 to October 2015.]
Past performance is not an indication of future results. Returns do not represent those of an index
but were achieved by mathematically combining the actual performance data of index-member
stocks arranged and re-weighted according to their value ranking. The performance assumes no
transaction and rebalancing costs, so actual results will differ. Index returns reflect capital gains
and losses, income, and the reinvestment of dividends.
Figure 3
Rolling 1 Year Correlation Between Axioma Momentum and Volatility
[Chart: rolling one-year momentum/volatility correlation, December 1997 to June 2016.]
Source: SSGA Active Quantitative Equity Research team.
a given factor will depend on how it is
exposed to a particular risk at a certain
point in time.
Furthermore, these risk effects do not
occur in isolation. The changes in risk
on/risk-off sentiment can both reflect
and affect the shifting expectations
of the macroeconomic environment.
Similarly, whether a factor is expensive
or cheap, or has performed well or
poorly recently — these dynamics
will interact with, and affect, other
potential predictors. In other words,
an effective timing model needs to
reflect the dynamism of the drivers of
factor premia as well as a range of other
possible interaction effects. Last but
not least, care should be taken to reflect
the regional specificity of these factor
drivers while mitigating the risk of
over-tuning model signals to potentially
spurious sampling noise.
Applying Factor Timing to Multifactor Smart Beta Strategies
Those are the central issues we
considered when building our factor
timing model. To test its effectiveness,
we applied our proprietary dynamic
weights to our static multifactor
Smart Beta strategy over an 18-year
time period.
Figure 4 illustrates the theoretical
value added by SSGA’s dynamic factor
timing model to a static, equally
weighted allocation to value, quality,
momentum and low volatility. In
a backtest, the dynamic portfolio
outperformed the static portfolio by
1.08% on an annualized gross basis over
the 18-year period, while the tracking
error versus the MSCI World index
decreased by 0.05%, resulting in an
improvement in the information ratio
from 0.88 to 1.17. Of course, given the
dynamic reweighting of the portfolio,
monthly one-way turnover increased
from 8.24% to 11.29%.1
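As a reference for these statistics: the information ratio is annualized active return divided by annualized tracking error. The active return and tracking error levels below are not reported in the article; they are hypothetical values chosen only to be consistent with the ratios quoted above:

```python
# Information ratio = annualized active return / annualized tracking error.
# The active return and tracking error levels are hypothetical, backed out
# to match the ratios quoted in the text; they are not reported data.
def information_ratio(active_return: float, tracking_error: float) -> float:
    return active_return / tracking_error

static_ir = information_ratio(0.0346, 0.0393)   # rounds to 0.88
dynamic_ir = information_ratio(0.0454, 0.0388)  # rounds to 1.17; TE lower by 0.05%
value_added = 0.0454 - 0.0346                   # 1.08% annualized

print(round(static_ir, 2), round(dynamic_ir, 2), round(value_added, 4))
```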
Understanding the interdependencies
of macroeconomic and market
behavioral influences on factor premia
is indeed at the heart of the active
quantitative process. Moreover, we
believe that advances in big data and
the tools to leverage that data may
improve our ability to more accurately
comprehend and harness the cyclicality
of factors for better outcomes. In the
meantime, we see distinct advantages
in using top-down drivers of factor
timing to add value to the active
investment process by:
• Increasing long-term alpha potential
• Enhancing portfolio diversification
with a dynamic approach to factor
weightings; and
• Improving overall portfolio risk
management by reducing the tail
of drawdown risk.
While the results of the analysis we describe here are promising, it is important to acknowledge the notorious difficulties of timing factors with precision, especially in the short term. It is also important to emphasize that this is only one of many drivers of value in the active process as managers continue to develop their skill in identifying and harvesting alpha sources.
1
As for transaction costs, a proprietary, tiered
transaction cost model was applied during the
performance analysis. The levels of transaction
costs varied across stocks but on average were
about 9 basis points in our simulations.
Figure 4
Applying SSGA’s Factor Timing Model to Smart Beta Factors
[Chart: value added, tracking error, turnover and information ratio for the static and dynamic portfolios.]
Backtest performance is not indicative of the past or future performance of any SSGA offering.
The portion of results through 09/30/2015 represents a backtest of the Dynamic Factor-Timing
model, which means that those results were achieved by means of the retroactive application
of the model which was developed with the benefit of hindsight. All data shown above does not
represent the results of actual trading, and in fact, actual results could differ substantially, and
there is the potential for loss as well as profit. The performance does not reflect management fees,
transaction costs, and other fees and expenses a client would have to pay, which reduce returns.
Please reference the Backtested Methodology Disclosure for a description of the methodology
used as well as an important discussion of the inherent limitations of backtested results.
Figure 5
Cumulative Net Active Return
[Chart: cumulative net active return (%) of the dynamic and static portfolios, January 1997 to September 2015.]
Past performance does not guarantee future results.
THE RISE OF
ARTIFICIAL
INTELLIGENCE
In the Search for Alpha
JEAN-SEBASTIEN PARENT-CHARTIER
Senior Quantitative Research Analyst, SSGA
The enormous amounts of
data generated by “The Internet
of Everything” might be
expanding the potential for active
managers to detect new sources of
alpha. But it is the dramatic advances in
artificial intelligence (AI) technologies
that seem to offer some of the most
promising methods of actually
harnessing the true value in big data.
Recent breakthroughs in artificial
intelligence, particularly in the area
of deep learning, suggest that the
new AI technologies could be poised
to revolutionize research across
any number of fields, including
investment management.
Relentless Research Assistant
Likened to a “relentless research
assistant,” deep learning can accomplish
a wide variety of tasks without human
supervision and learn to recognize
patterns through the act of processing
vast quantities of data. At its core, deep
learning originates from the work on
neural networks dating back to the
1950s, the dawn of artificial intelligence.
Although these early neural networks
showed theoretical promise, the meager
computing power available at the time
severely handicapped their ability to
mimic certain features of intelligence.
Chief among them was layered learning;
that is, absorbing simple concepts first
and then using them to understand
more complex ones. Deep learning
incorporates this critical feature by
building deep neural networks that
drastically reduce the amount of human
intervention and curation required to
develop iterative learning.1
One of the most dramatic examples
of the degree of complexity that deep
learning can now master was the ability
of Google’s AlphaGo program earlier
this year to defeat one of the world’s
best players of Go, an ancient Chinese
board game. The number of possible
board positions in Go is said to exceed
the number of atoms in the universe,
and most experts expected it would take
at least another decade for a computer to
beat an expert human player.2
Accelerated Progress
Given this accelerated progress, it is
not surprising that AI is now driving
millions of dollars of investment in startups and research into a range of possible
applications from strengthening internet
search engines to building self-driving
cars, while at the same time inspiring
visions of both wonder and worry. Worry
to the extent that the new technologies
could displace millions of low-skill
workers around the world more quickly
than they can be retrained for higher-skill employment. This is a particularly
concerning scenario for many emerging
economies that have relied on global
labor cost advantages for goods and
services and could face what economists
call a “premature deindustrialization”
if automation spreads too quickly.3 But
many historians liken fears about the
potential implications of AI to similar
anxieties 200 years ago about the march
of machines during the Industrial
Revolution, and there are still significant
technological hurdles to be surmounted.
In the meantime, many researchers
are focused on more immediate and
practical applications of deep learning
that will help them better sift through
growing volumes of data to find
actionable insights. We believe that
machine learning can help improve
the ability to forecast returns and
manage risk while also increasing the
productivity of research processes. But
as with all systematic processes, the
quality of the output depends on the
quality of the input; in other words,
“garbage in, garbage out.”
Improving Data Quality
That is why SSGA’s Active Quantitative
Equity team has made machine learning
an important part of our big data
innovation strategy aimed at improving
how we assess the quality of data used
in our models. We believe deep learning
can measurably improve our ability to
detect data anomalies and strengthen
our ability to assess data integrity
quickly and at a scale that was previously
unthinkable with manual processes.
We define a data anomaly as an instance
where an erroneous data item prevents
our investment models from functioning
as expected. These anomalies can arise
from errors such as data obtained in the
wrong currency or unit, stale or missing
data, or extreme data values.
Recent breakthroughs in artificial intelligence, particularly in the area of deep learning, suggest that the new AI technologies could be poised to revolutionize research across any number of fields, including investment management.
Previously, these anomalies were spotted
using manual, hard-coded rules, derived
from common sense or as a result of
lessons learned from previous errors.
This was a resource-intensive exercise
unequal to the huge volumes associated
with big data. Manually investigating
each data item did not scale with such
geometric increases in data volume.
This challenge provided a perfect testing
ground for developing deep learning
algorithms for assessing data properties
and flagging irregularities without
human intervention. This improved data
assessment has direct positive benefits
for the integrity of the alpha models we
use in our investment process.
We focused our work on training a
deep neural network to recognize
anomalies for financial data that are
either directly related or approximately
related. The former refers to financial
data such as prices and returns; return
on equity, earnings and book value;
sales in different currencies; and
book to price. Financial data that are
approximately related include items
such as returns measured over different
time horizons; realized versus forecast
metrics such as earnings, sales and gross
domestic product (GDP).
Training the Algorithm
First we “trained” the algorithm by
feeding it a dataset with 10 million
entries related to financial data with
exact relationships. By processing this
data, the algorithm began to recognize
the mathematical relationships among
the different categories of financial
data. Then we began to add anomalies
to test the algorithm’s ability to
detect those instances where the
mathematical relationships broke down.
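As a toy sketch of this kind of training exercise (not SSGA’s actual system), one can fit a small neural network to a clean, exactly related item, here book-to-price derived from book value and price, and flag entries whose prediction error is large. The data, network size and flagging threshold are all assumptions of the example:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Clean training data: book value, price, and the exactly related book-to-price.
n = 5000
book = rng.uniform(1.0, 50.0, n)
price = rng.uniform(5.0, 200.0, n)
X = np.log(np.column_stack([book, price]))  # log space: log(B/P) = log B - log P
y = np.log(book / price)

net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X, y)

# New data with injected anomalies, e.g. a ratio distorted by a currency error.
m = 200
book_new = rng.uniform(1.0, 50.0, m)
price_new = rng.uniform(5.0, 200.0, m)
btp_new = book_new / price_new
bad = [3, 50, 117]
btp_new[bad] *= 100.0  # simulate a unit/currency error in book-to-price

# Flag entries where the reported ratio disagrees with the learned relationship.
residual = np.abs(net.predict(np.log(np.column_stack([book_new, price_new])))
                  - np.log(btp_new))
flagged = np.where(residual > 1.0)[0]
```

The distorted entries sit far from the learned relationship, so their residuals dwarf those of clean data.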
Figure 1 shows the high degree of
accuracy the algorithm had in detecting
anomalies both large and small. Errors as small as a 0.002 deviation from the correct amount were detected.
While smaller anomalies might be
missed, arguably their downstream
effect on the integrity of the model
outputs will be commensurately less
impactful and likely immaterial.
Obviously training a deep learning
approach to recognize anomalies in
data that are only approximately related
is more difficult. The murkier and less
certain the mapping between different
data items, the foggier our lens is and
the harder it is to see smaller problems.
For example, a deep neural network will
learn that forecasted and realized GDP
growth is similar and may consider a 10%
difference between them as an anomaly,
but it would not consider suspicious a
0.1% difference, even if one of the two
quantities were genuinely wrong. Our
second exercise again used a dataset of
10 million entries to train the algorithm,
but found that anomalies now needed
to be about twice the size of the original
items for the deep neural network to
identify all of them (Figure 2). But
this is still a significant time savings
over combing the data manually for
such defects.
These are just two straightforward
examples of how artificial intelligence
can transform active quantitative
investing. In the future, we believe
that the combination of growing data
sources and improved machine learning
technology may revolutionize an active manager’s ability to identify and harvest
new sources of alpha and open up a whole
new chapter in our industry’s evolution.
1
The advance of deep learning is also challenging a fundamental tenet of data science wisdom which stipulates that pre-treating the data to satisfy machine learning requirements represents almost 80% of the task.
2
Christopher Moyer, “How Google’s AlphaGo Beat a Go World Champion,” The Atlantic, March 28, 2016.
3
“March of the Machines,” a special report on Artificial Intelligence, The Economist, June 25th – July 1st, 2016.
We believe that the combination of growing data sources and improved machine learning technology will revolutionize an active manager’s ability to identify and harvest new sources of alpha and open up a whole new chapter in our industry’s evolution.
Figure 1
Anomaly Detection in Exact Items Relationships
Source: SSGA Active Quantitative Equity Research. As of July 1, 2016. For illustrative purposes only.
Figure 2
Anomaly Detection in Approximate Items Relationships
Source: SSGA Active Quantitative Equity Research. As of July 1, 2016.
Looking through the factor lens at equity allocations may reveal
unintended risks in portfolio exposures. In the current market
environment, for example, many investors who seem to be taking a
diversified approach have ended up with portfolios that are negatively
exposed to value and positively exposed to high volatility. Dane Smith
of our Investment Solutions Group highlights the benefits of monitoring
and understanding such implicit factor bets, which may not be rewarded
in the markets. By allocating to value and low volatility in combination,
investors may align factor exposures toward those premia that have
historically been compensated over the long run — and target
better risk-adjusted returns.
Applying the
FACTOR LENS
DANE SMITH
Investment Strategist
Investment Solutions Group, SSGA
MAKING IT WORK
Case Study
Truth and Consequences
of Concentrated Value
The argument for a concentrated
value allocation is that value managers
tend to take a disciplined approach to
identifying higher quality companies
with lower ratios of price to fundamental
metrics such as book value, cash flow,
dividends, earnings or sales. The
historical evidence for a value strategy
has been compelling, with empirical
studies conducted by many academics
finding that value is among the few
equity factors that have earned better
risk-adjusted returns over the long
term compared to a benchmark index
weighted by market capitalization.
Factor performance can vary under
different market and economic
conditions, however. When the economy
is healthy or recovering, for example,
small cap stocks tend to do well. But
when the business cycle is waning, small
caps are likely to underperform. Quality
and low volatility generally fare better
in tougher market environments but not
as well during market upturns.
Similar to other factors in this regard,
value has been sensitive to market
distress and macro shocks, so its
historical return pattern has shown
short-term cyclicality. Prior to the global
financial crisis starting in late 2008,
securities with attractive valuations
tended to exhibit lower betas and less
volatility than the market overall. Yet
in the search for relative safety after
the crisis, lower beta stocks have moved
out of reach for value approaches (see
Figure 1).
Passively owning cheaper companies
could have the unintended consequence
of exposure to higher volatility. In
order to remain true to the style, value
investors would have to shift their focus
to higher beta companies that were
distressed — with poor balance sheets,
cash flows and income statements — or
otherwise out of favor with the market.
As a result, a concentrated portfolio of
value securities could have higher risk
relative to a cap-weighted benchmark.
Value Factor Risk Decomposition.
To demonstrate this, we can evaluate
risk exposures across a representative
grouping of concentrated value
managers in the United States. This
group comprises the top five strategies
with the highest active exposures
to the value factor as defined by the
Axioma risk model. Considering the
holdings in this representative value
portfolio, we observe that growth and
value are the two largest style factor
deviations from the US Large Cap
universe in Morningstar (see Figure
2). The portfolio is also underweight
momentum, return on equity and large
size (that is, overweight small size),
and overweight dividend yield, market
sensitivity and high volatility. These
results are consistent with our intuition
for fundamental value managers.
When looking through the lens of an
asset allocator, owning a concentrated
value portfolio has merits. Concentrated
managers have higher active risk and
require lower capital allocations to
Figure 1
Valuation of Low Beta Versus High Beta Has Reversed Since 2008
[Chart: median P/B of the lowest (Q1) and highest (Q5) beta quintiles, December 1991 to March 2016.]
Source: MSCI, FactSet, SSGA. Data as of June 30, 2016.
Universe: MSCI World Index. The results shown do not represent those of a singular index, but were
achieved by mathematically combining the actual data of index member stocks arranged and reweighted
according to their beta ranking.
Based on the consistency of its track record over the long run, the value premium may be perceived as evergreen. However, value performance tends to be cyclical, so deciduous may be a more accurate description.
Source: Richard Hannam and Taie Wang, CFA,
“Understanding the Value Premium,” State
Street Global Advisors IQ Insights, May 2016.
impact the factor risk exposures within
their portfolios. As mentioned above,
this representative concentrated
value portfolio does have higher total
risk and higher beta relative to the
Russell 1000 benchmark.
Asset allocators may find benefits
in achieving a balance between the
contributions of factor risk and asset
specific risk to overall active risk. Stock
selection could be a meaningful source
of active risk for a skilled fundamental
manager. Taking an active approach and
having a differentiating fundamental
view on each security might help to
generate the risk-reduction benefit
of owning value — and increase the
potential for greater return. But in this
example, the grouping of concentrated
value managers has diversified away
the specific risk.
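The factor/specific split referred to here can be illustrated with a stylized two-factor decomposition; the exposures and covariances below are invented solely for the example and do not correspond to the figures in this article:

```python
import numpy as np

# Stylized decomposition of active risk into factor and stock-specific parts:
#   active variance = x' F x + s
# where x holds active factor exposures, F is the factor covariance matrix,
# and s is the diversified asset-specific variance. All numbers are hypothetical.
x = np.array([0.40, -0.25])            # active exposures: value, momentum
F = np.array([[0.0004, -0.0001],       # annualized factor covariance matrix
              [-0.0001, 0.0009]])
specific_var = 0.0003                  # asset-specific variance

factor_var = x @ F @ x
active_var = factor_var + specific_var
active_risk = np.sqrt(active_var)

factor_share = factor_var / active_var     # share of active variance from factors
specific_share = specific_var / active_var
print(f"active risk: {active_risk:.2%}, factor share: {factor_share:.0%}")
```

A risk model such as Axioma’s performs the same arithmetic over dozens of style, industry and country factors rather than two.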
Low Volatility to the Rescue
Investors seek to capture the low
volatility factor because of the potential
for reduced variability and market-like returns in their portfolios — much
like the traditional rationale for value
investing. By allocating capital to a
low volatility exposure, investors are
generally attempting to obtain downside
protection in falling markets, while also
participating in up markets.
Although the arguments for investing
in these factors are similar, their
Investors in the low vol factor anticipate lower volatility of returns and improved Sharpe ratios over time when compared to a cap-weighted index.
Figure 2
Factor Exposures and Risk Characteristics of a Representative Grouping of Concentrated Value Managers
[Chart: active exposures (roughly -0.4 to 0.8) to Growth, Medium-Term Momentum, Return-on-Equity, (Large) Size, Leverage, Industries, Market, Exchange Rate Sensitivity, Liquidity, (High) Volatility, Market Sensitivity, Dividend Yield and Value.]
Total Portfolio Risk (%): 17.40
Benchmark Risk (%): 16.31
Predicted Beta: 1.04
Active Risk (%): 4.16
Asset Specific Risk (%): 16.49
Factor Risk (%): 83.51
Source: Axioma, FactSet, SSGA. Benchmark is Russell 1000 Index. Data as of December 31, 2015.
Figure 3
Rolling 36 Month Correlations of Q1 Minus Q5 Value to Low Volatility Show Reversal
[Chart: rolling 36-month correlation, November 1989 to November 2015.]
Source: MSCI, FactSet, SSGA. Data as of June 30, 2016.
Universe: US Large Cap Managers in Morningstar. Rolling 36 Month Information Coefficients (IC).
IC is Spearman rank correlation of universe model alpha with 1 month ranked returns. The information
coefficient measure the predictive power of asset return forecasts, and is a correlation ranging between
-1 (weak) and +1 (strong).
correlations show a stark difference
in how value and low volatility factors
have been trading (see Figure 3). Aside
from the technology bubble in the late
1990s, value and low volatility stocks
tended to be highly correlated leading
up to the global financial crisis. But
since then they have decoupled and now
show a significant negative correlation.
Investors who combine exposures
to the value factor with exposures to
low volatility may be able to benefit
from the diversifying effects of this
negative correlation — with the
potential to avoid the unintended
consequences of concentrated value
while smoothing out performance
over the long term.
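The reversal in Figure 3 is straightforward to reproduce in principle. Below is a minimal sketch — not the MSCI/FactSet methodology behind the chart — of a 36-month rolling Pearson correlation between two monthly factor-return series. The data here are simulated, with a negative relationship between the series assumed purely for illustration.

```python
import random

def rolling_corr(x, y, window=36):
    """Trailing Pearson correlation over `window` observations."""
    out = []
    for end in range(window, len(x) + 1):
        xs, ys = x[end - window:end], y[end - window:end]
        mx, my = sum(xs) / window, sum(ys) / window
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sx = sum((a - mx) ** 2 for a in xs) ** 0.5
        sy = sum((b - my) ** 2 for b in ys) ** 0.5
        out.append(cov / (sx * sy))
    return out

# Simulated monthly returns: a Q1-minus-Q5 value spread and a low
# volatility series built to be negatively related to it (an assumption).
random.seed(0)
value_spread = [random.gauss(0.002, 0.03) for _ in range(120)]
low_vol = [-0.5 * v + random.gauss(0.003, 0.02) for v in value_spread]

corrs = rolling_corr(value_spread, low_vol)
print(len(corrs), round(sum(corrs) / len(corrs), 2))
```

With 120 months of data and a 36-month window, the function returns 85 overlapping correlations; plotted through time, a series of this kind is what Figure 3 shows for realized factor returns.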
Q3&4 2016 | The New Active
27
Applying the Factor Lens
Figure 4
Factor Exposures and Risk Characteristics of a 50% Concentrated Value and 50% Low Volatility Portfolio
[Bar chart: active exposures, roughly -0.3 to 0.4, across Value, Dividend Yield, (High) Volatility, Industries, Liquidity, Market, Exchange Rate Sensitivity, Leverage, Return-on-Equity, (Large) Size, Market Sensitivity, Medium-Term Momentum and Growth. Total Portfolio Risk 15.20%; Benchmark Risk 16.31%; Predicted Beta 0.92; Active Risk 2.27%. Shares of active risk: Factor Risk 81.03%, Asset Specific Risk 18.97%.]
Source: Axioma, FactSet, SSGA. Benchmark is Russell 1000 Index. Data as of December 31, 2015.
Combined Factor Risk
Decomposition. To investigate this
premise, we return to the factor risk
decomposition analysis and consider
these same attributes when allocating
50% to concentrated value and 50% to
low volatility strategies. Following the
same methodology as described above,
the grouping of low volatility managers
comprises the top five strategies with the
highest active risk exposures to the low
volatility factor. The combined portfolio
maintains overweight exposures to
value, dividend yield and small size
— all risk premia that we expect to
be compensated in the long term (see
Figure 4). But now the portfolio has zero
exposure to the volatility factor. This
neutral positioning is beneficial because
we believe that over the long run, high
volatility is an uncompensated risk
factor exposure.
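The decomposition itself can be sketched with a generic linear factor model. The code below is a hypothetical illustration, not the Axioma model cited in Figure 4: asset covariance is assumed to be B·F·Bᵀ + D, the 50/50 blend is formed at the weight level, and all exposures, covariances and sleeve weights are invented for the example.

```python
import numpy as np

# Hypothetical factor model inputs (assumptions, not Axioma estimates):
B = np.array([[1.2, 0.8],     # factor exposures of 4 assets to 2 factors
              [0.9, -0.3],
              [1.1, 0.5],
              [0.7, -0.6]])
F = np.array([[0.04, 0.01],   # factor covariance matrix
              [0.01, 0.02]])
D = np.diag([0.02, 0.03, 0.025, 0.015])  # asset specific variances

w_value = np.array([0.5, 0.5, 0.0, 0.0])   # concentrated value sleeve
w_lowvol = np.array([0.0, 0.0, 0.5, 0.5])  # low volatility sleeve
w = 0.5 * w_value + 0.5 * w_lowvol         # 50/50 combined portfolio

exposures = B.T @ w                  # combined factor exposures
factor_var = exposures @ F @ exposures
specific_var = w @ D @ w
total_var = factor_var + specific_var

print(f"total risk {total_var ** 0.5:.1%}; factor share "
      f"{factor_var / total_var:.0%}; specific share "
      f"{specific_var / total_var:.0%}")
```

The same arithmetic, run with estimated exposures and covariances, is what produces the factor and asset specific risk shares reported in Figures 2 and 4.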
We find evidence of additional risk reduction benefits from the combination of value and low volatility. Total risk and predicted beta are lower, and the active risk of the portfolio has been reduced significantly — the result of diversification between value and low volatility — but still remains high, as desired.
Using global factor indexes, we obtained similar results in the broad market, although the risk benefits were of lesser magnitude than those we measured with the available data across US active managers pursuing value and low volatility strategies.
What this suggests to us is that concentrated active managers could be used to help enhance active risk and achieve greater active exposure to compensated risk factors — as long as we are mindful not to diversify away the specific risk that active management may provide. By combining value with low volatility, we have built a portfolio with positive exposure to value, lower beta and less total risk to the market, which has been a difficult feat to accomplish in today’s markets.
Further Reading
Mike Sebastian and Sudhakar Attaluri, “Conviction in Equity Investing,” The Journal of Portfolio Management, Summer 2014, pp. 77–83.
Barry Glavin, CFA, and Brian Routledge, CFA, “Cheap for a Reason — Finding Value in Uncomfortable Places,” State Street Global Advisors IQ Insights, June 2015.
Aligning Factor Exposures with Compensated Risk Premia
Once again, we see the usefulness
of looking through the factor lens
to understand the risk in an equity
portfolio — whether or not it was
intended. Constructing a diversified
equity allocation focused on valuation
has been shown to reduce asset specific
risk and introduce unintended factor
exposures. Combining a concentrated
fundamental value approach with
an active low volatility strategy in a
thoughtful way may help to better
align the portfolio’s factor exposures
with compensated risk premia, as well as lower predicted beta and total risk. In our view, this analysis
demonstrates the benefit of combining
and balancing factors to provide stability
and increase return potential across
all market conditions.
For investment professional use only.
Not for public use.
Smart Beta Multi-factor Strategy
While diversification does not ensure a profit or guarantee against loss, investors in Smart Beta may diversify across a mix of factors to address cyclical changes in factor performance. However, factors may have high or increasing correlation to each other.
Smart Beta Strategies
A Smart Beta strategy does
not seek to replicate the performance of a specified
cap-weighted index and as such may underperform
such an index. The factors to which a Smart Beta
strategy seeks to deliver exposure may themselves
undergo cyclical performance. As such, a Smart Beta
strategy may underperform the market or other Smart
Beta strategies exposed to similar or other targeted
factors. In fact, we believe that factor premia accrue
over the long term (5-10 years), and investors must
keep that long time horizon in mind when investing.
The back-tested performance shown in Figure 4 was
created by the SSGA Active Quantitative Equity Team.
The historical back-test was performed using data as
available at the historical point in time to eliminate any
survivorship bias.
SSGA’s Dynamic Factor Timing Model was back-tested in the third quarter of 2015 using data from January 1997 – September 2015.
The results shown do not represent the results of
actual trading using client assets but were achieved by
means of the retroactive application of an investment
process that was designed with the benefit of
hindsight, otherwise known as back-testing. Thus,
the performance results noted above should not
be considered indicative of the skill of the advisor
or its investment professionals. The back-tested
performance was compiled after the end of the period
depicted and does not represent the actual investment
decisions of the advisor. These results do not reflect
the effect of material economic and market factors on
decision making. In addition, back-tested performance
results do not involve financial risk, and no
hypothetical trading record can completely account
for the impact of financial risks associated with
actual investing.
No representation is being made that any client will
or is likely to achieve profits or losses similar to those
shown. In fact, there are frequently significant differences between back-tested performance results and the actual results subsequently achieved by following a particular strategy.
The back-tested performance data is reported on a
gross of fees basis, but net of administrative costs.
Additional fees, such as the management fee, would
reduce the return. For example, if an annualized gross
return of 10% was achieved over a 5-year period and
a management fee of 1% per year was charged and
deducted annually, then the resulting return would be
reduced from 61% to 54%. The performance includes
the reinvestment of dividends and other corporate
earnings and is calculated in US dollars.
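The fee arithmetic in this example can be verified in two lines (illustrative only — the disclosure’s own stated deduction method is assumed):

```python
# 10% gross per year over 5 years; 1% annual fee deducted each year.
gross = (1 + 0.10) ** 5 - 1       # cumulative gross return
net = (1 + 0.10 - 0.01) ** 5 - 1  # compounding at 9% after the fee
print(f"{gross:.0%} gross, {net:.0%} net")  # prints "61% gross, 54% net"
```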
The alpha scores were created using SSGA’s Active Quantitative Equity Team’s proprietary active emerging
markets stock selection model. The risk model data
was based on Axioma’s worldwide medium term
fundamental risk model estimated with data as
available at the historical point in time. Portfolio
construction methodology is similar to that used in
our Emerging Markets Defensive Equity Strategy.
Monthly portfolios were created, and returns
generated based on the results of a buy and hold
strategy over the next month. Transaction costs were
also included in the analysis and assumed to be 100
bps each way. Each component in the stock selection
process — growth, value, sentiment, and quality —
is being implemented in the same manner in which it
was back tested.
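As an illustration of the cost assumption, the sketch below applies a 100 bps each-way charge to a single month’s gross return; the turnover and return figures are invented for the example.

```python
COST_EACH_WAY = 0.01  # 100 bps per side, as assumed in the back-test

def net_monthly_return(gross_return, bought, sold):
    """Deduct 100 bps on the fractions of the portfolio bought and sold."""
    return gross_return - COST_EACH_WAY * (bought + sold)

# A month with a 1.2% gross return and 10% turnover on each side:
print(f"{net_monthly_return(0.012, 0.10, 0.10):.3%}")  # prints "1.000%"
```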
The whole or any part of this work may not be
reproduced, copied or transmitted or any of its
contents disclosed to third parties without SSGA’s
express written consent.
The views expressed in this material are the views of
the SSGA Active Quantitative Equity Research team
through the period ended 9/1/16 and are subject to
change based on market and other conditions. This
document contains certain statements that may be
deemed forward-looking statements. Please note that
any such statements are not guarantees of any future
performance and actual results or developments may
differ materially from those projected.
CONTACTS
AUSTRALIA
HONG KONG
SINGAPORE
State Street Global Advisors, Australia Ltd.
Level 17, 420 George Street
Sydney, NSW 2000
State Street Global Advisors Asia Ltd.
68th Floor
Two International Finance Centre
8 Finance Street
Central, Hong Kong
State Street Global Advisors Singapore Ltd.
168 Robinson Road, #33–01 Capital Tower
Singapore 068912
(ABN 42 003 914 225) is the holder of an Australian
Financial Services Licence (AFSL Number 238276).
T +852 2103 0288
F +852 2103 0200
Company Reg. No: 200002719D
BELGIUM
IRELAND
T +612 9240 7600
F +612 9240 7611
State Street Global Advisors Belgium
Chaussée de la Hulpe 120
1000 Brussels, Belgium
State Street Global Advisors Ireland Ltd.
Two Park Place, Upper Hatch Street
Dublin 2
T +32 (0)2 633 2036
F +32 (0)2 672 2077
T +353 1 776 3000
F +353 1 776 3300
State Street Global Advisors Belgium is a branch office
of State Street Global Advisors Limited. State Street
Global Advisors Limited is authorized and regulated by
the Financial Conduct Authority in the United Kingdom.
State Street Global Advisors Ireland Limited
is regulated by the Central Bank of Ireland.
CANADA
State Street Global Advisors Ltd.
770 Sherbrooke Street West, Suite 1200
Montréal, Quebec, H3A 1G1
T +1 514 282 2400
F +1 514 282 3048
State Street Global Advisors Ltd.
30 Adelaide Street East, Suite 500
Toronto, Ontario M5C 3G6
T +647 775 5900
F +647 775 6800
FRANCE
State Street Global Advisors France
Immeuble Défense Plaza
23–25 rue Delarivière-Lefoullon
92064 Paris La Défense Cedex
T +33 (0)1 44 45 40 00
F +33 (0)1 44 45 41 92
Authorized and regulated by the Autorité des
Marchés Financiers. Registered with the Register
of Commerce and Companies of Nanterre under
the number 412 052 680.
ITALY
State Street Global Advisors Ltd.
Sede Secondaria di Milano
Via dei Bossi 4, 20121 Milan
T +39 02 32066 100
F +39 02 32066 155
JAPAN
State Street Global Advisors (Japan) Co. Ltd.
Toranomon Hills Mori Tower 25F
1–23–1 Toranomon, Minato-ku, Tokyo
105–6325 Japan
T +813 4530 7380
F +813 4530 7364
T +65 6826 7500
F +65 6826 7501
SWITZERLAND
State Street Global Advisors AG
Beethovenstrasse 19
Postfach, CH–8027 Zurich
T +41 (0)44 245 70 00
F +41 (0)44 245 70 16
UNITED ARAB EMIRATES
State Street Bank and Trust Company
(Representative Office)
Boulevard Plaza 1, 17th Floor
Office 1703, PO Box 26838
Dubai, United Arab Emirates
T +971 (0)4 437 2800
F +971 (0)4 437 2818
UNITED KINGDOM
State Street Global Advisors Ltd.
20 Churchill Place
Canary Wharf, London, E14 5HJ
T +44 (0)20 3395 6000
F +44 (0)20 3395 6350
Authorized and regulated by the Financial
Conduct Authority. Registered in England,
Number 2509928; VAT No. 5776591 81.
NETHERLANDS
State Street Global Advisors Netherlands Ltd.
Apollo Building, 7th floor
Herikerbergweg 29
1101 CN Amsterdam
T +31 (0) 20 7181701
F +31 (0) 20 7087329
A branch office of State Street Global Advisors
Limited; authorized and regulated by the Financial
Conduct Authority in the United Kingdom.
UNITED STATES
State Street Global Advisors
State Street Financial Center
One Lincoln Street
Boston, MA 02111–2900
T +1 617 664 7727
F +1 617 664 4024
GERMANY
State Street Global Advisors GmbH
Brienner Strasse 59
D-80333 Munich
T +49 (0)89 55878 100
F +49 (0)89 55878 440
About Us
For nearly four decades, State Street Global Advisors has
been committed to helping our clients, and those who rely on
them, achieve financial security. We partner with many of
the world’s largest, most sophisticated investors and financial
intermediaries to help them reach their goals through a
rigorous, research-driven investment process spanning
both indexing and active disciplines. With trillions* in assets,
our scale and global reach offer clients access to markets,
geographies and asset classes, and allow us to deliver thoughtful
insights and innovative solutions.
State Street Global Advisors is the investment management arm
of State Street Corporation.
* Assets under management were $2.30 trillion as of June 30, 2016. AUM reflects
approx. $40.9 billion (as of June 30, 2016) with respect to which State Street
Global Markets, LLC (SSGM) serves as marketing agent; SSGM and State Street
Global Advisors are affiliated.
ssga.com
© 2016 State Street Corporation. All Rights Reserved.
INST-6900 0816 Exp. Date: 09/30/17