Oil and Italy were the main themes last month.
Macro Letter – No 97 – 18-05-2018
Robots, employment and the mis-measurement of productivity
The subject matter of this Macro Letter is broad, so I shall confine my investigation to the UK. It was, after all, one of the first countries where services became a larger percentage of GDP than manufacturing. The crossover between manufacturing and services is estimated to have happened around 1881. When Napoleon Bonaparte described England as ‘a nation of shopkeepers’, his intention may have been derisive, but his observation was prescient. Of course, M. Bonaparte was actually quoting Adam Smith, who first coined the phrase in his magnum opus, An Inquiry into the Nature and Causes of the Wealth of Nations, published in 1776: now, he really was prescient.
As we stare into the abyss, anticipating the huge percentage of manufacturing – and now, many services – jobs which are expected to be replaced by machines, it behoves us to begin by reviewing the accuracy with which we measure services in general. A recent paper from the Centre for Economics and Business Research does just that for one sub-sector, suggesting that mis-measurement of economic activity in services, always difficult to define, may be a factor in the poor productivity record of the UK. I have often described Britain as a post-industrial nation, but this research into one of the most vibrant corners of the economy makes fascinating reading – The True Value of Creative Industries Digital Exports – CIC, CEBR – March 2018. It finds, among other things, that: –
The UK’s creative industries exports are: –
£46bn in goods and services – 24% higher than the official figure
£31bn of total creative exports are services – 41% higher than the official figure
£21bn of these creative services are digital services – 40% higher than the official figure
The CEBR goes on to point out other weaknesses in current measurements of economic activity: –
…estimated official figures for 2016 highlight that the majority of creative industries sub-sectors are exporting digital services. The IT, software and computer services sector, for example, exports £8.95bn in digital services. However, according to these figures, the crafts and museums, galleries and libraries sectors’ digital services exports are zero – which we know is not the case.
Many UK YouTube channels, for example, are watched by millions of viewers across the world. It is through these types of platforms that the creative industries export audiovisual content, music, and tutorials. Such platforms and the content they offer, however, may not be registered as a service export. This is due to difficulties capturing data for business models such as those offering free content and based on advertising revenues.
There are also structural challenges with collecting data on such exports. Often, it is difficult for digital intermediaries to determine the point of sale and purchase. The borderless way in which many global firms operate presents additional complications and the origin of the creative content, and of those who consume it, is frequently hard to track.
This brings me to the vexed question of productivity growth in the new machine age. In the Deloitte – Monday Briefing – Thoughts on the global economy – from 30th April, the author reflects on the discussions which occurred at the annual global gathering of Deloitte’s economic experts. I’m cherry-picking, of course; the whole article is well worth reading: –
Despite discussion of recession risks I was struck by a cautious optimism about the long-term outlook. There was a general view that the slowdown in productivity growth in the West has been overstated, partly because of problems in capturing gains from technological change and quality improvements. As a result most of us felt that Western economies should be able to improve upon the lacklustre growth rates seen in the last ten years.
We agreed too that apocalyptic media stories about new technologies destroying work were overcooked; technology would continue to create more jobs than it destroys. The challenge would be to provide people with the right skills to prosper. The question was, what skills? We had a show of hands on what we would recommend as the ideal degree subjects for an 18-year-old planning for a 40-year career. Two-thirds advocated STEM subjects, so science, technology, engineering and maths. A third, myself included, opted for humanities/liberal arts as a way of honing skills of expression, creativity and thinking.
Mr Stewart ends by referring to a letter to the FT from Dr Lawrence Haar, Associate Professor at the University of Lincoln, in which he argues that poor UK productivity is a function of the low level of UK unemployment. In other words, when everyone, even the least productive worker, is employed, productivity inevitably declines:-
…it does not have to be this way. Some economies, including Singapore, Switzerland and Germany, combine low unemployment and decent productivity growth. The right training and education can raise productivity rates for lower skilled workers.
This theme of productivity growth supported by the right education and training is at the heart of a recent paper written by Professor Shackleton of the IEA – Current Controversies No. 62 – Robocalypse Now? IEA – May 2018. The essay cautions against the imposition of robot taxes and makes the observation that technology has always created new jobs, despite the human tendency to fear the unknown: why should the adoption of a new swathe of technologies be different this time? Here is his introduction: –
It is claimed that robots, algorithms and artificial intelligence are going to destroy jobs on an unprecedented scale.
These developments, unlike past bouts of technical change, threaten rapidly to affect even highly-skilled work and lead to mass unemployment and/or dramatic falls in wages and living standards, while accentuating inequality.
As a result, we are threatened with the ‘end of work’, and should introduce radical new policies such as a robot tax and a universal basic income.
However the claims being made of massive job loss are based on highly contentious technological assumptions and are contested by economists who point to flaws in the methodology.
In any case, ‘technological determinism’ ignores the engineering, economic, social and regulatory barriers to adoption of many theoretically possible innovations. And even successful innovations are likely to take longer to materialise than optimists hope and pessimists fear.
Moreover history strongly suggests that jobs destroyed by technical change will be replaced by new jobs complementary to these technologies – or else in unrelated areas as spending power is released by falling prices. Current evidence on new types of job opportunity supports this suggestion.
The UK labour market is currently in a healthy state and there is little evidence that technology is having a strongly negative effect on total employment. The problem at the moment may be a shortage of key types of labour rather than a shortage of work.
The proposal for a robot tax is ill-judged. Defining what is a robot is next to impossible, and concerns over slow productivity growth anyway suggest we should be investing more in automation rather than less. Even if a workable robot tax could be devised, it would essentially duplicate the effects, and problems, of corporation tax.
Universal basic income is a concept with a long history. Despite its appeal, it would be costly to introduce, could have negative effects on work incentives, and would give governments dangerous powers.
Politicians already seem tempted to move in the direction of these untested policies. They would be foolish to do so. If technological change were to create major problems in the future, there are less problematic policies available to mitigate its effects – such as reducing taxes on employment income, or substantially deregulating the labour market.
Professor Shackleton provides a brief history of technological paranoia. Ricardo added a chapter entitled ‘On Machinery’ to the third edition of his ‘Principles of Political Economy and Taxation’, stating: –
‘I am convinced that the substitution of machinery for human labour is often very injurious to the interests of the class of labourers’.
While Marx, writing only a few decades later, envisaged a time when man would be enabled: –
‘…to hunt in the morning, fish in the afternoon, rear cattle in the evening, criticise after dinner… without ever becoming hunter, fisherman, herdsman or critic.’
As for Keynes’s essay on the ‘Economic Possibilities for our Grandchildren’, his optimism is laudable, if laughable – a 15-hour working week, anyone?
The paranoia continues, nonetheless – The Economist – A study finds nearly half of jobs are vulnerable to automation – April 2018 – takes up the story:-
A wave of automation anxiety has hit the West. Just try typing “Will machines…” into Google. An algorithm offers to complete the sentence with differing degrees of disquiet: “…take my job?”; “…take all jobs?”; “…replace humans?”; “…take over the world?”
Job-grabbing robots are no longer science fiction. In 2013 Carl Benedikt Frey and Michael Osborne of Oxford University used—what else?—a machine-learning algorithm to assess how easily 702 different kinds of job in America could be automated. They concluded that fully 47% could be done by machines “over the next decade or two”.
A new working paper by the OECD, a club of mostly rich countries, employs a similar approach, looking at other developed economies. Its technique differs from Mr Frey and Mr Osborne’s study by assessing the automatability of each task within a given job, based on a survey of skills in 2015. Overall, the study finds that 14% of jobs across 32 countries are highly vulnerable, defined as having at least a 70% chance of automation. A further 32% were slightly less imperilled, with a probability between 50% and 70%. At current employment rates, that puts 210m jobs at risk across the 32 countries in the study.
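The OECD’s thresholds lend themselves to a simple sketch. The function and the job counts below are illustrative placeholders, not OECD data; the snippet merely shows the bucketing arithmetic behind the 14%/32% split described above:

```python
# Illustrative sketch of the OECD-style risk classification described above.
# The job counts and probabilities are invented placeholders, not OECD data.

def classify_jobs(jobs):
    """Bucket jobs by their estimated probability of automation."""
    high = sum(n for n, p in jobs if p >= 0.70)          # 'highly vulnerable'
    medium = sum(n for n, p in jobs if 0.50 <= p < 0.70)  # 'slightly less imperilled'
    low = sum(n for n, p in jobs if p < 0.50)
    return high, medium, low

# (employment count in millions, estimated automation probability)
sample = [(100, 0.85), (110, 0.72), (240, 0.60), (250, 0.55), (800, 0.30)]
high, medium, low = classify_jobs(sample)
print(high, medium, low)  # 210 490 800
```

Note that the headline "210m jobs at risk" figure is simply the employment total falling in the highest bucket at current employment rates; everything hangs on the estimated probabilities.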
For a robust analysis, if not refutation, of the findings of Frey and Osborne, I refer you back to Professor Shackleton’s IEA paper. He is more favourably disposed towards the OECD research, which is less apocalyptic in its conclusions. He goes on to find considered counsel in last year’s report from McKinsey Global Institute (2017) A Future that Works: Automation Employment and Productivity.
The IEA paper highlights another factor which makes it difficult to assess the net impact of technological progress, namely, the constantly changing nature of the labour market. As the table below reveals, it has hardly been in stasis since the turn of the millennium: –
Percentage change in employment 2001-2017, selected occupations
Notes: April-June of years. Figures in brackets are April-June 2017 levels of employment.
Source: Author’s calculation from ONS
The job losses are broadly predictable; that technology has usurped the role of the travel agent is evident to anyone who has booked a flight, hotel or hire-car online recently. For economists, there are always challenges in capturing the gains; back in 1987 Robert Solow, a recipient of the Nobel Prize for economics, famously observed, ‘You can see the computer age everywhere but in the productivity statistics’ – perhaps the technology has been creating more jobs than we thought. Does the 170% rise in Animal Care and Control owe a debt to technology? You might be inclined to doubt it, but the 40,000 Uber drivers of London probably do. We are still seeking signs in the economic data of something we know instinctively should be evident.
Between the mis-measurement of economic activity (if creative industries exports are being under-estimated to the tune of 24%, to what extent are productivity gains from technology being underestimated elsewhere?) and the ever-changing employment landscape, I believe the human race will continue to be employed in a wide and varied range of increasingly diverse roles. If some of the more repetitive and less satisfying jobs are consigned to robots and machine-learning computer code, so much the better for mankind. For more on what is sometimes termed the routinisation of work, this working paper from Bruegel – The impact of industrial robots on EU employment and wages: A local labour market approach – April 2018 – is insightful. The authors examine six EU countries, making comparisons with, or highlighting contrasts to, the patterns observed in the US. Their conclusions are somewhat vague, however, which appears to be a function of the difficulty of measurement: –
We only find mixed results for the impact of industrial robots on wage growth, even after accounting for potential endogeneity and potential offsetting effects across different population or sectoral groups.
…We believe that future research on the topic should focus on exploiting more granular data, to explore whether insignificant aggregate effects (on wages) are the result of counterbalancing developments happening at the firm level.
Bruegel refrain from proposing cuts to personal taxation as favoured by the IEA, suggesting that a more complex policy response may be required; their conclusions, however, are only marginally negative. I am inclined to hope that market forces may be allowed to deal with the majority of the adjustment; they have worked well, if history is any guide.
Conclusions and investment opportunities
Ignoring the fact that we are nine years into an equity bull market and that interest rates are now rising from their lowest levels ever recorded, the long term potential for technology remains supportive for equity markets, for earnings growth and for productivity. If history repeats, or even if it simply rhymes, it should also be good for employment.
With interest rates looking more likely to rise than fall over the next few years, companies will remain reluctant to invest in capital projects. Buying back stock and issuing the occasional special dividend will remain the policy du jour. Assuming we do not suffer a repeat of the great financial recession of 2008 – and that remains a distinct possibility – the boon of technology will create employment with one invisible hand as it creatively destroys it with the other (with apologies to Smith and Schumpeter). If governments can keep their budgets in check and resist the temptation to siphon off investment from the productive sectors of the economy (which, sadly, I doubt) then, in the long run, the capital investment required to create the employment opportunities of the future will materialise.
A monthly review of the macro themes and drivers of markets for March 2018
Macro Letter – No 94 – 06-04-2018
What to expect from Central Bankers
As financial markets adjust to a new, higher level of volatility, it is worth considering what the Central Banks might be thinking longer term. Many commentators have been blaming geopolitical tensions for the recent fall in stocks, but the Central Banks, led by the Fed, have been signalling clearly for some time. The sudden change in the tempo of the stock market must have another root.
Whenever one considers the collective views of Central Banks, it behoves one to consider the opinions of the Central Bankers’ bank, the BIS. In their Q4 review they discuss the paradox of a tightening Federal Reserve and the continued easing in US national financial conditions. BIS Quarterly Review – December 2017 – A paradoxical tightening?:-
Overall, global financial conditions paradoxically eased despite the persistent, if cautious, Fed tightening. Term spreads flattened in the US Treasury market, while other asset markets in the United States and elsewhere were buoyant…
Chicago Fed’s National Financial Conditions Index (NFCI) trended down to a 24-year trough, in line with several other gauges of financial conditions.
The authors go on to observe that the environment is more reminiscent of the mid-2000s than the tightening cycle of 1994. Writing in December, they attribute the lack of market reaction to the improved communications policies of the Federal Reserve – and, for that matter, other Central Banks. These policies of gradualism and predictability may have contributed to what the BIS perceive to be a paradoxical easing of monetary conditions, despite the reversal of official accommodation and concomitant rise in interest rates.
This time, however, there appears to be a difference in attitude of market participants, which might pose risks later in this cycle:-
…while investors cut back on the margin debt supporting their equity positions in 1994, and stayed put in 2004, margin debt increased significantly over the last year.
At a global level it is worth remembering that whilst the Federal Reserve has ceased QE and now begun to shrink its balance sheet, elsewhere the expansion of Central Bank balance sheets continues with what might once have passed for gusto.
The BIS go on to assess stock market valuations, looking at P/E ratios, CAPE, dividend pay-outs and share buy-backs. By most of these measures stocks look expensive, though not by all:-
Stock market valuations looked far less frothy when compared with bond yields. Over the last 50 years, the real one- and 10-year Treasury yields have fluctuated around the dividend yield. Having fallen close to 1% prior to the dotcom bust, the dividend yield has been steadily increasing since then, currently fluctuating around 2%. Meanwhile, since the GFC, real Treasury yields have fallen to levels much lower than the dividend yield, and indeed have usually been negative. This comparison would suggest that US stock prices were not particularly expensive when compared with Treasuries.
The authors conclude by observing that yields on EM sovereign bonds in local currency are above their long-term averages. This might support the argument that those stock markets are less vulnerable to a correction – I would be wary of jumping to this conclusion; global stock market correlations may have declined somewhat over the last couple of years, but when markets fall hard they fall in tandem: correlations tend towards 100%:-
Source: BIS, BOML, EPFR, JP Morgan
The BIS’s final conclusion? –
In spite of these considerations, bond investors remained sanguine. The MOVE* index suggested that US Treasury volatility was expected to be very low, while the flat swaption skew for the 10-year Treasury note denoted a low demand to hedge higher interest rate risks, even on the eve of the inception of the Fed’s balance sheet normalisation. That may leave investors ill-positioned to face unexpected increases in bond yields.
*MOVE = Merrill Lynch Option Volatility Estimate
Had you read this on the day of publication you might have exited stocks before the January rally. As markets continue to vacillate wildly, there is still time to consider the implications.
Another BIS publication, from January, also caught my eye: the transcript of a speech by Claudio Borio – A blind spot in today’s macroeconomics? His opening remarks set the scene:-
We have got so used to it that we hardly notice it. It is the idea that, for all intents and purposes, when making sense of first-order macroeconomic outcomes we can treat the economy as if its output were a single good produced by a single firm. To be sure, economists have worked hard to accommodate variety in goods and services at various levels of aggregation. Moreover, just to mention two, the distinctions between tradeables and non-tradeables or, in some intellectual strands, between consumption and investment goods have a long and distinguished history. But much of the academic and policy debate among macroeconomists hardly goes beyond that, if at all.
The presumption that, as a first approximation, macroeconomics can treat the economy as if it produced a single good through a single firm has important implications. It implies that aggregate demand shortfalls, economic fluctuations and the longer-term evolution of productivity can be properly understood without reference to intersectoral and intrasectoral developments. That is, it implies that whether an economy produces more of one good rather than another or, indeed, whether one firm is more efficient than another in producing the same good are matters that can be safely ignored when examining macroeconomic outcomes. In other words, issues concerned with resource misallocations do not shed much light on the macroeconomy.
Borio goes on to suggest that ignoring the link between resource misallocations and macroeconomic outcomes is a dangerous blind spot in macroeconomic thinking. Having touched on the problem of zombie firms, he talks of a possible link between interest rates, resource misallocations and productivity.
The speaker reveals two key findings from BIS research: first, that credit booms tend to undermine productivity growth; and second, that the impact of the labour reallocations that occur during a financial boom lasts for much longer if a banking crisis follows. Productivity stagnates following a credit cycle bust, and the stagnation can be protracted:-
Taking, say, a (synthetic) five-year credit boom and five postcrisis years together, the cumulative shortfall in productivity growth would amount to some 6 percentage points. Put differently, for the period 2008–13, we are talking about a loss of some 0.6 percentage points per year for the advanced economies that saw booms and crises. This is roughly equal to their actual average productivity growth during the same window.
Source: Borio et al, BIS
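The back-of-envelope arithmetic Borio quotes is easy to verify; the figures below are those in the passage, spread across the synthetic ten-year window:

```python
# Back-of-envelope check of the BIS arithmetic quoted above.
cumulative_shortfall_pp = 6.0   # cumulative productivity shortfall, percentage points
boom_years = 5                  # synthetic five-year credit boom
postcrisis_years = 5            # five post-crisis years

annual_drag = cumulative_shortfall_pp / (boom_years + postcrisis_years)
print(annual_drag)  # 0.6 percentage points per year
```

That 0.6 percentage points per year is, as Borio notes, roughly the entire actual average productivity growth of the boom-and-bust advanced economies over 2008–13.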
Borio’s conclusion is that different sectors of the economy expand and contract with greater and lesser momentum, suggesting the need for more research in this area.
He then moves to investigate the interest rate–productivity nexus, believing that the theory that, over long enough periods, the real economy evolves independently of monetary policy – and therefore that market interest rates converge to an equilibrium real interest rate – may be overly simplistic. Instead, Borio suggests that causality runs from interest rates to productivity; in other words, that low interest rates during a cyclical boom may have pro-cyclical consequences for certain sectors, property in particular:-
During the expansion phase, low interest rates, especially if persistent, are likely to increase the cycle’s amplitude and length. After all, one way in which monetary policy operates is precisely by boosting credit, asset prices and risk-taking. Indeed, there is plenty of evidence to this effect. Moreover, the impact of low interest rates is unlikely to be uniform across the economy. Sectors naturally differ in their interest rate sensitivity. And so do firms within a given sector, depending on their need for external funds and ability to tap markets. For instance, the firms’ age, size and collateral availability matter. To the extent that low interest rates boost financial booms and induce resource shifts into sectors such as construction or finance, they will also influence the evolution of productivity, especially if a banking crisis follows. Since financial cycles can be quite long – up to 16 to 20 years – and their impact on productivity growth quite persistent, thinking of changes in interest rates (monetary policy) as “neutral” is not helpful over relevant policy horizons.
During the financial contraction, persistently low interest rates can contribute to this outcome (Borio (2014)). To be absolutely clear: low rates following a financial bust are welcome and necessary to stabilise the economy and prevent a downward spiral between the financial system and output. This is what the crisis management phase is all about. The question concerns the possible collateral damage of persistently and unusually low rates thereafter, when the priority is to repair balance sheets in the crisis resolution phase. Granted, low rates lighten borrowers’ heavy debt burden, especially when that debt is at variable rates or can be refinanced at no cost. But they may also slow down the necessary balance sheet repair.
Finally, Borio returns to the impact on zombie companies, whose number has risen as interest rates have fallen. Not only are these companies reducing productivity and economic growth in their own right, they are draining resources from the more productive new economy. If interest rates were set by market forces, zombies would fail and investment would flow to those companies that were inherently more profitable. Inevitably the author qualifies this observation:-
Now, the relationship could be purely coincidental. Possible factors, unrelated to interest rates as such, might help explain the observed relationship. One other possibility is reverse causality: weaker profitability, as productivity and economic activity decline in the aggregate, would tend to induce central banks to ease policy and reduce interest rates.
Source: Banerjee and Hoffmann, BIS
Among the conclusions reached by the Central Bankers’ bank is that the full impact and repercussions of persistently low rates may not have been entirely anticipated – an admission that QE has been an experiment, the outcome of which remains unclear.
Conclusions and Investment Opportunities
These two articles give some indication of the thinking of Central Bankers globally. They suggest that the rise in bond yields and subsequent fall in equity markets was anticipated and will be tolerated, perhaps for longer than the market anticipates. They also suggest that Central Banks will attempt to use macro-prudential policies more extensively in future, to ensure that speculative investment in the less productive areas of the economy does not crowd out investment in the more productive and productivity-enhancing sectors. I see this policy shift taking the shape of credit controls and increases in capital requirements for certain forms of collateralised lending.
Whether notionally independent Central Banks will be able to achieve these aims in the face of pro-cyclical political pressure remains to be seen. A protracted period of readjustment is likely. A stock market crash will be met with liquidity and short term respite but the world’s leading Central Banks need to shrink their balance sheets and normalise interest rates. We have a long way to go. Well managed profitable companies, especially if they are not saddled with debt, will still provide opportunities, but stock indices may be on a sideways trajectory for several years while bond yields follow the direction of their respective Central Banks official rates.
Macro Letter – No 93 – 23-03-2018
Stocks for the Long Run but not the short
The first part of the title of this Macro Letter is borrowed from an excellent book originally written in 1994. Among several observations made by the author, Jeremy Siegel, was the idea that stocks would at least keep pace with GDP growth or even exceed it at the national level. The data, which went back to the 19th Century, showed that stocks also outperformed bonds in the long-run. The price one has to pay for that outperformance is higher volatility than for bonds and occasional, possibly protracted, periods of under-performance or, if your portfolio is concentrated, the risk of default. This is not to say that bonds are exempt from default risk, notwithstanding the term ‘risk free rate’ which we associate with many government obligations. A diversified portfolio of stocks (and bonds) has been seen as the ideal investment approach ever since Markowitz promulgated the concept of modern portfolio theory.
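Markowitz’s insight can be illustrated with a minimal two-asset sketch; the volatilities and correlation below are invented for illustration, not market estimates:

```python
# Two-asset diversification sketch in the spirit of Markowitz.
# Volatilities and correlation are illustrative assumptions, not market data.
import math

def portfolio_vol(w1, vol1, vol2, corr):
    """Annualised volatility of a two-asset portfolio with weights w1 and 1 - w1."""
    w2 = 1 - w1
    variance = (w1 * vol1) ** 2 + (w2 * vol2) ** 2 + 2 * w1 * w2 * vol1 * vol2 * corr
    return math.sqrt(variance)

stocks, bonds = 0.16, 0.06          # assumed annual volatilities
blended = portfolio_vol(0.6, stocks, bonds, corr=0.1)
print(round(blended, 4))  # ~0.1013
```

With a correlation of only 0.1, the blended volatility (about 10.1%) comes in below the 12% weighted average of the two assets’ volatilities – the diversification benefit Markowitz formalised.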
Today, passive index tracking funds have swallowed a massive percentage of all the investment which flows into the stock market. Why? Because robust empirical data shows that it is almost impossible for active portfolio managers to consistently outperform their benchmark index in the long run once their higher fees have been factored in.
An interesting way of showing how indexation has propelled the stock market higher recently, regardless of valuation, is this chart from Ben Hunt at Epsilon Theory – Three Body Problem. He uses it to show how QE, as a factor, has trumped everything in its wake. I’ll allow Ben to explain:-
Here’s the impact of all that gravity on the Quality-of-Companies derivative investment strategy.
The green line below is the S&P 500 index. The white line below is a Quality Index sponsored by Deutsche Bank. They look at 1,000 global large cap companies and evaluate them for return on equity, return on invested capital, and accounting accruals … quantifiable proxies for the most common ways that investors think about quality. Because the goal is to isolate the Quality factor, the index is long in equal amounts the top 20% of measured companies and short the bottom 20% (so market neutral), and has equal amounts invested long and short in the component sectors of the market (so sector neutral). The chart begins on March 9, 2009, when the Fed launched its first QE program.
Source: Bloomberg, Deutsche Bank
Over the past eight and a half years, Quality has been absolutely useless as an investment derivative. You’ve made a grand total of not quite 3% on your investment, while the S&P 500 is up almost 300%.
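The construction Ben Hunt describes – long the top quintile, short the bottom quintile, balanced within sectors – can be sketched as follows. This is a hedged illustration of the general technique, not Deutsche Bank’s actual methodology; the names, sectors and quality scores are invented:

```python
# Sketch of a long/short 'quality factor' book of the kind described above.
# Scores and sector labels are invented; real indices use audited fundamentals.
from collections import defaultdict

def quality_book(companies):
    """companies: list of (name, sector, quality_score).
    Long the top 20% and short the bottom 20% within each sector,
    so the book is both market-neutral and sector-neutral."""
    by_sector = defaultdict(list)
    for name, sector, score in companies:
        by_sector[sector].append((score, name))
    longs, shorts = [], []
    for members in by_sector.values():
        members.sort(reverse=True)          # best quality first
        k = max(1, len(members) // 5)       # top/bottom quintile
        longs += [name for _, name in members[:k]]
        shorts += [name for _, name in members[-k:]]
    return longs, shorts

sample = [("A", "tech", 0.9), ("B", "tech", 0.7), ("C", "tech", 0.5),
          ("D", "tech", 0.3), ("E", "tech", 0.1),
          ("F", "banks", 0.8), ("G", "banks", 0.6), ("H", "banks", 0.4),
          ("I", "banks", 0.2), ("J", "banks", 0.0)]
longs, shorts = quality_book(sample)
print(longs, shorts)  # ['A', 'F'] ['E', 'J']
```

Because the long and short legs are of equal size within each sector, the book earns only the spread between high- and low-quality names – which, per the chart, has been close to nothing since 2009.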
Long term, there are two strategies which have been shown to consistently improve risk-adjusted returns from the stock market: momentum (by which I mean trend following) and value (I refer you to Graham and Dodd). Last month the Managed Futures community, consisting primarily of momentum-based strategies, had its worst month for 17 years. Value, as the chart above declares, has been out of favour since the great recession at the very least. Indiscriminate momentum has been the star performer over the same period. The chart below uses a log scale and is adjusted for inflation:-
Source: Advisor Perspectives
At the current level we are certainly sucking on ether in terms of the variance from trend. If the driver has been QE and QQE, then the experiment has been unprecedented; a policy mistake is almost inevitable as Central Banks endeavour to unwind their egregious largesse.
My good friend, a former head of bond trading at Bankers Trust, wrote a recent essay on the subject of Federal Reserve policy in the new monetary era. He has kindly consented to allow me to quote some of his poignant observations. He starts by zooming out – the emphasis is mine:-
Recent debates regarding future monetary policy seem to focus on a degree of micro-economic precision no longer reliably available from the monthly data. Arguments about minor changes in the yield curve or how many tightening moves will occur this year risk ignoring the dramatic adjustments in all major economic policies of the United States, not to mention the plausible array of international responses…
…for the first time since the demise of Bear Stearns, et al; global sovereign bond markets will have to seek out a new assemblage of price-sensitive buyers…
Given that QE was a systematic purchase programme devoid of any judgement about relative or absolute value, the return of the price-sensitive buyer is an important distinction. The author goes on to question how one can hope to model the current policy mix:-
…There is no confident means of modeling the interaction of residual QE, tax reform, fiscal pump-priming, and now aggressive tariffs. This Mnuchin concoction is designed to generate growth exceeding 3%. If successful, the Fed’s inflation goal will finally be breached in a meaningful way…
…Classical economists will argue that higher global tariffs are contractionary; threatening the recession that boosts adrenaline levels among the passionate yield curve flattening crowd. But they are also inflationary as they reduce global productivity and bolster input prices…
Contractionary and inflationary: in other words, stagflationary. I wonder whether the current bevy of dovish central bankers will ever switch their focus to price stability at the risk of destroying growth – and the inevitable collapse in employment that would signify.
Hot on the heels of Wednesday’s rate hike, the author (who wrote the essay last week) goes on to discuss the market fixation with 25bp rate increases. An adage from my early days in the market was, ‘Rates go up by the lift and down by the stairs.’ There is no reason why the Fed shouldn’t be more pre-emptive, except for the damage it might do to their reputation if catastrophe (read recession) ensues. A glance at the 30yr T-Bond chart shows 3.25% as a level of critical support. Pointing to the dwindling foreign currency reserves of other central banks as the effect of tariffs reduces their trade surplus with the US, not to mention the deficit funding needs of the current administration, Allan concludes:-
…Powell will hopefully resort to his own roots as a pragmatic investment banker rather than try to retool Yellenism. He will have to be very creative to avoid abrupt shifts in liquidity preference. I strongly advise a very open mind on Powell monetary policy. From current levels, a substantial steepening of the yield curve is far more likely than material flattening, as all fiscal indicators point toward market-led higher bond yields.
What we witnessed in the stock market during February was a wake-up call. QE is being reversed in the US and what went up – stocks, bonds and real estate – is bound to come back down. Over the next decade it is unlikely that stocks can deliver the capital appreciation we have witnessed during the previous 10 years.
Whilst global stock market correlations have declined of late, they remain high (see the chart below). The value-based approach – which, as the Deutsche Bank index shows, has underwhelmed consistently for the past decade – may now offer a defensive alternative to exiting the stock market completely. This does not have to be Long/Short or Equity Market Neutral: one can still find good stocks even when overall market sentiment is dire.
Source: Charles Schwab, Factset
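The correlation such charts track is, at bottom, the Pearson correlation of index returns. As a minimal sketch – the two return series below are hypothetical, purely for illustration, not taken from the chart above:

```python
# Minimal sketch of pairwise return correlation: Pearson correlation
# of two hypothetical daily return series (in percent).
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

# Two hypothetical index return series that mostly move together.
us = [0.5, -0.2, 0.8, -1.0, 0.3, 0.6, -0.4, 0.1]
eu = [0.4, -0.1, 0.7, -0.9, 0.2, 0.5, -0.5, 0.2]
print(round(pearson(us, eu), 2))  # close to 1.0: highly correlated
```

A correlation this close to one is what makes diversification across equity markets alone so ineffective in a broad sell-off.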
For momentum investors the first problem with stocks is their relatively high correlation. A momentum-based strategy may help if there is a dramatic sell-off, but if markets move sideways these strategies are liable to haemorrhage via a steady sequence of false signals, selling at the nadir of the trading range and buying at the zenith. Value strategies generally fare better in this environment, purchasing the undervalued and selling the overvalued.
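The whipsaw problem is easy to demonstrate. The sketch below runs a simple moving-average crossover – a stand-in for trend-following generally, not any specific fund’s model – over a hypothetical range-bound price series, and generates a string of alternating signals while the price goes nowhere:

```python
# Illustrative sketch: a fast/slow moving-average crossover rule applied
# to a hypothetical sideways market, showing how trend-following can
# whipsaw through false signals in a trading range.

def sma(prices, n):
    """Trailing n-period simple moving average at each point."""
    return [sum(prices[i - n + 1:i + 1]) / n for i in range(n - 1, len(prices))]

def crossover_signals(prices, fast=3, slow=6):
    """Emit ('buy'/'sell', index) each time the fast SMA crosses the slow SMA."""
    f, s = sma(prices, fast), sma(prices, slow)
    offset = slow - fast  # align the two averages on the same price bar
    signals = []
    for i in range(1, len(s)):
        prev = f[i - 1 + offset] - s[i - 1]
        curr = f[i + offset] - s[i]
        if prev <= 0 < curr:
            signals.append(("buy", i))
        elif prev >= 0 > curr:
            signals.append(("sell", i))
    return signals

# A listless market: price oscillates around 100 and ends where it began.
sideways = [100, 102, 104, 102, 100, 98, 100, 102, 104, 102, 100, 98,
            100, 102, 104, 102, 100, 98, 100, 102]
print(crossover_signals(sideways))
# [('buy', 3), ('sell', 6), ('buy', 9), ('sell', 12)]
```

Four round-trip trades, each buying near the top of the range and selling near the bottom, for zero net price movement – the haemorrhage described above.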
The table below from Star Capital assesses stock indices using a range of metrics; it is sorted by the 10yr CAPE ratio:-
Source: Star Capital
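For readers unfamiliar with the metric that orders the table: the 10yr CAPE (cyclically adjusted price-to-earnings) ratio divides the current index level by the ten-year average of inflation-adjusted earnings. A minimal sketch of the calculation – the earnings and CPI figures below are hypothetical, not Star Capital’s data:

```python
# Minimal sketch of the 10yr CAPE calculation: current price divided by
# the 10-year average of earnings deflated into today's money.
# All figures below are hypothetical, purely for illustration.

def cape(price, earnings, cpi):
    """price: current index level.
    earnings, cpi: ten annual observations, oldest first;
    cpi[-1] (the latest reading) is used as the deflation base."""
    real = [e * cpi[-1] / c for e, c in zip(earnings, cpi)]
    return price / (sum(real) / len(real))

earnings = [60, 62, 58, 65, 70, 72, 68, 75, 80, 82]   # index earnings
cpi      = [90, 92, 93, 95, 96, 98, 99, 101, 103, 105]  # price index
print(round(cape(2800, earnings, cpi), 1))
```

Averaging a decade of real earnings smooths out the business cycle, which is why CAPE is preferred to a single-year P/E for the long-horizon comparisons in the table.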
Of course there are weaknesses in using these methodologies, even at the index level. The valuation methods applied by Obermatt in the table below may be of greater benefit to the value-oriented investor. These are their Top 10 stocks from the S&P 500 index by value; they also assess each stock on the basis of growth and safety, creating a composite ‘combined’ evaluation:-
I was asked this week why I am still not bearish on the stock market. The simple answer is because the market has yet to turn. ‘The market can remain irrational longer than I can remain solvent,’ is one of Keynes’s more enduring observations. Fundamental valuations suggest that stocks will underperform over the next decade because they are expensive today. This implies that a bear market may be nigh, but it does not guarantee it. Using a very long-term moving average, one might not have exited the stock market since the 1980s; every bear market since then has been a mere corrective wave.
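The long-term moving-average filter alluded to above can be sketched as follows. The rule – stay invested while the index closes above its trailing average, exit when it closes below – is shown here with a deliberately short window on a hypothetical series; in practice the window would span several years:

```python
# Hedged sketch of a long-term moving-average exit rule: remain invested
# while the index closes at or above its trailing n-period average.
# n=5 and the price series are toy values purely for illustration;
# the text has a multi-year window in mind.

def invested_flags(closes, n=5):
    """True where the close sits on or above its trailing n-period average."""
    flags = []
    for i in range(n - 1, len(closes)):
        avg = sum(closes[i - n + 1:i + 1]) / n
        flags.append(closes[i] >= avg)
    return flags

# A rising series with one shallow corrective dip: the filter never exits.
closes = [100, 103, 106, 109, 112, 110, 113, 117, 121, 125]
print(invested_flags(closes))  # all True
```

With a sufficiently long window, every post-1980s correction looks like the shallow dip above: the close never falls through the average, so the rule keeps you in the market.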
The amount of political capital tied up in the stock market is unparalleled. In a world of QE, fiat currencies, budget deficits and big government, it seems foolhardy to bite the hand that feeds. Stocks may well suffer a sharp and substantial correction. Even if they don’t plummet like a stone, they are likely to deliver underwhelming returns over the next decade, but I still believe they offer the best value in the long run. A tactical reduction in exposure may be warranted, but be prepared to wait for a protracted period gaining little or nothing from your cash. Diversify into other asset classes, but remember the degree to which the level of interest rates and liquidity may influence their prices. Unfashionable value investing remains a tempting alternative.
Linear Talk Macro Roundup for February