A less complex business is, generally speaking:
- easier to understand and govern
- more efficient
- more profitable
- more predictable
- easier to fix
- less fragile (vulnerable), with wider margins of maneuver
- more difficult to grow and expand
This not only sounds like common sense; it has also been demonstrated scientifically over the past decade across a multitude of diverse fields and applications. All things being equal, ‘simpler is better’ applies, for example, to engineering design, traffic management and military strategy. Nothing new under the sun.
In this short blog post we’d like to present a few eloquent examples of how the above logic can also be verified in economics and finance. To that end we have analyzed quarterly Standardized Balance Sheet data for companies in the Dow Jones index and computed the evolution of their respective business complexities. We have verified that the following (fuzzy) rules hold:
Rule 1: Business complexity increases → stock price decreases or remains ‘flat’
Rule 2: Business complexity decreases → stock price increases
Rule 3: Business complexity is ‘flat’ → stock price remains ‘flat’
(by ‘flat’ we mean that the value fluctuates around a constant level).
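The three rules can be checked mechanically once a trend classifier is fixed. The sketch below is a minimal, hypothetical illustration: the function names, the 3% flatness tolerance and the sample series are our own assumptions, not part of the original analysis.

```python
# Hypothetical sketch: testing the three fuzzy rules on quarterly series.
# The flatness tolerance and data are illustrative assumptions.

def trend(series, flat_tol=0.03):
    """Classify a series as 'up', 'down' or 'flat' from its relative change."""
    change = (series[-1] - series[0]) / abs(series[0])
    if abs(change) <= flat_tol:
        return "flat"
    return "up" if change > 0 else "down"

def rule_held(complexity, price):
    """Return which rule (1, 2 or 3) the pair of trends matches, if any."""
    c, p = trend(complexity), trend(price)
    if c == "up" and p in ("down", "flat"):
        return 1  # complexity up -> price down or flat
    if c == "down" and p == "up":
        return 2  # complexity down -> price up
    if c == "flat" and p == "flat":
        return 3  # complexity flat -> price flat
    return None  # trends do not match any of the three rules

# Example: falling complexity with rising price matches Rule 2.
print(rule_held([48, 44, 40, 36], [60, 65, 72, 80]))  # -> 2
```

In practice one would apply such a check over a sliding mid/long-term window rather than to a whole series at once.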
Of course, these rules represent trends which may be verified over the mid to long term. Some examples:
American Express Company. In the plot below we show the daily stock price evolution over the past five years (20 quarters), stacked above the corresponding complexity evolution over the same period.
Inspecting both plots, we can verify that Rule 3 holds in the first part, then Rule 2 and finally Rule 1. In other words, as the business becomes less complex, the market rewards it with a higher stock price.
It is also interesting to analyze the evolution of quarterly Total Assets versus complexity. Here too we can verify that as a business becomes more complex – inevitable if the business is growing and expanding – there comes a point at which increasing complexity becomes detrimental. In the case in question it is clear how peak Total Assets are attained at a complexity of around 34 cbits, while at peak complexity – 48 cbits – Total Assets are lower. Higher complexity would mean a further reduction of Total Assets. It is also evident how the same Total Assets ($160 billion) are attainable at complexities of either 32 or 44 cbits.
It is interesting to note how in this case it is actually possible to estimate how much can be gained (or sacrificed) by changing business complexity, thanks to the equation reported below: Total Assets = f(business complexity).
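As a minimal sketch of what such an equation could look like, one could fit a concave quadratic to (complexity, Total Assets) pairs and read the peak off the fitted curve. The data points, the quadratic form and the variable names below are illustrative assumptions, not the authors’ actual model or figures.

```python
# Illustrative sketch of fitting Total Assets = f(business complexity).
# The data points are made up; the actual functional form is not disclosed.
import numpy as np

complexity = np.array([28, 30, 32, 34, 36, 40, 44, 48])         # cbits (hypothetical)
assets = np.array([150, 158, 163, 165, 163, 158, 150, 138])     # $ billion (hypothetical)

# Fit a quadratic: a concave parabola captures a single performance peak.
a, b, c = np.polyfit(complexity, assets, deg=2)
f = np.poly1d([a, b, c])

peak_cbits = -b / (2 * a)  # vertex of the parabola
print(f"peak at ~{peak_cbits:.1f} cbits, assets ~${f(peak_cbits):.0f}B")

# Estimated gain from reducing complexity from 44 to 36 cbits:
print(f"delta: ${f(36) - f(44):.1f}B")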
Similar logic may be applied to other Balance Sheet entries, such as Revenue, EBITDA, etc.
Apple Inc. As above, Rules 1 and 2 hold during most of the examined period.
When it comes to Total Assets versus business complexity it is not possible to build an equation, but it is very clear how peak complexity – over 55 cbits – corresponds to Total Assets that are significantly lower (over 40% less) than those at, say, a complexity of 45 cbits. Evidently, there may be many reasons behind a company having more or fewer assets on its books; nevertheless, peak complexity never corresponds to peak assets. This is a firm rule.
JPMorgan Chase. Here we report only the stock price versus complexity. Rules 2 and 3 hold.
Bank of America. In this case we show how Revenue versus complexity follows a similar rule – peak revenue does not correspond to peak complexity. In other words, only a simpler business (or business model) will yield peak performance, not the most complex version thereof.
Complexity – which, since 2005, can be measured for any kind of system – is a new ‘super KPI’ which combines numerous indicators into one. It is based on a scientific approach which does not resort to subjective weighting, Balanced Scorecards, or similar techniques. It has a strong systemic connotation and possesses many interesting properties. For example, being a sound, science-based metric, it is bounded. All metrics which are unbounded – for example, those which ‘measure complexity’ by counting the number of components of a product – are irrelevant, as no physical quantity can go to infinity. Our complexity metric has a lower and an upper bound. The upper bound is called critical complexity and corresponds to a state in which the dynamics of a given system are dominated by uncertainty. This means that any properly managed system should function at a safe distance from its critical complexity. The key is to know how to measure complexity and the corresponding critical complexity. The difference between the two provides a measure of how ‘healthy’ the system is. This can be done for a single company, a system of companies, portfolios, funds, or systems thereof.
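One simple way to picture this ‘healthy distance’ is as a relative margin between measured and critical complexity. The formula below is an illustrative assumption of ours, not the actual metric described in the text, which is computed differently.

```python
# Illustrative sketch of a 'health margin': relative distance between
# measured complexity and critical complexity. The formula is an
# assumption for illustration, not the actual proprietary metric.

def health_margin(complexity, critical_complexity):
    """Relative distance from critical complexity (1 = far, 0 = critical)."""
    if not 0 < complexity <= critical_complexity:
        raise ValueError("complexity must lie in (0, critical_complexity]")
    return 1 - complexity / critical_complexity

# A system running at 40 cbits with a critical complexity of 50 cbits
# retains a 20% margin before its dynamics become uncertainty-dominated.
print(round(health_margin(40, 50), 3))  # -> 0.2
```

The same margin could, in principle, be tracked over time for a company, a portfolio or a fund, with a shrinking margin flagging a system drifting toward its critical complexity.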
The bottom line is simple: today there is so much information – much of it unreliable – that it is becoming difficult to pick the right sources on which to base decisions. There is a pressing need to synthesize large amounts of data into ‘meta-indices’ that expose the salient features of a system without sacrificing data sources which may seem unimportant but which may one day prove to be critical. Complexity is one such meta-index.
Science, not opinions.