I: Introduction
Any classically trained business school student who has survived one semester of finance liturgy can complete the analytical catechism: “Two companies with identical financial histories — same revenues, margins, CAC, inventory turns, debt ratios, growth rates, the whole dog and pony show — will look [______] in ten years.” (The answer is: “The same”.) Except, at this extreme of course, no one actually believes in such a data-predictive approach but high-priest professors and a significant portion of global finance who, as Warren Buffett said at one shareholder meeting (Charlie Munger giggling impishly beside him), “[otherwise] wouldn’t have an edge over the lay people, and that doesn’t sell well.” At the local level, however, we all inherit our metrics, working for and worshiping them like pious churchgoers.
This brings us to the Great Delusion that has distorted the minds of many of my founder compatriots and anyone who makes or reviews regular business metric reports of any kind, the evidence of which is so overwhelming and seemingly obvious that one blushes to cite it, like telling a room full of well-dressed churchgoers brunching on powdered doughnuts after service that they all look like Tony Montana on a bad day. A metric is not merely a number. It is a kind of human cognitive prosthetic, an extension of our limited capacity to grasp complex realities that, like language and writing and memory, expands what can be perceived. As our eyes construct models of the world, metrics are an epistemological claim about reality, an assertion that something in the world can be known with precision, and acted upon. Metrics serve as wagers about what matters and how it can be known, but they are likely to bring us to ruin over the long term.
Let me state this (and horrify my finance professors) more emphatically: In the real world, financial statements and standard metrics — whether we’re comparing public manufacturing companies or 50-person biotech startups — are nearly worthless for predicting which of the financially identical businesses will flourish over time. In a decade, one company could be worth $500 billion more with the best talent in the industry, margins to kill for, and a headquarters so grand that a receptionist has to take a golf cart from her desk to the restroom, while the other could be liquidating its Herman Miller chairs on Facebook Marketplace and on the brink of a deadly class action lawsuit. Not one conventional textbook metric or technique of quantitative analysis, which may well have produced the same outlooks for both companies, could reliably tell us which was which. Beyond serving as a screen or a pulse check and telling us that a company is not dead or likely to succumb in the near future, we might as well consider, over the long term, traditional financial measurement a formality or, more accurately, a legalistic and political convention.
If you think I’m exaggerating about Wall Street, consider the grim actuarial tables: Of the original Fortune 500 companies published in 1955 — companies of sparkly and unassailable balance sheets and market positions — only 53 remained on the list by 2019, an 89% drop in dominance. Or consider the list of the 10 most valuable companies by market capitalization, which has been completely rewritten every few decades since 1955. These were the companies best positioned to expand upon their dominance, with the resources and mindshare to meet any changing tide and step into any challenger industry, all predicted to thrive indefinitely, yet failing those predictions with the regularity of the changing seasons. As the German folk saying goes, “We’re too soon old and too late smart.” The same could be said of our analytical methods.
The spectacle of corporate forecasting brings to mind a vision of American blood sport, our version of the Roman Empire’s Colosseum: gladiatorial fights from which managers build tables of data and experts predict winners with pompous certainty. Specifically, envision a betting parlor—we’ll call it Wall Street—where smartly dressed patrons wager fortunes on fights between gladiators—we’ll call them corporations—based on statistics of past performance.
Bettors, with the whole wealth and analytical resources of the empire, confidently make calls based on ratios of punches received versus punches thrown, arrow quiver size, and year-over-year change in muscle mass, and buy stakes in the fortunes of the gladiators, superhuman combatants who could potentially live in prime shape forever. Mystifyingly, though, the favored fighters keep collapsing dead in the ring against supposed underdogs, even as prediction models and measures get better, faster, and more comprehensive.
One is reminded of the famous Tolstoy question about Napoleon’s invasion of Russia. When Napoleon and his magnificent army of 600,000 men—the greatest military force ever assembled to that date—went to Russia, they were utterly destroyed. Only 30,000 returned. Tolstoy kept asking: Why? He finally concluded that the models of reality the French relied on — an army seemingly impervious to the entire world — did not properly consider what would happen in a Russian winter. History is filled with these cautionary tales, where extremely smart people make extremely dumb mistakes because they’re using the wrong models. We must ask why.
Any rational gambler, whether on boxers or on armies venturing into Russian winters, would conclude not that their statistical models just need refining, but that they’re playing an entirely different game than they imagined, betting on yesterday’s winners instead of “judging the team, not the scoreboard,” as my little league baseball coach would encourage us after a loss.
Pick any industry and you’ll encounter thousands of players who succeeded for a time based on what they measured, collecting hundreds of billions in investment support along the way, only to collapse. Let’s not kid ourselves. This is clear evidence of the systematic failure of the best and brightest and our most sophisticated financial tools to predict any long-term winners, and beyond Wall Street it corrupts analytical thinking from ice cream stand operators to film studios to candle makers.
The problem isn’t that accountants or $200,000 MBAs are stupid, just as quantum mechanics and Ray Kroc didn’t prove Isaac Newton and local diner owners, respectively, were idiots. Wall Street brims with intelligence and Harvard Business School (mostly) doesn’t admit fools. As Keynes noted, there’s a difference between being wrong and being irrational. You can be wrong while following a perfectly rational process if your basic premises are flawed.
Like judging a book by weighing it, the problem is epistemological, a radical misunderstanding of what kind of thing a business actually is. When it comes to understanding businesses — outside of the static and unchanging (laundromats, pay-day lenders, railroads, and Coca-Cola distribution come to mind) — we’re committing what logicians call a category error.
Categorically speaking, we treat businesses as if they are complicated mechanical systems when they are, in fact, complex adaptive ones. A Swiss watch is complicated: thousands of precision parts in fixed relationships. If something breaks, you replace it, order is restored, and you can break the whole thing down into a simple schematic to produce it in bulk. A living organism (except perhaps the options trader) is complex. It learns, adapts, and evolves in ways that emerge from countless interactions no single measurement could capture. The difference is of kind, not degree, which renders conventional financial tools not imprecise, but irrelevant over the long run, as is the daily body temperature of a puppy to its rascally adult temperament.
II: The Seven Deadly Delusions: Why Our Financial Paradigm Persists
The persistence of this failed paradigm presents an intriguing puzzle. How does a system so demonstrably inadequate maintain its hegemony? It’s instructive to understand the forces that originated and perpetuate the standard business metrical system so that we can avoid them and come up with a better one. The paradigm’s origins are seven-fold.
First, at the high level, there’s “physics envy,” the complex that drives economists to dress up their field in crisp equations and tidy predictions the way physicists, chemists, and mathematicians do. These elaborate mathematical models win Nobel prizes, and, perhaps more importantly, fat salaries and consulting contracts, but they also fail on contact with the real world. The Efficient Market Hypothesis confidently declared that markets reflect all available information, blind to the possibility of market bubbles and crashes or other mass delusions (or that rare individuals had significantly better insight than the masses). Black-Scholes gave a nice formula for pricing options, which worked pretty well in the short term right up until the Nobel laureates themselves presided over the Long-Term Capital Management disaster. Milton Friedman’s shareholder maximization theories led to short-termism, financial crises, stagnant wages, and ballooning wealth and income inequalities that have resulted in massive political and social upheaval and will continue to do so. Modern Portfolio Theory is a blatant rejection of intrinsic “understanding” in any real sense and has been outperformed regularly by an unending list of great entrepreneurs and fund managers to date. We could go on.
These models are like a well-dressed corpse at an open-casket funeral, maintaining the illusion of life while being fundamentally disconnected from reality. They persist because they offer the comfort of certainty, sound plausible, serve as analytical and research fodder for academic careers (like sociologists obsessed with looking at the world through a “What would Marx [or Darwin] think?” lens), create the illusion of expertise that pays enormously high returns, meet the confirmation bias of ideologues and investors who profit from groupthink, and are accurate within very limited and often completely made-up contexts (which is no more impressive than a stopped clock being right twice a day). The problem isn’t that these models aren’t elegant, it’s that they’re wrong.
Second, this paradigm is eminently teachable and testable. Universities need something concrete to put into textbooks and exams. You can’t build the world’s most prestigious department at Harvard unless the work looks rigorous and difficult, and you can’t build a world-class curriculum around qualitative judgements, uncertainty, and confusion about the field entirely. No, you need formulas, case studies, and frameworks that can be taught in 90-minute blocks and tested on midterms, preferably via multiple choice, and what kind of university are you if you don’t have a serious economics department? What Max Planck said of science also applies to economics and business: progress is “one funeral at a time.”
Third, the entire financial superstructure profits handsomely from this paradigm. That it leads to massive market inefficiencies and volatility does not change this. The financial industry doesn’t need accuracy, it needs activity, and beautiful mathematical models provide the logical reasoning and justification for constant activity, and transactions generate fees and revenue streams regardless of whether the activity is moving in an intelligent direction. As Benjamin Graham noted, “Wall Street people learn nothing and forget everything.” The show must go on.
The conflict of interest is fundamental to the entire enterprise. Wall Street, accounting firms, business publications, business schools, and consultants all make greater and greater sums from maintaining the illusion that business can be reduced to numbers that they alone are equipped to discover and manipulate through hard-learned methods. They’re paid for plausible-sounding reports that generate trading volume this quarter, not for more nuanced reports complicated with deeper, wider, and more uncertain or novel analysis, relevant for decades out. In finance, a bird in the hand is worth two in the bush, even if it’s the wrong bird, or destroys the whole bird population.
Fourth, markets need some common language, however imperfect. Double-entry bookkeeping wasn’t designed to predict business success but to create uniformity for creditors and prevent outright fraud; it originated to help 15th-century Venetian merchants track down who owed them what, not to predict which ventures or projects would thrive over decades. GAAP evolved in the 20th century in the same spirit, to prevent blatant fraud, not to identify more qualitative holes or actual staying power. Yet, we’ve enshrined these standards as if they contained the keys to profound business wisdom rather than recognize them for what they are: convenient to calculate and read across industries and eras. Any smart financier could propose a language of better metrics with greater nuance, but you’d have just as much luck getting Americans to speak German or Chinese, even if it helped them express certain ideas more sharply.
Fifth is the “institutional imperative,” the overwhelming tendency for received wisdom to continue unchecked and for organizations to mindlessly imitate one another, regardless of merit. This perpetuates powerful network effects. Once everyone uses the same metrics, departing from them becomes professionally dangerous and seemingly stupid, inviting ridicule or potential disaster if a novel idea doesn’t benefit an organization as hoped in the near term or can’t be interpreted gainfully by one’s managers or peers. No CFO ever got fired for using standard accounting practices or betting the farm on the same metrics of success everyone else is using, even when they obscure economic reality, and no manager or accountant got in trouble for using industry standards. I’ve personally been in several meetings in which managers rejected more long-term or nuanced analytical frameworks in favor of greater fidelity to the existing models, and each time the conversation was quite awkward. As Keynes observed, “It is better for reputation to fail conventionally than to succeed unconventionally.” Career incentives make cowards of us all.
Sixth, we suffer from “precision bias”, the irrational preference for exact numbers over admitting uncertainty and fuzziness, an evolved trait, but a flawed one. It’s much easier to invest in a new machine that empirically cuts production time by 3.2% than in a reimagined customer experience that creates “more” enduring loyalty. The first fits into a spreadsheet, the second requires judgement, but, as Keynes (to reference him again) said, and as Warren Buffett often cites him, “It’s better to be roughly right than precisely wrong.”
Seventh, “quarterly capitalism” means our system rewards short-term thinking. Quarterly reporting creates a myopic focus that distorts decision-making. Executives and managers facing pressure to “make the quarter” routinely sacrifice long-term value creation for immediate results. This pressure is amplified by compensation systems tied to short-term stock performance, by shrinking CEO tenures (down to about 5 years, hardly enough to demonstrate significant evolutionary change), by routine mass layoffs, and by stagnant or low wages, which disconnect workers from long-term company benefits and goals.
These seven forces combine to create an interlocking latticework that perpetuates itself despite predictive failures, a system that increasingly — in some way, shape, or form — rewards all participants who operate with short-term goals or preservation of the system itself in mind. In this paradigm, it’s hard to imagine an environment that rewards or provides examples or structures for those who might be interested in better tools, particularly those interested in correcting the ontological problems of betting the future on backwards-looking observations, extracting meaning via comparisons between these observations, and the massive category error of mistaking a business for a completely different kind of thing than it is.
Our financial tools are like a series of snapshots of a farm. They can tell us the farm’s output day to day, year over year, and from farm to farm, but they’ve got nothing to say about how the farmer is going to deal with the inevitable eventualities of an invasive species, tariffs, dramatic shifts in consumer demand or the industry, or, if he’s lucky, how he’ll spend a windfall of excess income. Yet, we continue to bet billions on these snapshots, like the proverbial drunk searching for his keys under the streetlight rather than where he actually dropped them, simply because “that’s where the light is.”
III: Three Fundamental Errors: Why Financial Metrics Fail to Predict
The seven delusions above produce three fatal flaws that doom financial metrics to predictive failure. Understanding these errors is essential before we can construct something better, which we will do in due course.
First, financial metrics create a “snapshot fallacy”, measuring outcomes or static qualities rather than the capabilities that produced them. This is like judging someone’s health solely by their temperature and weight today while ignoring diet, family medical history, or the fact that they smoke two packs a day. The snapshot tells you nothing about the underlying trajectory. I paid for the snapshot fallacy with time and energy at “full retail price,” as I often say, several times before learning my lesson: I took equity percentages as part of consulting and advising fees from companies that were, according to my analysis and input, going to make a great deal in the next 12 months, rather than analyzing their potential to grow and adapt over the long run. I was often right about the next 12 months, but I should’ve spent my time invested in companies with greater long-term promise. I imagine I’ve lost more than $300,000 several times over.
Consider these widely-used metrics and their specific failures:
- Return on Invested Capital (ROIC): Coca-Cola showed a stellar 27% ROIC in 2007, while Amazon apparently languished at 6%. Any algebra-capable fool using this metric would have favored Coca-Cola, yet Amazon has delivered roughly 5 times greater shareholder returns since then. Why? ROIC measures capital efficiency at a moment in time but tells you nothing about a company’s ability to identify and capture new markets—what biologists would call adaptive potential.
- Gross Margin: Circuit City maintained healthier gross margins than Best Buy through 2005 (24.8% vs 23.7%). Analysts favored Circuit City based partly on this metric, completely missing Best Buy’s superior capabilities in supplier relationships, inventory management, and customer experience design. Circuit City filed for bankruptcy in 2008, reminding me of the observation often attributed to Darwin that “it is not the strongest of the species that survives, nor the most intelligent; it is the one most adaptable to change.” Circuit City had better snapshots while Best Buy had a better movie… and Amazon today has the better production company.
- Free Cash Flow: Between 2012-2014, IBM generated substantially more free cash flow than Microsoft ($12-15 billion annually vs. $8-10 billion). Purely quantitative analysis would have favored IBM, completely missing Microsoft’s superior capability to pivot toward cloud services—a capability visible only through qualitative assessment of their organizational learning architecture.
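The snapshot fallacy can be made concrete in a few lines. Here is a minimal sketch, using hypothetical figures (the function names and numbers are mine, not from any of the companies above): two firms can produce byte-identical ROIC and gross-margin snapshots while nothing in either ratio encodes adaptive capability.

```python
def roic(nopat, invested_capital):
    # Return on invested capital: after-tax operating profit / capital employed.
    return nopat / invested_capital

def gross_margin(revenue, cogs):
    # Fraction of revenue left after cost of goods sold.
    return (revenue - cogs) / revenue

# Two hypothetical firms with identical snapshots...
firm_a = {"nopat": 270, "capital": 1000, "revenue": 2000, "cogs": 1500}
firm_b = {"nopat": 270, "capital": 1000, "revenue": 2000, "cogs": 1500}

for f in (firm_a, firm_b):
    print(roic(f["nopat"], f["capital"]),        # 0.27 for both
          gross_margin(f["revenue"], f["cogs"]))  # 0.25 for both
# ...yet the ratios say nothing about which firm can identify and capture
# new markets. Identical numbers, potentially opposite fates.
```

That is the whole fallacy in miniature: the inputs to these formulas are all backwards-looking, so any two histories that happen to match produce matching scores.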
The snapshot fallacy becomes particularly treacherous when examining multi-decade periods. Consider Sears, Roebuck and Company, which maintained impressive financial metrics for nearly 20 years (1980-2000)—steady margins, solid return on equity, and a seemingly impregnable market position. Investors repeatedly pronounced the company “undervalued” based on these metrics, completely missing the organization’s underlying rigidity and failure to adapt to changing retail channels. The company that dominated American retail for a century has now disappeared, serving as a monument to the dangers of mistaking a series of favorable snapshots for an adaptable organization. As Einstein wisely noted: “Not everything that counts can be counted, and not everything that can be counted counts.”
If you want to protect yourself from this error with a simple test, ask: “If conditions in this industry changed dramatically tomorrow, would the most important metrics we’re optimizing for tell us anything meaningful about how we’d respond?” If you can’t answer confidently, you’re gambling, not investing — no matter how pristine the current financials might look.
Second, financial metrics fall prey to what economists call “Goodhart’s Law,” the regularity with which a measurement stops being a good measurement once it becomes a target. What gets measured gets managed, as they say, at the expense of the unconsidered and disregarded, which are often the things that really matter. I’ve often watched companies gut or put off R&D to hit quarterly targets, defer crucial maintenance and capex upgrades to prevent production delays and hit output targets, create customer experiences that are dead on arrival in favor of greater customer service efficiency or a capital preference for advertising spend, and repeatedly sacrifice employee development in favor of arbitrary hiring and compensation goals. Each decision boosted the next day’s or month’s numbers and destroyed company value.
Here’s how specific metrics create perverse incentives:
- Earnings Per Share (EPS): General Electric famously met or exceeded analyst EPS expectations for 75 consecutive quarters under Jack Welch—an achievement later revealed to involve accounting manipulation, unnecessary end-of-quarter transactions, and deferral of R&D investments. The pressure to maintain the streak created decisions that hollowed out GE’s long-term competitive position. As Benjamin Franklin might have warned if he wrote his almanack in 2025, “A company focused on quarterly earnings reports is like a farmer who keeps harvesting but never plants.”
- Same-Store Sales Growth: RadioShack reported healthy same-store sales growth of 2-4% from 2002-2006 while Best Buy was disrupting the electronics retail model. How? They achieved this by cutting inventory levels for forward-looking categories (computers, smartphones) while maximizing immediate revenue from high-margin accessories and batteries. The metric looked healthy while the business model was dying—like a patient with excellent blood pressure readings who happens to have terminal cancer.
- Customer Acquisition Cost (CAC): I watched a promising software company reduce their CAC from $380 to $190 over three quarters—a figure that delighted their growth-stage investors. They accomplished this by eliminating customer education programs, reducing onboarding support, and shifting acquisition to lower-quality leads. Churn predictably skyrocketed within 18 months. In a twist that would make Molière laugh, they had simply transformed an acquisition cost problem into a retention catastrophe, much like a man who “saves money” by never changing his car’s oil.
- Operating Margins: A manufacturing client proudly reported improving operating margins from 8.2% to 17% in a year. How? By deferring crucial maintenance, cutting engineer training programs, and eliminating the R&D team working on next-generation materials and tools. The margin improvement showed up immediately; the catastrophic equipment failures appeared three years later. As the great physicist Richard Feynman said after the Challenger disaster, “Nature cannot be fooled,” and neither can the fundamental economics of a business.
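The CAC story above is worth working through numerically. A minimal sketch, using the $380 and $190 CAC figures from the anecdote but with a hypothetical $50 monthly margin and hypothetical churn rates of my own choosing, under a simple geometric-lifetime model of customer value:

```python
def ltv(monthly_margin, monthly_churn):
    # Geometric-lifetime model: expected customer lifetime is 1 / churn months,
    # so lifetime value is margin per month divided by the churn rate.
    return monthly_margin / monthly_churn

# Before the "optimization": CAC $380, 2% monthly churn (hypothetical).
ltv_to_cac_before = ltv(50, 0.02) / 380   # 2500 / 380  ≈ 6.6

# After: CAC halved to $190, but lower-quality leads churn at 5% (hypothetical).
ltv_to_cac_after = ltv(50, 0.05) / 190    # 1000 / 190  ≈ 5.3

# The headline metric improved; the unit economics got worse.
print(ltv_to_cac_before > ltv_to_cac_after)  # True
```

This is Goodhart's Law in one division: CAC became the target, so CAC stopped measuring the health of acquisition, and the damage surfaced in a number nobody was watching.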
The most alarming aspect isn’t that companies occasionally make these trade-offs, but that our financial system structurally incentivizes them. If you want to test whether a company is falling into this trap, look at what they’re not measuring and reporting. The gaps often tell you more than the numbers.
Third and most fundamental is the category error I opened with, which mathematicians and engineers would describe as applying linear equations to non-linear systems. Like using algebra alone to take a space rocket into orbit, it will just never work. Complex adaptive systems exhibit properties that linear analysis cannot capture:
- Emergence: System behaviors cannot be predicted from analyzing individual components
- Non-linearity: Small changes can produce disproportionate effects
- Path dependence: History and sequence matter fundamentally
- Phase transitions: System properties can change dramatically at critical thresholds
This mathematical mismatch explains why even the most sophisticated and respected financial metrics consistently fail:
- Discounted Cash Flow (DCF) Models: In 2006, DCF valuations of BlackBerry (then Research in Motion) showed a company worth 3-4 times more than Apple’s iPhone business would eventually become. The models correctly calculated the present value of projected cash flows based on BlackBerry’s dominant market position. What they couldn’t model was Apple’s superior adaptive capability—what an ecologist might recognize as the difference between a specialist species (highly adapted to one environment) and a generalist species (adaptable to changing conditions).
- A particularly egregious version of this category error appears in the terminal value component of DCF models, the single-number summary of all future cash flows. Most such models attribute 60-80% of a company’s calculated value to cash flows beyond the explicit forecast period—essentially admitting that most of the value depends on conditions we can’t possibly predict. Yet analysts and investors treat these terminal values as meaningful, displaying a level of self-deception that would impress Freud himself.
- Customer Lifetime Value (CLV): Blockbuster’s CLV calculations showed their average customer generating $2,400 in lifetime rental value versus Netflix’s early subscription model generating only $1,800. Their models reasonably extrapolated from historical customer behavior, completely missing how digital delivery would transform the industry. CLV measures the value of a customer within a business model but says nothing about the adaptability of the model itself. This is similar to what the great naturalist Alexander von Humboldt observed about ecosystems—understanding the interrelationships between existing elements tells you nothing about how the system will respond to a novel disturbance.
- Price-to-Earnings Growth (PEG) Ratio: In 2001, Yahoo had a more attractive PEG ratio (1.5) than Google (2.3) when Google was emerging. Based on this metric, Yahoo appeared to offer better value for its growth rate. The metric captured the relationship between current earnings, price, and projected growth, but completely missed Google’s superior capability to identify and dominate adjacent markets. As Aristotle, who thought long and hard about categories, might have pointed out, they confused the essential with the accidental properties of these businesses.
- Economic Value Added (EVA): General Motors consistently showed better EVA than Toyota throughout the 1990s because their historic capital investments had been largely depreciated. This metric, designed to measure true economic profit, gave no insight into Toyota’s superior capabilities in developing new manufacturing techniques, reducing defects, and creating customer-pleasing designs. One might as well try to determine the winner of a marathon by measuring the runners’ body temperatures.
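The terminal-value point deserves arithmetic, because it is easily verified. A minimal sketch of a standard two-stage DCF with a Gordon-growth terminal value, using hypothetical but ordinary assumptions (numbers are mine, chosen only to be unremarkable): $100 of cash flow growing 5% for five explicit years, a 9% discount rate, 3% perpetual growth thereafter.

```python
def dcf_value(cash_flows, r, g):
    # Present value of the explicit forecast period.
    pv_explicit = sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))
    # Gordon-growth terminal value at the end of year n, then discounted back.
    n = len(cash_flows)
    terminal = cash_flows[-1] * (1 + g) / (r - g)
    pv_terminal = terminal / (1 + r) ** n
    return pv_explicit, pv_terminal

cfs = [100 * 1.05 ** t for t in range(5)]       # 5 years of 5% growth
pv_e, pv_t = dcf_value(cfs, r=0.09, g=0.03)
share = pv_t / (pv_e + pv_t)
print(round(share, 2))  # ~0.76: three-quarters of the "value" is terminal value
```

With nothing exotic in the inputs, roughly three-quarters of the calculated value sits beyond the forecast horizon, squarely inside the 60-80% range cited above: the model admits, in its own arithmetic, that most of what it values is unforecastable.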
The failures of this paradigm don’t mean financial analysis is worthless, but you shouldn’t mistake great customer acquisition cost, comparative inventory turnover, or even double- or triple-digit three-year profit growth for long-term success. Rather, we should venture to understand and be equipped to create, when necessary, better measures of success.
In mathematics, physics, biology and other fields, we’ve known since Poincaré, and as later reinforced by chaos theory, that complex, non-linear systems defy simple prediction. A three-body problem in physics has no analytical solution, yet we insist that we can predict the behavior of organizations with thousands of employees interacting with millions of customers amid countless competitive, material, conceptual, and regulatory variables, among others. It’s no wonder our predictions fail. As the polymath Herbert Simon noted, complex systems exhibit “bounded rationality.” Their behavior may be locally rational but globally unpredictable. It’s for this reason a politician’s predictions or promises will never reliably come true, and why we — as has been proven — will never be able to predict the weather accurately more than a couple of days in advance. We can, however, get a read on long-term climate prospects. But how?
IV: Businesses as Complex Systems
How do we act confidently in an unpredictable world? If traditional metrics fail us so consistently, what alternative do we have? The answers are revealed via abstraction, and I don’t mean abstraction as making business seem like some vague object. Quite the opposite: abstraction is about being ruthlessly precise about what fundamentally matters while having the discipline to ignore everything else. Through abstraction, we can see the fundamental properties that determine system behavior.
The history of human intellectual progress is essentially the history of successful abstraction. When Einstein determined that E=mc², he wasn’t being vague about energy and mass; he was identifying the exact relationship that mattered while relegating everything else—color, shape, position, temperature, the mood of the research assistant bringing coffee—to irrelevance. This is the core exercise that separates middling theories from revolutionary ones. It’s the art of seeing more by looking at less.
This same principle has transformed our understanding across virtually every field of human inquiry:
- Calculus emerged when Newton and Leibniz abstracted motion as continuous, rather than a sequence of discrete steps, a discovery essential for modern engineering, physics, and high school math
- The periodic table appeared when Mendeleev abstracted the chaotic variety of chemical elements into patterns based on atomic weights and recurring chemical behaviors
- Evolution became comprehensible when Darwin and Wallace abstracted countless biological variations and similarities into a single principle of natural selection
- Modern geology crystallized when scientists abstracted countless surface phenomena, from earthquakes to mountains to ocean ridges, into a unified theory of plate tectonics floating on a semi-molten mantle
- Electromagnetic theory became unified when Maxwell and Faraday abstracted a bewildering splatter painting of seemingly magical observations into the fundamental concept of the field
In each case, abstraction didn’t sacrifice precision for vagueness, it achieved a higher precision by focusing on essential relationships. The mathematician Henri Poincaré said, “Science is built with facts as a house is with stones, but a collection of facts is no more a science than a heap of stones is a house.” Abstraction turns heaps of stones into a house, and heaps of business data into actual understanding.
Standard business analysis breaks things into ever-smaller components, while abstraction tells us how the components interact and what the patterns of interaction mean. In a world obsessed with ever more granular metrics and big data, this capacity to recognize the essential relationships that determine long-term outcomes becomes increasingly valuable, something I know very well from abstracting my business to its essence — the relationship between individuals through gifts. This opened up insights from anthropology about gift-giving rituals and their universal importance across cultures, and allowed us to refine our marketing approach and pricing power, none of which conventional analysis would ever have revealed.
Abstraction has also helped me discover answers to other puzzles that conventional analysis could not touch: Why do some organizations adapt seamlessly to technological disruption while others with equal or greater resources fail? Why do certain business models create compounding returns over time while others exhaust themselves? Why do specific leadership teams succeed in multiple contexts while others fail despite their credentials?
These questions find coherent answers only through understanding the fundamental properties of complex adaptive systems — the abstracted form of all businesses — properties I’ve identified by studying various complex domains, from biological ecosystems to weather systems to markets. These aren’t merely descriptive characteristics but functional necessities for organizational longevity, just as respiration is for biological survival.
The Fundamental Properties of Thriving Systems
Now that we’ve thoroughly discredited the standard financial paradigm, I suppose I’m obligated to offer something better in its place. That’s the trouble with criticism. Once you’ve torn down the existing structure, common decency requires you to help build something more useful, and I believe the following seven qualities describe the structure that can do so. It is drawn from personal experience, from studying complex systems, and from reading about hundreds of companies while asking “what was missing?” when things go wrong or, when things go very right, “what exactly is happening here?” Let me explain each property and why it matters, and offer some practical ways to apply these ideas. As you read, consider how each property holds for a small single-celled organism and becomes more true as complexity grows, up through a human and a society; then do the same for how the element could be applied positively to increasingly complex forms of a business. You can apply this framework to any complex system you wish to improve: social circles, families, schools, nations, political systems, you name it.
1. Reality Processing Architecture: “Requisite Variety”
All successful systems need accurate information: they must see reality clearly. The first essential property comes from cyberneticist Ross Ashby’s “Law of Requisite Variety,” which states that a system’s internal model must possess at least as much variety as its environment to function effectively. For our purposes, your organization’s information processing architecture must match the complexity of its competitive landscape.
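Ashby’s law can be made concrete with a toy simulation. This is my own illustration, not Ashby’s formalism: disturbances and responses are modeled as integer “types,” and a regulator neutralizes a disturbance only if its repertoire contains the matching response. A regulator with less variety than its environment necessarily lets signals through unhandled.

```python
import random

def regulate(disturbances, repertoire):
    """Count how many disturbances the regulator can neutralize."""
    return sum(1 for d in disturbances if d in repertoire)

random.seed(0)
# An environment producing 10 distinct kinds of disturbance.
environment = [random.randrange(10) for _ in range(1000)]

rich_regulator = set(range(10))  # variety matches the environment
poor_regulator = set(range(3))   # simple, rigid repertoire

print(regulate(environment, rich_regulator) / len(environment))  # handles everything
print(regulate(environment, poor_regulator) / len(environment))  # misses most signals
```

The rich regulator handles 100% of disturbances; the poor one, roughly the fraction of environmental variety it can represent, which is the organizational point: a simple reporting structure in a complex market is mathematically guaranteed to miss signals.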
If your business operates in a complex, rapidly changing industry but your information processing systems are simple, rigid, and unchanging, you’re guaranteed to miss crucial signals. This seems obvious, but we’ve all heard warnings about executives living in bubbles and been in workplaces punctuated by hushed conversations and pessimistic group chats. Organizations distort information. A human cannot interpret the feelings of individual cells, and a single business leader cannot get a completely accurate reading of all his workers’ thoughts and all his processes’ problems. Often, it’s even worse: bad news gets consciously sanitized as it moves up the hierarchy, contradictory evidence gets filtered out, and workers stop volunteering any more observation and communication than is expected and required. The result is executives and managers playing a corporate version of the telephone game, making decisions based not on reality but on internal representations of reality that have been distorted beyond recognition.
This distortion isn’t random. It follows patterns catalogued in many a psychology textbook. Confirmation bias leads organizations to notice evidence supporting existing beliefs while filtering out contradictory data. Authority bias causes information to be distorted to please or reflect the opinions of superiors. Commitment and consistency bias makes organizations double down on failing strategies rather than seek out and acknowledge shortcomings. Structurally, traditional metrics do not account for reality processing capacity, though it’s essential for good health.
Consider Intel under Andy Grove versus Kodak in the 1990s. Grove’s famous paranoia wasn’t a personality quirk – it was a systematic approach to reality processing. Intel institutionalized what Grove called “constructive confrontation,” where employees at any level could challenge ideas regardless of hierarchy, and information flowed via specifically structured mechanisms that did not distort the message. When technological shifts threatened their memory chip business, this reality processing capacity allowed them to make the painful but necessary pivot to microprocessors. Imagine a business with eyes on its environment like a basketball team whose players endlessly watch and discuss game tape like Kobe.
Contrast this with Kodak, which actually invented digital photography in 1975 but buried the technology because it threatened their film business. Despite mounting evidence of the digital revolution’s inevitability, Kodak’s reality processing was so compromised that they continued investing in film manufacturing capacity well into the 1990s. Hierarchical approval processes, siloed product divisions, and rigid strategic planning cycles muddied reality. Their executives weren’t stupid – they were simply trapped in an organizational structure that systematically filtered out threatening information. By the time reality forced itself on their attention, it was too late.
Here’s a practical test you can apply to any organization: Find its Cassandras. In Greek mythology, Cassandra was cursed to speak true prophecies that no one would believe. Every organization has them — people who accurately identify and predict problems, but are systematically ignored. I once worked with a product designer who accurately predicted the timing and reason for the end of his company’s growth. Management did not like him but kept him around because he was damn talented. I think he was right, and so did a lot of other workers whose opinions were never invited to the table. When I evaluate a business, I try to identify these voices and assess how the organization responds to them. Are they marginalized or elevated? Ignored or engaged with? Practiced in sharing ideas, or bottled up like high-pressure release valves? Are members of the company articulate about the vision, about the reality of the company, about its operations and processes and faults and goals at different levels? The answer tells me more about adaptive capacity than any financial metric.
Another revealing technique: track how the organization responds to small failures. Does it hide them, explain them away, or share them and leverage them as learning opportunities? A working immune system doesn’t deny infections; it actively creates new tools to seek them out, responds to them immediately, and learns from each exposure. Your people, and their ability to convey what they observe and to build better tools for keeping watch, are your organization’s nervous and immune systems combined, or the lack thereof.
To evaluate this property, I look for specific structural elements:
- First, signal amplification mechanisms: Does the organization systematically elevate weak but important signals? Intel’s “constructive confrontation” ensured that uncomfortable information reached decision makers; Kodak’s hierarchy dampened it. Create regular forums exclusively for surfacing problems and threats. Reward accurate assessment over positive assessment in performance reviews. Regularly engage with dissatisfied customers, not just satisfied ones. Conduct pre-mortems where teams imagine a future failure and work backward to identify current warning signs.
- Second, information flow topology: A big mathematical term for a simple question: how does information move through the organization? Network theorists have shown that centralized networks process routine information efficiently but fail catastrophically with novel or irregular information. Distributed networks with appropriate connection patterns handle diverse signals better.
- Third, feedback cycle time: How quickly does information about results flow back to decision makers? Amazon’s real-time dashboards create immediate feedback, while traditional retailers often wait months for quarterly results.
- Fourth, filter design: What algorithms (human or digital) determine which information reaches decision makers? Grove’s practice of directly engaging with engineers two levels down created a bypass mechanism around management filters. Establish anonymous channels that let information bypass hierarchical filters.
2. Energy Capture and Cycling: Compounding Capabilities
All systems require energy to function and resist entropy. Sunlight, food, capital, commitment, and passion are all energetic inputs. The difference between thriving and failing systems isn’t just how much energy they capture, but how they transform outputs back into enhanced inputs. This property comes straight from the second law of thermodynamics — closed systems inevitably increase in entropy (disorder), and the only way to maintain order is to process energy with increasing efficacy. The idea was adapted to ecology by Howard Odum’s work on energy transformation in living systems. Odum demonstrated that successful ecosystems create hierarchical energy transformation processes where outputs from one level become enhanced inputs for others, creating “emergy”: embodied energy that increases rather than decreases in utility. It is how molecules become cells with DNA, how cells become complex creatures, and how those creatures create ever more impressive organizations.
I distinguish between extractive systems and regenerative systems. Extractive systems consume resources until they’re depleted. Regenerative systems create virtuous cycles where outputs become enhanced inputs, like a garden where plant waste becomes compost that enriches the soil. This regenerative capacity seldom appears in financial statements, yet it fundamentally determines long-term viability.
Ancient Athens provides a perfect historical example of a regenerative system. During its golden age, Athens created remarkably efficient energy cycling: naval supremacy generated trade wealth, which funded education and public works, which produced innovations improving technology, which enhanced trade advantages. This was a self-reinforcing cycle that transformed a minor city-state into a civilization-defining power.
Sparta, meanwhile, operated extractively. Its military power required helot slave labor, which required constant suppression, which drained resources from other developments, which increased dependence on exploitation: a degenerative cycle that appeared strong but collapsed when any component failed.
In the business world, look at Netflix versus Blockbuster. Netflix designed a system where every customer interaction made their service better for all users – ratings improved recommendations, viewing habits informed content acquisition and production, and even DVD returns provided data that improved logistics. Every interaction with a customer made Netflix stronger. Blockbuster, meanwhile, operated on an extractive model – late fees, which accounted for a significant portion of their profits, actively antagonized customers. None of this could be deduced from standard metrics, yet each customer interaction depleted rather than enhanced the relationship.
Similarly, each user interaction with Adobe’s Creative Cloud generates data about features, workflows, and bugs through telemetry explicitly designed to improve future versions. Toyota’s production system doesn’t just build cars; it creates knowledge that compounds across production cycles.
To evaluate this property systematically, consider five specific energy flows:
- First, knowledge transformation: How effectively does the organization convert experience into reusable knowledge? Do similar challenges become progressively easier over time? How quickly do solutions spread from one area to others? The pharmaceutical industry provides a stark contrast. Traditional pharma companies typically treat each drug development program as a discrete project. Knowledge transfer happens informally, if at all, or only at the very top. Failed candidates are seen as sunk costs rather than learning opportunities.
- Meanwhile, upstarts and more tech-forward companies maintain searchable databases of all compound tests, regardless of outcome. They employ machine learning systems to identify patterns across seemingly unrelated drug candidates. They conduct formal post-mortems on failed candidates designed specifically to extract reusable insights. Their development processes include explicit stages for applying cross-program learning. These mechanisms create dramatically declining resource requirements per drug candidate as knowledge compounds.
- Second, customer relationship transformation: Does each customer interaction strengthen or weaken the relationship? Does acquiring the thousandth customer cost as much as acquiring the hundredth, or does one transaction encourage the next several? The banking industry demonstrates the differential clearly. Traditional banks interact with customers through discrete transactions, such as deposits, withdrawals, and loans, with minimal mechanisms for strengthening relationships. It’s like the company just doesn’t give a damn.
- USAA, on the other hand, has built explicit relationship cycling mechanisms. They map complete customer lifecycles and proactively offer relevant services at life transition points. They capture and use detailed customer data across all services to personalize interactions. They design each customer touchpoint to deepen the relationship, creating a bond rather than a series of transactions. These mechanisms create customer profitability that increases with relationship duration, such that a ten-year customer generates about 3.5 times the annual profit of a two-year customer.
- Third, capital transformation: Does invested capital create capabilities that compound, or does it merely deplete over time? I’ve observed several consumer goods companies commit seven to eight figures to advertising and marketing and less than a quarter of that to new product research and development, and I’ve seen similar companies spend a bare minimum on advertising, allocating almost all of their free capital to research and development. The companies that spent more on product always compounded growth and revenues faster and more reliably than the big marketing spenders. The big marketing spenders were also much more likely to be in a panic.
- Fourth, network transformation: Does the organization’s relationship with the world become more or less valuable over time? Traditional retailers squeezed suppliers for margin, creating increasingly adversarial relationships. Walmart, despite its tough negotiations, established information sharing systems with suppliers that reduced inventory costs for both parties, compounding advantages over time.
- Fifth, team transformation: Does each new employee create incremental value for your company? I’ve seen companies hire as fast as they possibly could to keep up with growth, and watched those same companies spend years churning out terrible products, losing key customers, and burning through the very people they rushed to hire. Companies that do not hire judiciously and cannot inculcate new hires in the culture of the company can guarantee a free-fall of passion, ambition, and understanding, all of which are compounding qualities.
The empirical evidence is overwhelming: systems that convert outputs into enhanced inputs outperform extractive ones over meaningful timescales. This is why Amazon could afford to accept years of losses while Borders and Barnes & Noble couldn’t – Amazon’s flywheel of customer reviews, data, and recommendations was regenerative, while traditional booksellers faced increasing customer acquisition costs in a mostly extractive model.
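The divergence between regenerative and extractive systems can be sketched numerically. This is a toy model with invented parameters (a 20% output rate, a 50% reinvestment rate, a 3% depletion rate), not data from any company; it only shows the directional logic of outputs becoming enhanced inputs.

```python
# Toy model of energy cycling (invented parameters, not real company data).
# A regenerative firm reinvests part of each period's output into capability,
# so outputs become enhanced inputs; an extractive firm reinvests nothing
# and its capability base slowly erodes.
def run(reinvest_rate, depletion_rate, periods=10, capability=100.0):
    outputs = []
    for _ in range(periods):
        output = capability * 0.2              # output proportional to capability
        capability += output * reinvest_rate   # outputs cycled back as inputs
        capability *= 1 - depletion_rate       # wear, churn, eroded relationships
        outputs.append(output)
    return outputs

regenerative = run(reinvest_rate=0.5, depletion_rate=0.0)
extractive = run(reinvest_rate=0.0, depletion_rate=0.03)

print(round(regenerative[-1], 1))  # output grows period over period
print(round(extractive[-1], 1))    # output shrinks period over period
```

Both firms start from identical positions and look identical on a first-period income statement; only the cycling structure, which no standard metric captures, determines which curve they are on.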
Create systems that explicitly capture and reapply learning across the organization. Design customer interactions to generate useful data beyond the immediate transaction. Structure relationships to become more valuable with duration rather than depleting over time. Invest in capabilities that compound rather than depreciate. Establish feedback loops that convert experience into improved processes.
There is a trade-off, of course. Regenerative cycling typically requires greater upfront investment and longer-term perspectives throughout an organization. Companies optimizing for immediate returns naturally gravitate toward extractive approaches, creating the illusion of efficiency while slowly depleting their fundamental capabilities, unless we’re talking too-big-to-fail organizations, which goes without saying.
3. Evolutionary Learning Architecture
Nature doesn’t design—it evolves through variation, selection, and replication. This simple algorithm has solved more complex problems than all human engineering combined (including producing the humans who engineer in the first place), yet most organizations rely on centralized planning, despite overwhelming evidence of its limitations and inferiority in complex environments.
Complexity scientist Stuart Kauffman demonstrated mathematically why evolutionary processes outperform designed solutions in searching “the adjacent possible,” the set of all potential configurations reachable from the current state. The space of possible configurations is astronomically large, and no centralized algorithm can effectively search it. Evolutionary algorithms, however, can efficiently discover good solutions through parallel variation and selection.
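A minimal variation–selection–replication loop makes the point concrete. This sketch is my own illustration, not Kauffman’s model: candidates are bit strings, fitness simply counts ones, and each generation explores the adjacent possible through single-bit mutations.

```python
import random

random.seed(1)

def fitness(candidate):
    # Toy fitness: count the ones (a stand-in for "what reality rewards").
    return sum(candidate)

def mutate(candidate):
    # Variation: copy a survivor and flip one random bit.
    child = candidate[:]
    i = random.randrange(len(child))
    child[i] ^= 1
    return child

# Start with a population of 30 all-zero candidates of length 20.
population = [[0] * 20 for _ in range(30)]

for generation in range(100):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]                     # selection: cull the bottom two-thirds
    population = [mutate(random.choice(survivors))  # replication with variation
                  for _ in range(30)]

best = max(population, key=fitness)
print(fitness(best))  # parallel search climbs toward the optimum of 20
```

No individual candidate "knows" the goal, and no planner scripts the path; the population discovers it anyway, which is the argument against betting everything on a single designed strategy.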
The British Royal Navy’s rise to dominance in the 18th century is an instructive case study in the superiority of evolutionary learning. Unlike more rigid naval hierarchies, the British system created explicit mechanisms for variation (captains had substantial tactical discretion), selection (combat outcomes determined which approaches spread), and replication (the Admiralty systematically circulated meticulous battle reports throughout the fleet).
British naval capabilities evolved far more rapidly than those of their French and Spanish competitors, who relied primarily on centralized doctrine and planning. Lord Nelson’s revolutionary tactics at Trafalgar emerged through this evolutionary system rather than from genius in isolation. Similarly, during the Islamic Golden Age, while Europe languished in the Dark Ages, medieval Baghdad led global scientific advancement through the House of Wisdom by employing a systematically evolutionary approach: scholars translated and tested ideas from multiple civilizations, rigorously evaluated results against reality rather than doctrine, and systematically disseminated proven knowledge. Europe wouldn’t match their breakthroughs in mathematics, astronomy, medicine, and chemistry for centuries.
Why do organizations resist this approach? The psychological barriers are formidable:
Arrogance, generally (also known as overconfidence bias), leads us to the illusion that our rational minds can design optimal solutions, despite military strategist Helmuth von Moltke’s accurate assessment that “No plan survives contact with the enemy,” or, as Mike Tyson put it more colorfully, “Everyone has a plan until they get punched in the mouth.” Furthermore, commitment bias leads us to double down on failing strategies rather than abandon them. Authority bias makes us defer to hierarchical decision-makers, experts, or confident individuals regardless of evidence. And the bias for self-preservation prevents people from taking atypical action.
These biases don’t just influence individuals; they become institutionalized in organizational processes that systematically prevent evolutionary learning. The difference between Steve Ballmer’s Zune-era Microsoft and Satya Nadella’s Microsoft illustrates this beautifully. Ballmer’s Microsoft relied on centralized planning and top-down execution and turned out embarrassing, extractive software and hardware for years. Nadella instituted a genuine evolutionary architecture – rapid experimentation, willingness to kill failing initiatives (like the Nokia acquisition), and a culture that rewards learning rather than being right. The company’s shift to cloud services wasn’t a single brilliant insight but an evolutionary process of testing, learning, and scaling what worked. Standard financial analysis would have shown Microsoft’s balance sheet strength throughout the Ballmer period, but it could not have revealed the missing evolutionary architecture, nor the one that saved the company under Nadella.
To strengthen evolutionary learning, I look for specific mechanisms supporting each component of the evolutionary algorithm:
- First, variation: What mechanisms create potential adaptations? Organizations with strong evolutionary architecture maintain high experiment density relative to their size. They establish protected spaces for experimentation with minimal approval requirements (these could be in sandboxes, real-world implementations, or through research, analysis, and synthesis). They generate funding mechanisms for small-scale tests that bypass normal budget processes. They encourage diverse approaches to similar problems rather than premature standardization. And they build education programs and require education as part of the job for all full-time employees.
- Second, selection pressure: How does the organization determine which variations succeed? Strong evolutionary systems develop clear, reality-based criteria for evaluating experiments. They create rapid feedback mechanisms that provide unambiguous performance data, ensure selection happens through objective criteria and reasoned explanation rather than politics or hierarchy, and incentivize leaders to draw solutions from a wide distribution of sources. They explicitly reject rigid politics and instantiate fluid hierarchical structures where possible, on a per-project or per-term basis.
- Third, replication: How effectively does the organization copy successful adaptations? Organizations with poor evolutionary architecture frequently reinvent solutions or struggle to win the same game twice because they lack mechanisms for spreading or memorizing successful innovations and incentivize, if anything, personal brilliance rather than collective sharing. Strong systems build explicit knowledge-sharing systems across organizational boundaries and require knowledge dissemination to and from all full-time employees, create incentives for adopting proven innovations from other units and spreading innovations to other units, and establish formal mechanisms for codifying and spreading successful adaptations.
Most organizations learn accidentally, if at all. The executive who “knows exactly what to do” commands more respect than one who systematically tests multiple approaches. Yet nature – which has been solving complex problems far longer than Harvard Business School has existed – never relies on omniscient designers. It maintains a portfolio of options, culls failures ruthlessly, and doubles down on what works. The business that can overcome its ego attachment to visionary leadership and instead embrace systematic evolutionary learning has discovered something more valuable than any single brilliant strategy. If you’re betting everything on prediction in unpredictable environments rather than real-world and well-modeled sandboxed experimentation, you may succeed once or even ten times, but evolutionary gaps compound and at the end of the day it didn’t mean anything to be the biggest dinosaur.
4. Identity Continuity Through Change
The paradox of successful adaptation is that systems must change continuously while maintaining coherent identity. A tiny rodent-like beast evolving over time is identifiably very similar to its peers, yet future iterations of it look strikingly different, and then one day its progeny is both Wendy and the guy eating a Whopper at 6AM in the airport. Without identity continuity, adaptation becomes dissolution, and the system loses what makes it valuable while responding to environmental pressure. Standard financial analysis is blind to this property, yet it fundamentally determines which organizations can evolve successfully.
This property draws from autopoiesis theory, developed by biologists Humberto Maturana and Francisco Varela, who identified the mechanisms through which living systems maintain identity through continuous material change. Their insight is that identity persists through transformation patterns, not static elements.
Aristotle gave us a useful distinction between essential and accidental properties. Essential properties define core identity and value, whereas accidental properties are specific implementations that can change without altering fundamental identity. Most organizations get this backward, clinging to specific accidental practices while allowing their fundamental value proposition to drift.
No organization in history demonstrates this property more remarkably than the Catholic Church. Over two thousand years, it has adapted to countless social, political, and intellectual revolutions while maintaining its core theological identity. It would be unrecognizable to early churchgoers on its face, but its basic tenets would be familiar. This hasn’t happened by accident but through explicit mechanisms: distinguishing between dogma (essential doctrines) and discipline (adaptable practices), creating defined processes for incorporating new elements without compromising core identity (much as the Amish deliberate before adopting new technology), and maintaining symbolism that bridges changes with continuity.
Contrast this with revolutionary France, which attempted complete reinvention, even creating a new calendar and measurement system, only to experience complete identity collapse and eventual reversion. The abrupt abandonment of identity created system-wide incoherence despite rational intentions. We see a bit of that in the Western world today.
In the tech world, consider IBM versus Compaq. When the personal computer revolution threatened its mainframe business, IBM made a remarkable pivot from hardware manufacturer to service provider. What’s crucial is that they preserved their core identity – solving complex business problems for large enterprises – while completely transforming how they delivered that value.
Contrast this with Compaq, which began as a premium PC manufacturer known for engineering excellence. As the PC market commoditized, they abandoned this identity through acquisitions, trying to become a full-spectrum technology provider. In the process, they lost what made them distinctively “Compaq” – neither low-cost like Dell nor innovation-focused like Apple. Without a coherent identity to guide their evolution, they ended up as an incoherent collection of businesses eventually gobbled up by HP. The same goes for Gateway, DEC, and countless other computer manufacturers.
To evaluate this property, I examine specific mechanisms that enable identity-maintaining transformation:
- First, essential/accidental property distinction: Can people across the organization articulate which elements are essential to identity versus which are implementation details? Organizations with strong transformation capacity explicitly define this distinction and revisit it regularly, without ego or nostalgia.
- Second, identity narratives: What stories does the organization tell about major changes? And about its nature? Strong systems develop narratives connecting transformations to enduring purpose rather than presenting them as disconnected lurches. Southwest Airlines maintained its identity as a democratic transportation provider (until very recently) while evolving from a regional Texas airline into a national carrier through consistent identity narratives.
- Third, symbolic continuity: What symbols bridge changes with continuity? UPS maintained its reliability identity through dramatic technological change by preserving symbolic elements like the brown uniform while evolving delivery mechanisms.
The practical test: when facing disruption, ask whether the organization attempts to preserve specific practices at the expense of essential identity, or maintains essential identity while radically transforming practices. Apple preserved its essential identity as a creator of beautifully designed, user-friendly products while transforming from a computer company into a consumer device, services, and now financial company.
If you were to ask people in your organization, “What business are we really in?” and “What could change about our business without changing our fundamental identity?,” would you get coherent and consistent answers? Stakeholder expectations differ, of course, so context will determine how much stability is needed. And evolution is always necessary; even the DNA of the world’s most stable creatures (crocodiles) changes over time. The art lies in finding a balance.
5. Distributed Intelligence Coordination & Emergence
Complex challenges exceed the cognitive capacity of any individual. The most adaptive systems harness distributed intelligence, enabling local actors to respond to local conditions while maintaining system-wide coherence.
This property, invisible to standard financial analysis yet fundamental in determining an organization’s ability to respond to complex environments, comes from computer scientist and mathematician John von Neumann’s insight that no centralized processor, regardless of its capacity, can optimally respond to environmental complexity beyond certain thresholds. It is physically impossible for a center to know all the meaningful information at every level, so distributed decision-making becomes mathematically necessary for adaptation. When environmental variety exceeds central processing capacity — when your business is operating in different environments, you have many different kinds of teams, or you’re working on projects with different goals and time horizons — you’re tackling the “coordination problem” of game theory.
I recall a technology CEO – brilliant by any measure – who insisted on personally reviewing all major product decisions. When his company hit a growth inflection point, this arrangement produced something akin to a neurological disorder at the organizational level. Product development, bottlenecked through the CEO, slowed not just to a crawl but to a halt, despite the CEO’s remarkable intelligence. He simply did not have the time and memory to process all of the designs and code. The irony reached its apotheosis when he commissioned a Chief of Staff to help him make product calls faster, but adding a second person only created conflict and confusion about whose opinions and decisions mattered. Cracks began to show through bad reviews, rising sales costs, and greater employee turnover. The solution wasn’t increasing the processing power at the bottleneck but eliminating the bottleneck entirely by distributing decision authority.
Financial statements would have shown this company’s growth metrics only skipping a beat after several months of bottlenecking, but a closer analysis of decision rights would have created cause for remedy right away. The practical threshold for determining when to centralize versus distribute intelligence follows a simple heuristic: when the complexity and variety of local conditions exceeds the information-processing capacity of the center, distributed decision-making becomes necessary. Organizations like Haier have formalized this through their “microenterprise” model, where the company functions as an ecosystem of small, entrepreneurial units connected through internal market mechanisms rather than hierarchical control. Berkshire Hathaway lets all of its subsidiaries operate independently, only stepping in to rearrange capital when necessary.
I evaluate this property by examining specific coordination mechanisms:
- First, decision rights allocation: Where do decisions actually get made? Strong systems push decision rights to where information naturally exists. Anheuser-Busch InBev created “ownership zones” where local managers have complete decision authority within defined parameters, dramatically outperforming more centralized competitors.
- Second, information accessibility: Can local units access information across the organization? Amazon’s API-driven architecture allows teams to access data and capabilities from other units without centralized coordination, creating what Jeff Bezos calls “primitives” that enable distributed innovation.
- Third, alignment mechanisms: How does the system ensure distributed actions create coherent outcomes? Valve Corporation eliminated traditional management entirely, using peer feedback and profit-sharing to create alignment without hierarchy. Their productivity per employee is among the highest in the gaming industry.
To improve decision architecture:
- Push decision rights to where information naturally exists
- Create information systems that make local knowledge available throughout the organization
- Establish coordination mechanisms that ensure alignment without centralization
- Design incentives that reward both local responsiveness and system-wide coherence
- Implement boundary-spanning roles that connect distributed units without creating bottlenecks
Different contexts require different balancing points between system-wide coherence and local autonomy. Emergency services and high-risk projects need tighter coordination than research organizations because the consequences of incoherence differ substantially.
6. Enabling Constraints Design
Counterintuitively, the most adaptable systems don’t maximize freedom by minimizing constraints. Instead, they winnow the system down to what complexity theorists call “enabling constraints” — specific, deliberately designed constraints that actually enable greater functionality. Effective systems, people, and companies make their constraints clear: what they do and do not do, what they are and what they aren’t, whom they stand for and whom they don’t, and, in general, what kinds of interactions they will have. We are what we aren’t.
This principle comes from Alicia Juarrero, whose work shows that true generative power emerges not from unlimited freedom but from properly structured limitations. Consider Apple’s product development philosophy under Jobs compared to the sprawling product lines of competitors — fewer options, more rigorously constrained, yet driving greater innovation. The limitations of the iPhone’s form factor — just a screen and a home button — pushed the software interface to new heights to meet novel demands. Classical Greek architecture, with its precisely limited proportions of the Doric, Ionic, and Corinthian styles, enabled some of the greatest constructions civilization has ever known. Less impressively but perhaps more profitably, LEGO’s standardized connection system enables infinite creative combinations from a finite set of pieces, while providing just enough constraint to ensure structural integrity.
Southwest Airlines built their entire business model around enabling constraints: flying only one type of aircraft (enabling operational efficiency), serving only point-to-point routes (enabling faster turnarounds), and offering no assigned seating (enabling faster boarding). Each constraint isn’t a limitation but a deliberate design choice enabling superior performance within the constrained space. It also helps customers identify the company better, the same way an interesting hairstyle, a color palette, or a tattoo helps one identify a person better.
Linguists call this “generative grammar” — finite rules producing infinite valid expressions. In music, the constraints of scales and time signatures enable rather than limit creative expression, just as sonnets have rigid line, rhyme, and meter requirements, and popular books have standard print size and story structure limitations. Even in technology development, the concept of “opinionated software” demonstrates how well-designed constraints accelerate innovation by eliminating decision fatigue and creating compatible components. Without constraints in business, as with language, we’d make no sense and be impossible to interpret.
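The “finite rules, infinite expressions” idea can be illustrated with a toy context-free grammar. This is a hypothetical three-rule sketch, not a claim about any real linguistic model; the recursive noun-phrase rule is what makes the output unbounded.

```python
import itertools

# Toy grammar:
#   S  -> NP VP
#   NP -> "the dog" | "the CEO" | NP "and" NP   (recursive rule)
#   VP -> "decides" | "adapts"
# Three rules, yet the recursion yields ever more derivations.

def sentences(max_depth):
    """Enumerate derivations of S up to a given NP recursion depth.

    Counts raw derivations; at deeper levels some strings repeat
    via ambiguous parses, which is itself a classic grammar property.
    """
    nps = ["the dog", "the CEO"]
    for _ in range(max_depth):
        nps = nps + [f"{a} and {b}" for a, b in itertools.product(nps, nps)]
    return [f"{np} {vp}" for np in nps for vp in ("decides", "adapts")]

print(len(sentences(0)))  # 4 sentences with no recursion
print(len(sentences(1)))  # 12 at depth one
print(len(sentences(2)))  # 84 at depth two: growth without new rules
```

The fixed rule set never changes; only the depth of application does. That is the sense in which well-chosen constraints are generative rather than restrictive.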
This brings us to what Stuart Kauffman calls “the edge of chaos” — the fertile zone between rigid order and total disorder where complex adaptive systems thrive. Too few constraints lead to chaotic dissipation of energy; too many create brittle, unadaptable structures. The sweet spot lies in constraints that channel creative energy toward productive ends.
To evaluate this property, I examine specific mechanisms that enable constraint-based innovation:
First, boundary clarity: Can people across the organization articulate what’s in-bounds versus out-of-bounds? Apple’s employees can clearly explain what makes something “Apple-like” versus not, while companies with fuzzy boundaries waste energy on projects that don’t fit their essential identity. As a real-world test, randomly select employees from different levels and ask them to identify which potential products or initiatives would be appropriate for the company. High alignment in these answers reveals clear enabling constraints.
Second, constraint generativity: Do the organization’s constraints generate possibilities rather than merely restrict them? LEGO’s standardized connector system enables infinite combinations rather than limiting creation. In contrast, many corporate “guidelines” function purely as restrictions without enabling new possibilities. As a lollapalooza example, the NFL’s seemingly restrictive rulebook has generated endless strategic innovation within its constraints.
Third, constraint evolution mechanisms: How does the organization evaluate and evolve its constraints over time? Constraints that remain static despite changing environments become disabling rather than enabling. Nintendo has systematically evolved its creative constraints from playing cards to electronic games while maintaining its essential focus on “play,” while Kodak clung to film-based constraints long after they became restrictive.
The critique often leveled against this property is that constraints inevitably stifle innovation — after all, isn’t “thinking outside the box” the mantra of creative success? This argument fundamentally misunderstands the nature of creativity. True innovation nearly always emerges within constraints, not from their absence. Jazz improvisation happens within scales and chord progressions. Scientific breakthroughs occur within the constraints of experimental methods and physical laws. The box isn’t the enemy; it’s the very thing that makes creativity possible and meaningful. As G.K. Chesterton put it, “Art consists of limitation. The most beautiful part of every picture is the frame.”
Another objection comes from organizational complexity theorists who argue that different business units require different constraint architectures. This is correct but not contradictory. The art lies in designing nested constraints — organizational identity constraints that apply universally, within which different units may have unit-specific enabling constraints. Amazon’s overarching customer obsession constraint applies to all divisions, while AWS and retail have unit-specific constraint systems.
To develop better enabling constraints in your organization:
- Identify the minimal constraints necessary for system integrity
- Design constraints that focus energy rather than merely restricting options
- Establish clear boundaries while maximizing freedom within these boundaries
- Create constraint review processes that evaluate whether constraints remain enabling as conditions change
- Regularly conduct protected experimentation at the edges of established constraints
- Develop constraints that enable coordination without requiring centralization
The business implication is clear: freedom without appropriate constraints produces not innovation but paralysis. The most successful organizations identify the minimum constraints that maximize functional possibilities rather than imposing restrictions that limit adaptation. When implementing constraints, be mindful that creative endeavors typically benefit from outcome constraints with process freedom, while operational contexts often require process constraints with outcome freedom.
7. Multi-Temporal Integration
All complex systems operate across multiple timeframes simultaneously. Only the most successful integrate these different time horizons, preventing short-term optimization from undermining long-term viability. If you don’t balance time horizons, you are going to die. History and the laws of thermodynamics guarantee this as surely as the sun rises in the east, despite the fervent wishes of quarterly-obsessed executives and the endless parade of management consultants promising immediate transformations.
Step back in time and view the destruction of civilizations that had temporal myopia. Easter Island’s catastrophic deforestation is a clear example. Islanders harvesting too many trees collectively undermined the resource base essential for long-term survival, and by the time of Western discovery, the island’s population was in free fall. The Easter Islanders weren’t stupid — they were trapped in a system that made short-term optimization individually rational but collectively suicidal, a perfect metaphor for modern corporate behavior that would make even the most hardened MBA blush with recognition.
Conversely, the Iroquois Confederacy — after hundreds of years of tumult — instituted a “seven generations” principle requiring council decisions to consider impacts seven generations forward (roughly 150 years). This wasn’t merely philosophical; it was operationalized through specific council roles, decision procedures, and evaluation criteria that deliberately balanced immediate pressures against long-term consequences.
The barriers to multi-temporal integration are manifold. Psychologically, hyperbolic discounting — our tendency to undervalue future outcomes — isn’t just an individual bias but becomes institutionalized in organizational decision processes. When executive tenure averages under five years, short-term optimization becomes perfectly rational individual behavior, despite system-destroying consequences.
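Hyperbolic discounting has a standard functional form, V = A / (1 + kD), which collapses much faster at short delays than the exponential discounting assumed in finance textbooks. A quick comparison (k and r below are illustrative parameter choices, not empirical estimates):

```python
import math

# Present value of a $100 payoff under two discounting models.
#   hyperbolic:  V = A / (1 + k * delay)
#   exponential: V = A * exp(-r * delay)
# k = 1.0 and r = 0.10 are assumed values for illustration.

def hyperbolic(amount, delay_years, k=1.0):
    return amount / (1 + k * delay_years)

def exponential(amount, delay_years, r=0.10):
    return amount * math.exp(-r * delay_years)

for delay in (0, 1, 5, 10):
    print(delay, round(hyperbolic(100, delay), 1),
          round(exponential(100, delay), 1))
# Hyperbolic value halves after just one year (100 -> 50), while the
# exponential discounter still values it above 90. That steep early
# drop is why "this quarter" crowds out "five years from now".
```

The institutional point follows directly: if individual valuations already halve within a year, decision processes built from those valuations will systematically starve long horizons unless the structure compensates.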
Our incentive systems actively work against this property: When executive compensation is tied to short-term metrics and average tenure is under five years, we’ve created a perfect machine for mortgaging the future to enhance the present. No amount of exhortation about long-term thinking will overcome compensation systems that reward its opposite. In finance, a bird in the hand is worth two in the bush, even if it’s the wrong bird, or destroys the whole bird population.
This temporal myopia produces a kind of corporate equivalent of the “sorites paradox” — the puzzle of precisely how many grains of sand constitute a heap. Executives routinely justify small sacrifices of long-term capabilities for short-term gains, each seemingly inconsequential in isolation. “Surely delaying this maintenance by one quarter won’t matter,” they reason. But these “single grains” accumulate until one day, the organization wakes up to discover its capabilities have been so thoroughly depleted that recovery becomes nearly impossible.
The solution isn’t mystical long-termism, but practical multi-temporal integration: incentive systems that explicitly reward actions creating value across multiple time horizons simultaneously. This means elevating maintenance from cost to investment, viewing employee development as capability enhancement rather than expense, hiring and developing the team for the long-term, and treating customer relationships as appreciating assets rather than short-term revenue sources.
To evaluate this property, I examine specific mechanisms of temporal integration:
- First, decision timeframe distribution: What proportion of management attention and resources is allocated to different time horizons? Most organizations are pathologically present-focused, with 90%+ of management meetings devoted to quarterly or annual issues. Amazon famously inverts this, with Bezos insisting that executives spend at least 70% of their time thinking beyond the current year — a practice as foreign to most companies as Sanskrit is to a rodeo clown. The litmus test: Review the last ten executive meetings and categorize agenda items by timeframe. The distribution reveals true temporal priorities.
- Second, incentive structures: How do compensation and recognition systems balance short, medium, and long-term outcomes? The fundamental question isn’t whether long-term incentives exist, but whether they have teeth. Traditional stock options and three-year vesting schedules masquerade as “long-term” while actually reinforcing short-termism. Berkshire Hathaway managers have compensation structures tied to multi-decade performance, while most companies call a three-year plan “long-term” with a straight face.
- Third, institutional memory mechanisms: How does the organization preserve knowledge across time? Japanese temple builders have maintained wooden structures for over 1,300 years through apprenticeship systems that transmit knowledge across generations. Most corporations can barely remember what happened three years ago because their records are scattered across departed employees’ email archives and forgotten SharePoint sites.
- Fourth, consequence visibility: What mechanisms make long-term consequences of current decisions visible? When maintenance is deferred, do finance systems track the accumulating liability? When customer relationships are harvested for short-term gain, is the depletion of relationship capital measured? Most organizations track near-term metrics with exquisite precision while leaving long-term consequences unmeasured and thus effectively invisible.
Several critiques of multi-temporal integration deserve attention. First, market fundamentalists argue that capital markets efficiently price all timeframes, making explicit temporal balancing unnecessary. This argument collapses upon contact with reality — if markets efficiently priced long-term outcomes, we wouldn’t see the systematic destruction of long-term value for short-term gains that characterizes most public companies. The markets’ time-horizon is demonstrably shorter than optimal for societal welfare or even long-term shareholder returns.
Another objection comes from complex systems theorists who note that long-term prediction is fundamentally impossible in complex environments, making long-term planning futile. This misunderstands the nature of multi-temporal integration. The point isn’t to predict the future with precision but to protect the system’s adaptive capacity across multiple timeframes. We don’t need to know exactly what challenges a business will face in 20 years to know that depleting its financial resilience, talent development, and research capabilities will leave it vulnerable to whatever those challenges might be.
To strengthen multi-temporal thinking:
- Establish explicit decision processes that consider impacts across different time frames. This is obvious but almost never done.
- Create governance structures specifically responsible for long-term considerations. Someone needs to represent the future in current decisions, just as the Iroquois assigned specific council roles to represent future generations. These thinkers must be not just the wisest people you know, but the wisest people you could know.
- Design incentive systems that reward outcomes across different time horizons. Behavior follows incentives with remarkable reliability. If you reward only short-term results, you’ll get only short-term optimization.
This property connects directly to capital allocation. Effective capital allocation requires integration of multiple time horizons. The rare organization that excels at this temporal balancing act creates extraordinary value over time, which explains much of Berkshire Hathaway’s success despite its seemingly simple approach.
The Lollapalooza Effect and the Utility of Abstraction
The Lollapalooza Effect: When Systems Create Free Energy
I once watched mesmerized as a materials scientist demonstrated water’s peculiar stubbornness at phase transitions. Apply heat, and the temperature rises predictably, steadily, almost boringly—until suddenly, at precisely 212°F, something miraculous happens. The temperature stops rising entirely as the water transforms from liquid to vapor. The additional energy isn’t increasing temperature but creating a phase transformation—a fundamental shift in state that defies linear prediction. What was once bound by gravity and surface tension now expands freely, capable of driving turbines and powering engines with newfound properties that no examination of the liquid state could possibly predict. In labs across the world, physicists are deploying billions trying to replicate the sun’s power in metal chambers to achieve another phase transition to similar effect: limitless nuclear fusion. The problem isn’t getting hydrogen atoms close enough together—it’s creating the precise conditions where multiple forces synchronize perfectly. When they finally succeed, for a brief, magnificent moment, they don’t just get heat—they create a miniature star on Earth, releasing energy that defies every incremental expectation.
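The physics here checks out with textbook constants: warming a gram of water from 0°C to 100°C takes roughly 419 joules, while vaporizing it at 100°C takes about 2,260 joules, so the phase transition absorbs more than five times the energy of the entire climb to boiling. A quick sanity check:

```python
# Energy to warm 1 g of water from 0 C to 100 C vs. energy to vaporize it.
# Constants are standard approximate textbook values.
SPECIFIC_HEAT = 4.19       # joules per gram per degree C (liquid water)
LATENT_HEAT_VAP = 2260.0   # joules per gram at 100 C

heating_energy = SPECIFIC_HEAT * 100   # about 419 J, 0 C -> 100 C
vaporize_energy = LATENT_HEAT_VAP      # 2260 J, no temperature change at all

print(heating_energy)                    # roughly 419 J
print(vaporize_energy / heating_energy)  # roughly 5.4x: the transition dominates
```

All that invisible energy goes into reorganizing the system’s structure rather than raising its measured temperature, which is exactly the analogy the essay is reaching for.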
This, dear reader, is the perfect metaphor for what I call the “lollapalooza effect”—that explosive outcome when our seven system properties don’t merely add up but catalyze each other into something transcendent. When these properties combine with the right architecture, they don’t just enhance each other incrementally—they transform the entire system into something that operates by different rules altogether, creating what physicists would recognize as “free energy” in the system.
Now, the iron rule of nature is that 2+2=4. Except when it doesn’t. In complex systems with interacting elements, sometimes 2+2=16, and occasionally 2+2=1. That’s the lollapalooza effect in action.
I experienced this firsthand in my own life. When I was a young finance graduate, I made a decision that most people thought foolish—I deliberately abandoned a promising career and all of my college training to focus on entrepreneurship. I didn’t just dabble; I reconstructed my entire life system. I combined intensely focused reality processing (my “no BS” rule of examining facts objectively) with an evolutionary learning approach (reading as much as humanly possible and testing my findings), multi-temporal thinking (willingness to live in squalor for years), and distributed intelligence (partnering with people who complemented my weaknesses and inexperience).
The result wasn’t just an incremental improvement in my financial condition—it was a phase transition that transformed my entire life trajectory. This wasn’t just about making money. It was about creating a system where capabilities emerged that I couldn’t have achieved through any linear combination of separate efforts. It’s like the difference between a campfire and a nuclear reaction.
The philosophical question behind all this is profoundly important: Why does reality work this way? Why do complex systems generate these non-linear outcomes? I believe it comes down to the fundamental structure of the universe. Nature doesn’t optimize components; it optimizes interactions. Evolution didn’t separately perfect the heart, lungs, and brain and then assemble them. It optimized their interactions within the whole system. The result is capabilities that no collection of separate organs could achieve.
This is where abstraction proves its supreme utility. By discarding the overwhelming majority of data points in favor of these essential system properties, we can see patterns and make predictions that conventional analysis systematically misses. While others drown in quarterly metrics and industry benchmarks, proper abstraction lets us focus on the deep structures that actually determine outcomes.
The consulting profession is filled with people who create elaborate models that explain everything and predict nothing. I remember sitting through a presentation at a Fortune 500 company where the consulting team spent two hours walking through a framework with sixteen different quadrants, each with its own special term and color-coding. When I asked which quadrant contained the company’s most profitable segment, there was an uncomfortable silence followed by someone admitting they’d have to “get back to me on that.” They’d created a classification system so elaborate that they’d forgotten to check whether it actually mapped to business reality! And the company had paid millions for this nonsense.
Old Karl Popper would have identified this immediately as pseudoscience – a framework that can explain anything explains nothing. A genuinely useful abstraction must stick its neck out – it must make specific predictions that could potentially be wrong. Most business frameworks are carefully constructed to avoid exactly this kind of accountability.
Our system property framework doesn’t have this problem. It’s eminently testable. You can evaluate organizations using these properties and predict which will develop superior adaptive capacity. You can redesign organizations around these properties and observe whether their adaptability improves. The framework has tremendous explanatory power, making sense of patterns across domains that otherwise appear unrelated. And it has obvious practical utility, helping leaders and investors make better decisions by focusing on what actually determines long-term outcomes.
Take SpaceX. While the aerospace establishment insisted $100 million launches were an immutable floor, SpaceX delivered $60 million, then $50 million, then eventually sub-$30 million launches. This wasn’t marginally better execution—it was a fundamental phase transition in the system. Reality processing (refusing to accept “that’s how it’s always been done”), evolutionary learning (failing fast and adapting), and distributed intelligence (cross-functional problem solving) interacted to create problem-solving capabilities beyond what any collection of separate properties would predict.
The resulting capabilities didn’t just improve linearly with investment; they emerged explosively like a chemical reaction crossing its activation threshold. This explains why organizations with seemingly identical resources, talent, and markets experience wildly divergent outcomes. One creates fusion; the other mere combustion.
But here’s where it gets both fascinating and terrifying. These lollapalooza effects work with precisely the same power in reverse. When properties interact negatively, they don’t just create additive harm—they create catastrophic implosions beyond what any reasonable analysis would predict.
I’ve seen this in my personal life too. My friend Joe Pope taught me about alcoholism and how it creates a negative lollapalooza effect. The alcoholic doesn’t just experience separate problems with health, relationships, and work. Instead, these problems interact and amplify each other. Relationship problems increase drinking, which worsens health, which damages work performance, which creates financial stress, which further strains relationships, and on and on in a devastating downward spiral. That’s why AA’s system is so brilliant—it doesn’t just address drinking; it reconstructs the entire human system with new reality processing mechanisms, identity continuity, and social feedback loops.
Look at Boeing’s 737 MAX disaster. This wasn’t a simple failure of engineering or oversight. It was a lollapalooza effect in reverse—a situation where reality processing broke down (financial metrics displacing safety concerns), distributed intelligence fractured (headquarters overruling engineers’ warnings), and multi-temporal integration collapsed (short-term delivery pressures overwhelming long-term safety culture). These weren’t separate problems but a single complex system failure where negative properties amplified each other toward catastrophe.
The beauty and terror of lollapalooza effects is their multiplicative nature. When I was a young boy, my math teacher explained compound probability, saying, “Nate, you take the probability of one event, multiply it by the probability of another, and pretty soon you have a number approaching hopeless improbability.” But the same compounding logic applies to interacting system properties—except sometimes the effects multiply in your favor, creating seemingly miraculous capabilities, and sometimes they multiply against you, creating disasters beyond anything simple addition would predict.
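The arithmetic behind “2+2=16” and its dark twin is simply that interacting factors multiply while independent ones add. A sketch with made-up improvement factors (illustrative numbers, not data):

```python
from functools import reduce
from operator import mul

# Four system properties, each improving performance by a modest 15%.
# Treated as independent they add; interacting they compound.
factors = [1.15, 1.15, 1.15, 1.15]

additive = 1 + sum(f - 1 for f in factors)   # about 1.60x
multiplicative = reduce(mul, factors)        # about 1.75x, and the gap
                                             # widens with every factor

# The same logic runs in reverse: four safeguards that each work
# 90% of the time jointly succeed only about 66% of the time.
joint_reliability = reduce(mul, [0.9] * 4)

print(round(additive, 2), round(multiplicative, 2), round(joint_reliability, 4))
```

With four modest factors the gap between adding and compounding is small; with ten factors, or larger ones, the multiplicative path races away from the additive one, which is the whole lollapalooza point.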
The practical implications couldn’t be more profound: Stop thinking about improving individual functions in isolation—that’s the equivalent of rearranging deck chairs on the Titanic. Start architecting their interactions. Don’t just enhance your reality processing; ensure it feeds directly into evolutionary learning. Don’t just balance short and long-term thinking; ensure this temporal integration reinforces your enabling constraints.
Organizations harnessing positive lollapalooza effects don’t just outperform competitors; they make competition irrelevant by operating in an entirely different state—like comparing jet propulsion to horse-drawn carriages. Conversely, those suffering negative lollapaloozas don’t just underperform; they spectacularly self-destruct in ways that conventional analysis could never predict—like watching a seemingly solid building suddenly collapse when multiple support systems fail simultaneously.
As I told the directors at USG Corporation after they’d rejected an excellent acquisition opportunity: “You made a decision based on the reality that 2+2=4. But in this case, because of the particular circumstances and interaction effects, 2+2 would have equaled 6.” Understanding when and how properties create lollapalooza effects might be the most valuable insight any business leader, investor, or human being can possess.
Conclusion
There’s something deeply satisfying about finding the right level of abstraction for understanding. It’s like suddenly gaining x-ray vision—seeing through surface details to the essential structures beneath. Engineers call this “isomorphism”—when superficially different systems reveal identical underlying patterns. It’s world-changing, as Johannes Kepler must have felt when he discarded thousands of years of epicycle astronomy and realized planets move in ellipses, or when Darwin saw beyond the bewildering variety of finch beaks to the underlying principle of natural selection.
The most successful investors, entrepreneurs, and leaders share this ability to abstract properly. They may not use this terminology, but they instinctively focus on what fundamentally matters while ignoring the noise that distracts their competitors. Warren Buffett, for instance, has an almost supernatural ability to cut through complexity to the one or two factors that actually determine an investment’s success, while others drown in data that obscures more than it reveals. As Buffett likes to say, “The difference between successful people and very successful people is that very successful people say ‘no’ to almost everything.” I’d add that knowing what to say no to—and what to say yes to—depends entirely on your ability to abstract properly, to separate what fundamentally matters from what merely demands attention.
The framework I’ve presented here—understanding businesses as complex adaptive systems with seven essential properties whose interactions create lollapalooza effects—provides exactly this kind of abstraction. It gives you a lens for seeing beyond the noise of quarterly metrics and industry benchmarks to the deeper structures that determine long-term adaptability and success. But proper abstraction isn’t just a better analytical tool—it’s a competitive advantage of the highest order. While others chase the latest management fads or optimize for this quarter’s metrics, those who understand the true nature of complex adaptive systems can make decisions that create compounding advantages over time. It’s the difference between playing checkers and playing chess, or perhaps more aptly, between playing chess and inventing a new game altogether.
This is the secret to my businesses’ successes. My teammates and I don’t try to predict next quarter’s earnings or optimize for short-term stock price movements. We look for opportunities and partners and organizations with system properties that create sustainable adaptive advantage—entities that will still be generating benefits decades from now, regardless of what specific challenges they face along the way. As Buffett likes to say, “we want businesses that an idiot could run, because sooner or later, one will.” What he means is that we want businesses with such robust system properties that they can withstand even mediocre management, which I extend to include mediocre environments, as well.
Most business metrics suffer from what I call “temporal myopia”—a fixation on short-term signals even when long-term outcomes matter most. Quarterly earnings, daily active users, click-through rates—these are snapshots, not stories. They favor tactical optimization over strategic insight. Amazon famously avoided this trap for years as Bezos used free cash flow instead of profit as his lodestar, allowing the company to build system properties that created extraordinary long-term value while Wall Street scratched its head. The greatest businesses understand that, like a tree, they don’t grow faster just because you measure their height every day.
It’s important to recognize that despite their veneer of objectivity, metrics are not neutral instruments. They reflect power dynamics, cultural values, and implicit assumptions. Who decides what gets measured? What timeframes are privileged? These choices aren’t just technical—they’re expressions of authority that shape perception and behavior. As Facebook learned with its obsession with “time spent,” a metric optimized for attention extraction at the expense of user well-being, what you measure is what you center, and therefore what you control. The danger lies in how easily metrics can launder ideology, transforming subjective beliefs into apparent data-driven objectivity.
We must also acknowledge that not everything important can be measured—nor should it be. Trust among founders, product intuition, cultural resonance, deep insight—these may only emerge from qualitative synthesis, long conversations, or sometimes even irrational conviction. Steve Jobs famously rejected market research for early Apple products: “People don’t know what they want until you show it to them.” This isn’t an argument against rigor; it’s an argument for multi-modal cognition. Metrics should not crowd out imagination, narrative, and intuition. The best businesses build spaces for the unmeasurable to be valued—until they crystallize into something that can be seen. What we still lack in business is a discipline of measurement as robust as in physics or economics—a true epistemology of management. How do we construct metrics that serve as languages for navigating uncertainty rather than instruments of control? How do we design measurement systems that don’t just record the past but help us imagine and test future visions? These are the questions that separate those who use metrics as hammers from those who use them as compasses.
And here’s the truly fascinating thing about proper abstraction: it creates its own evidence. Unlike management fads that require evangelical faith to sustain them in the absence of results, genuine systemic improvements generate objective performance differentials that even the most ideologically committed traditionalists eventually cannot ignore. I observed this at a struggling manufacturer. The overall company was hamstrung by legacy systems, crushing debt, and entrenched leadership, but one division manager applied these principles within her limited domain. She created what I’ll call a “microclimate of rationality”—a small ecosystem operating on different principles than its surroundings. Within eighteen months, her division’s performance so dramatically outpaced the rest of the company that the board took notice. Within three years, she was CEO, and within five, the entire organization had adopted her approach. As one board member quipped, “We’re not sure why her division works so much better, but we’d be idiots not to copy it.”
Let me leave you with an observation that might seem pedestrian but has served me well: In a world relentlessly focused on addition—more metrics, more data, more complexity—the greatest breakthroughs often come through subtraction. Finding the elegant simplicity on the far side of complexity requires the courage to discard not just what’s obviously irrelevant, but what’s deceptively important-looking yet ultimately distracting. It’s like sculpture, where the artist removes everything that isn’t the statue. Or as Antoine de Saint-Exupéry put it, “Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.”
My journey into the wilderness of abstraction has produced an irony I rather enjoy: I’ve spent years studying complexity only to conclude that the essence of wisdom lies in finding simplicity on the other side. It’s a bit like the story of the Zen master who, after years of meditation, was asked to summarize his insights. “When I began, mountains were mountains and rivers were rivers,” he said. “As I studied, mountains ceased to be mountains and rivers ceased to be rivers. Now that I’ve attained enlightenment, mountains are again mountains and rivers are again rivers.” The difference, of course, is that you now see the mountains and rivers with a clarity impossible before the journey. Life is experience, and experience is reality.
And that’s what proper abstraction ultimately provides—not the illusion of understanding, but the genuine article. In a world drowning in data but starving for wisdom, this capacity to discern the essential from the incidental isn’t merely an intellectual parlor trick. It’s the fundamental meta-skill that separates those who merely participate in complex systems from those who comprehend and shape them. The difference between good companies and great ones, between adequate investors and extraordinary ones, between competent leaders and transformative ones, isn’t raw intelligence or access to data. It’s the ability to see what others miss by focusing on what truly matters—the system properties and their interactions that create extraordinary, lollapalooza outcomes.
Like water suddenly transforming into steam at exactly 212°F, organizations with the right system properties don’t just perform better—they operate in an entirely different state, with capabilities their competitors cannot match regardless of incremental improvements. Understanding, recognizing, designing for, and investing in this transformation may be the single most valuable insight I’ve gained in my long career. It has certainly been worth more to me than any 212-page McKinsey report or Harvard Business Review case study I’ve ever had to endure.
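The additive-versus-interactive distinction can be made concrete with a toy sketch (entirely hypothetical numbers and property scores, not any company’s real data): when system properties multiply rather than sum, a uniformly strong organization lands in a different regime, not merely at a higher score.

```python
# Toy model (hypothetical numbers): why interacting system properties
# behave like a phase change rather than a running total.

def additive_score(properties):
    # Incremental view: each property contributes independently.
    return sum(properties)

def interactive_score(properties):
    # Systemic view: properties compound multiplicatively, so weakness
    # anywhere caps the whole, and joint strength is superlinear.
    product = 1.0
    for p in properties:
        product *= p
    return product

mediocre = [0.6, 0.6, 0.6, 0.6]  # uniformly "okay" on four properties
aligned  = [0.9, 0.9, 0.9, 0.9]  # uniformly strong on the same four

# Additively, "aligned" looks about 50% better (3.6 vs 2.4).
# Multiplicatively, it is roughly 5x better (0.6561 vs 0.1296):
# a different state, not a bigger number.
print(additive_score(mediocre), additive_score(aligned))
print(round(interactive_score(mediocre), 4), round(interactive_score(aligned), 4))
```

The point of the sketch is only the shape of the comparison: the same modest per-property gap that a checklist would record as incremental shows up, once interactions are honored, as the kind of discontinuity the boiling-point metaphor describes.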
Here’s a peculiar observation I’ve been turning over for years but rarely shared: the builders of ancient Japanese temples, wooden structures that have now stood for over 1,300 years, embedded structural secrets that were never written down but passed from master to apprentice through direct experience. These temples have weathered earthquakes that leveled modern steel structures nearby. The most profound system knowledge often resists direct articulation—it must be absorbed, lived, and discovered. I’ve noticed that truly great business leaders operate with an intuitive understanding of these complex system properties without necessarily having the vocabulary to explain them. They navigate by feel, by pattern recognition too subtle to reduce to formulas.
This suggests something profound about knowledge itself. Perhaps the most valuable insights aren’t those that can be easily communicated in quarterly reports or PowerPoint decks, but those that emerge from the direct confrontation with reality over time. Like the master carpenter who can feel when wood is properly joined, the greatest business leaders sense when system properties are properly aligned without necessarily being able to produce the formula for their intuition. There may be fundamental limits to how much of this knowledge can be transferred directly versus discovered through experience. As you consider these system properties in your own organization, remember that understanding them intellectually is just the beginning. The real wisdom comes from observing them in action, feeling their interactions, and developing the intuitive sense for when they’re working in harmony and when they’re not.
As you finish reading this, I hope you’ll look at your own organizations, investments, and even personal systems through this lens. Ask not just what properties you possess, but how they interact. Search not just for additive improvements but for transformative combinations. And remember that in complex adaptive systems, the relationships between elements matter far more than the elements themselves. Or, to put it all as plain as a Texas farmhouse: When you find the right abstraction, complexity surrenders its secrets. And when you see how properties create lollapalooza effects, you gain a superpower that most of your competitors will never possess. The rest, as they say, is just accounting.