
Thinking in Systems by Donella H. Meadows

  • Writer: Lars Christensen
  • 15 hours ago
  • 14 min read

I finished this book in November 2025 and rate it 9/10.


Why you should read this book:

This book was written by one of the pioneers of systems thinking. It is a primer, full of great, easy-to-understand examples. Systems are all around us, and by mapping them out you will be able to see feedback loops and challenge your mental models in business as well as in life.


Get your copy here.


🚀 The book in three sentences

  1. A great introduction to systems thinking.

  2. Look for feedback loops and delays.

  3. There is a limit to growth. And don't try to fix bad policies; pivot instead.


📝 My notes and thoughts

  • P15. The function of a thermostat-furnace system is to keep a building at a given temperature. One function of a plant is to bear seeds and create more plants. One purpose of a national economy is, judging from its behavior, to keep growing larger. An important function of almost every system is to ensure its own perpetuation. System purposes need not be human purposes and are not necessarily those intended by any single actor within the system. In fact, one of the most frustrating aspects of systems is that the purposes of subunits may add up to an overall behavior that no one wants. No one intends to produce a society with rampant drug addiction and crime, but consider the combined purpose and consequent action of the actors involved:

    • desperate people who want quick relief from psychological pain

    • farmers, dealers, and bankers who want to earn money

    • pushers who are less bound by civil law than are the police who oppose them

    • governments that make harmful substances illegal and use police power to interdict them

    • wealthy people living in close proximity to poor people

    • non-addicts who are more interested in protecting themselves than in encouraging the recovery of addicts

  • P17. To ask whether elements, interconnections, or purpose are most important in a system is to ask an unsystematic question. All are essential. All interact. All have their roles. But the least obvious part of the system, its function or purpose, is often the most crucial determinant of the system's behavior. Interconnections are also critically important. Changing relationships usually changes system behavior. The elements, the parts of the system we are most likely to notice, are often (not always) least important in defining the unique characteristics of the system—unless changing an element also results in changing relationships or purpose. Changing just one leader at the top—from a Brezhnev to Gorbachev, or from a Carter to a Reagan—may or may not turn an entire nation in a new direction, though its land, factories, and hundreds of millions of people remain exactly the same. A leader can make that land and those factories and people play a different game with new rules, or can direct the play toward a new purpose. And conversely, because land, factories, and people are long-lived, slowly changing, physical elements of the system, there is a limit to the rate at which any leader can turn the direction of a nation.

  • P22. A company can build up a larger workforce by hiring, or it can do the same thing by reducing the rates of quitting and firing. These two strategies may have very different costs. The wealth of a nation can be boosted by investment to build up a larger stock of factories and machines. It also can be boosted, often more cheaply, by decreasing the rate at which factories and machines wear out, break down, or are discarded.

  • P24. Most individual and institutional decisions are designed to regulate the levels in stocks. If inventories rise too high, then prices are cut, or advertising budgets are increased, so that sales will go up and inventories will fall. If the stock of food in your kitchen gets low, you go to the store. As the stock of growing grain rises or fails to rise in the fields, farmers decide whether to apply water or pesticide, grain companies decide how many barges to book for the harvest, speculators bid on future values of the harvest, and cattle growers build up or cut down their herds. Water levels in reservoirs cause all sorts of corrective actions if they rise too high or fall too low. The same can be said for the stock of money in your wallet, the oil reserves owned by an oil company, the pile of wood chips feeding a paper mill, and the concentration of pollutants in a lake. People monitor stocks constantly and make decisions and take action designed to raise or lower stocks or to keep them within acceptable ranges. Those decisions add up to the ebbs and flows, successes and problems, of all sorts of systems. Systems thinkers see the world as a collection of stocks, along with the mechanisms for regulating the levels in the stocks by manipulating flows. That means systems thinkers see the world as a collection of "feedback processes."
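
The stock-regulation pattern above can be sketched in a few lines of Python. This is my own toy illustration, not an example from the book: a decision rule monitors a stock (the food in your kitchen) and triggers an inflow (shopping) when the stock drifts too low.

```python
# A stock changes only through its flows; decision rules watch the
# stock and adjust a flow. Pantry numbers are illustrative, not from
# the book.

def week_of_meals(pantry=10, meals_per_day=3,
                  restock_threshold=5, restock_amount=15):
    """Track a food stock for a week under a simple restocking rule."""
    history = []
    for _ in range(7):
        pantry -= meals_per_day            # outflow: meals eaten
        if pantry < restock_threshold:     # the rule monitors the stock...
            pantry += restock_amount       # ...and triggers an inflow: shopping
        history.append(pantry)
    return history
```

The point is the shape of the rule, not the numbers: the decision acts on the level of the stock, and its effect arrives through a flow.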

  • P28. A balancing feedback loop opposes whatever direction of change is imposed on the system. If you push a stock too far up, a balancing loop will try to pull it back down. If you shove it too far down, a balancing loop will try to bring it back up. Here's another balancing feedback loop that involves coffee, but one that works through physical law rather than human decision. A hot cup of coffee will gradually cool down to room temperature. Its rate of cooling depends on the difference between the temperature of the coffee and the temperature of the room. The greater the difference, the faster the coffee will cool. The loop works the other way too—if you make iced coffee on a hot day, it will warm up until it has the same temperature as the room.
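
The coffee example is Newton's law of cooling acting as a balancing loop: the flow (heat exchange) is proportional to the gap between the stock (coffee temperature) and its goal (room temperature). A minimal Python sketch, where the cooling constant k is an assumed illustrative value:

```python
# Newton's law of cooling as a balancing feedback loop: the flow
# (heat loss) is proportional to the gap between the stock (coffee
# temperature) and its goal (room temperature). k is an assumed
# illustrative constant, not a value from the book.

def simulate_cooling(coffee_temp, room_temp=20.0, k=0.1, steps=60):
    """Step the coffee temperature toward room temperature."""
    history = [coffee_temp]
    for _ in range(steps):
        gap = coffee_temp - room_temp   # the discrepancy the loop acts on
        coffee_temp -= k * gap          # bigger gap -> faster change
        history.append(coffee_temp)
    return history

hot = simulate_cooling(90.0)    # hot coffee cools toward the room
iced = simulate_cooling(5.0)    # iced coffee warms toward the room
```

The same loop drives both runs: it opposes whatever direction of change exists, which is exactly what makes it "balancing."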

  • P31. Reinforcing loops are found wherever a system element has the ability to reproduce itself or to grow as a constant fraction of itself. Those elements include populations and economies. Remember the example of the interest-bearing bank account? The more money you have in the bank, the more interest you earn, which is added to the money already in the bank, where it earns even more interest.

  • P33. Hint on reinforcing loops and doubling down: Because we bump into reinforcing loops so often, it is handy to know this shortcut: The time it takes for an exponentially growing stock to double in size, the "doubling time," equals approximately 70 divided by the growth rate (expressed as a percentage). Example: If you put $100 in the bank at 7% interest per year, you will double your money in 10 years (70/7=10). If you get only 5% interest, your money will take 14 years to double.
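
The rule of 70 approximates the exact doubling time, which for compound growth is log 2 / log(1 + r). A quick check in Python:

```python
import math

def doubling_time_rule_of_70(rate_percent):
    """Approximate doubling time: 70 divided by the growth rate in percent."""
    return 70 / rate_percent

def doubling_time_exact(rate_percent):
    """Exact doubling time for compound growth at rate_percent per period."""
    return math.log(2) / math.log(1 + rate_percent / 100)

# At 7% the rule gives 10 years; the exact value is about 10.24 years.
print(doubling_time_rule_of_70(7), doubling_time_exact(7))
```

The approximation works because 100 · ln 2 ≈ 69.3, and 70 is close enough while being easy to divide in your head.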

  • P39. The information delivered by a feedback loop can only affect future behavior; it can't deliver a signal fast enough to correct the behavior that drove the current feedback. A person in the system who makes a decision based on the feedback can't change the behavior of the system that drove the current feedback; the decisions he or she makes will affect only future behavior. Why is that important? Because it means there will always be delays in responding. It says that a flow can't react instantly to a flow. It can react only to a change in a stock, and only after a slight delay to register the incoming information. In the bathtub, it takes a split second of time to assess the depth of the water and decide to adjust the flows. Many economic models make a mistake in this matter by assuming that consumption or production can respond immediately, say, to a change in price. That's one of the reasons why real economies tend not to behave exactly like many economic models.

  • P47. Questions for testing the value of a model:

    • Are the driving factors likely to unfold this way?

    • If they did, would the system react this way?

    • What is driving the driving factors?

  • P57. Changing the delays in a system can make it much easier or much harder to manage. You can see why systems thinkers are somewhat fanatic about the subject of delays. We're always on the alert to see where delays occur in systems, how long they are, and whether they are delays in information streams or in physical processes. We can't begin to understand the dynamic behavior of systems unless we know where and how long the delays are. And we are aware that some delays can be powerful policy levers. Lengthening or shortening them can produce major changes in the behavior of systems.

  • P76. The human body is an astonishing example of a resilient system. It can fend off thousands of different kinds of invaders, it can tolerate wide ranges of temperature and wide variations in food supply, it can reallocate blood supply, repair rips, gear up or slow down metabolism, and compensate to some extent for missing or defective parts. Add to it a self-organizing intelligence that can learn, socialize, design technologies, and even transplant body parts, and you have a formidably resilient system—although not infinitely so, because, so far at least, no human body-plus-intelligence has been resilient enough to keep itself or any other body from eventually dying.

  • P90. Suppose you knew nothing at all about thermostats, but you had a lot of data about past heat flows into and out of the room. You could find an equation telling you how those flows have varied together in the past, because under ordinary circumstances, being governed by the same stock (temperature of the room), they do vary together. Your equation would hold, however, only until something changes in the system's structure—someone opens a window or improves the insulation, or tunes the furnace, or forgets to order oil. You could predict tomorrow's room temperature with your equation, as long as the system didn't change or break down. But if you were asked to make the room warmer, or if the room temperature suddenly started plummeting and you had to fix it, or if you wanted to produce the same room temperature with a lower fuel bill, your behavior-level analysis wouldn't help you. You would have to dig into the system's structure. And that's one reason why systems of all kinds surprise us. We are too fascinated by the events they generate. We pay too little attention to their history. And we are insufficiently skilled at seeing in their history clues to the structures from which behavior and events flow.

  • P92. Good system story.

  • P101. Bread will not rise without yeast, no matter how much flour it has. Children will not thrive without protein, no matter how many carbohydrates they eat. Companies can't keep going without energy, no matter how many customers they have—or without customers, no matter how much energy they have. Rich countries transfer capital or technology to poor ones and wonder why the economies of the receiving countries still don't develop, never thinking that capital or technology may not be the most limiting factors. Economics evolved in a time when labor and capital were the most common limiting factors to production. Therefore, most economic production functions keep track only of these two factors (and sometimes technology). As the economy grows relative to the ecosystem, however, and the limiting factors shift to clean water, clean air, dump space, and acceptable forms of energy and raw materials, the traditional focus on only capital and labor becomes increasingly unhelpful. One of the classic models taught to systems students at MIT is Jay Forrester's corporate-growth model. It starts with a successful young company, growing rapidly. The problem for this company is to recognize and deal with its shifting limits—limits that change in response to the company's own growth. The company may hire salespeople, for example, who are so good that they generate orders faster than the factory can produce. Delivery delays increase, and customers are lost, because production capacity is the most limiting factor. So the managers expand the capital stock of production plants. New people are hired in a hurry and trained too little. Quality suffers, and customers are lost because labor skill is the most limiting factor. So management invests in worker training. Quality improves, new orders pour in, and the order-fulfillment and record-keeping system clogs. And so forth. 
There are layers of limits around every growing plant, child, epidemic, new product, technological advance, company, city, economy, and population. Insight comes not only from recognizing which factor is limiting, but from seeing that growth itself depletes or enhances limits and therefore changes what is limiting. The interplay between a growing plant and the soil, a growing company and its market, a growing economy and its resource base, is dynamic. Whenever one factor ceases to be limiting, growth occurs, and growth itself changes the relative scarcity of factors until another becomes limiting. To gain a real understanding of and control over the growth process, you have to identify the next potential limiting factor.

  • P114. The alternative to overpowering policy resistance is so counterintuitive that it's usually unthinkable. Let go. Give up ineffective policies. Let the resources and energy spent on both enforcing and resisting be used for more constructive purposes. You won't get your way with the system, but it won't go as far in a bad direction as you think, because much of the action you were trying to correct was in response to your own action. If you calm down, those who are pulling against you will calm down too. This is what happened in 1933 when Prohibition ended in the United States; the alcohol-driven chaos also largely ended. That calming down may provide the opportunity to look more closely at the feedback within the system, to understand the bounded rationality behind it, and to find a way to meet the goals of the participants in the system while moving the state of the system in a better direction.

  • P115. The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing. The most familiar examples of this harmonization of goals are the mobilization of economies during wartime, or recovery after war or natural disasters.

  • P120. Life is full of mutual-coercion arrangements, most of them so ordinary you hardly stop to think about them. Every one of them limits the freedom to abuse a commons, while preserving the freedom to use it. For example:

    • The common space in the center of a busy intersection is regulated by traffic lights. You can't drive through whenever you want to. When it is your turn, however, you can pass through more safely than would be possible if there were an unregulated free-for-all.

    • Use of common parking spaces in downtown areas is parceled out by meters, which charge for a space and limit the time it can be occupied. You are not free to park wherever you want for as long as you want, but you have a higher chance of finding a parking space than you would if the meters weren't there.

    • You may not help yourself to the money in a bank, however advantageous it might be for you to do so. Protective devices such as strongboxes and safes, reinforced by police and jails, prevent you from treating a bank as a commons. In return, your own money in the bank is protected.

    • You may not broadcast at will over the wavelengths that carry radio and television signals. You must obtain a permit from a regulatory agency. If your freedom to broadcast were not limited, the airwaves would be a chaos of overlapping signals.

    • Many municipal garbage systems have become so expensive that households are now charged for garbage disposal depending on the amount of garbage they generate—transforming the previous commons to a regulated pay-as-you-go system.

  • P127. Anyone who has played the game of Monopoly knows the success-to-the-successful system. All players start out equal. The ones who manage to be first at building "hotels" on their property are able to extract "rent" from the other players, which they can then use to buy more hotels. The more hotels you have, the more hotels you can get. The game ends when one player has bought up everything, unless the other players have long ago quit in frustration. Once, our neighborhood had a contest with a $100 reward for the family that put up the most impressive display of outdoor Christmas lights. The family that won the first year spent the $100 on more Christmas lights. After that, the family won three years in a row, with their display getting more elaborate every year, and the contest was suspended. To him that hath shall be given. The more the winner wins, the more they can win in the future. If the winning takes place in a limited environment, such that everything the winner wins is extracted from the losers, the losers are gradually bankrupted, or forced out, or starved.
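
The Christmas-lights story is a reinforcing loop you can write down directly (the prize and starting counts below are made-up numbers, not from the book): whoever is ahead wins the prize and reinvests it, pulling further ahead every round.

```python
def contest(rounds=3, prize=100):
    """Success to the successful: the winner's prize buys next year's edge."""
    lights = {"A": 110, "B": 100}        # family A starts slightly ahead
    for _ in range(rounds):
        winner = max(lights, key=lights.get)
        lights[winner] += prize          # winnings are reinvested in more lights
    return lights
```

With even a tiny initial edge and full reinvestment, the gap only grows; the losing family's best move is to quit, which is why the neighborhood suspended the contest.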

  • P138. One of the most powerful ways to influence the behavior of a system is through its purpose or goal. That's because the goal is the direction-setter of the system, the definer of discrepancies that require action, the indicator of compliance, failure, or success toward which balancing feedback loops work. If the goal is defined badly, if it doesn't measure what it's supposed to measure, if it doesn't reflect the real welfare of the system, then the system can't possibly produce a desirable result. Systems, like the three wishes in the traditional fairy tale, have a terrible tendency to produce exactly and only what you ask them to produce. Be careful what you ask them to produce. If the desired system state is national security, and that is defined as the amount of money spent on the military, the system will produce military spending. It may or may not produce national security. In fact, security may be undermined if the spending drains investment from other parts of the economy, and if the spending goes for exorbitant, unnecessary, or unworkable weapons. If the desired system state is good education, measuring that goal by the amount of money spent per student will ensure that money is spent per student. If the quality of education is measured by performance on standardized tests, the system will produce performance on standardized tests. Whether either of these measures is correlated with good education is at least worth thinking about.

  • P154. Examples of strengthening balancing feedback controls to improve a system's self-correcting abilities include:

    • preventive medicine, exercise, and good nutrition to bolster the body's ability to fight disease,

    • integrated pest management to encourage natural predators of crop pests,

    • the Freedom of Information Act to reduce government secrecy,

    • monitoring systems to report on environmental damage,

    • protection of whistleblowers, and

    • impact fees, pollution taxes, and performance bonds to recapture the externalized public costs of private benefits.

  • P165. Leverage points to intervene in a system:

    • 10) Stock + Flow structure

    • 9) Delays

    • 8) Balancing feedback loops

    • 7) Reinforcing feedback loops

    • 6) Information flows

    • 5) Rules and constraints

    • 4) Self-organization: add, change, or evolve system structure

    • 3) Goal or purpose of the system

    • 2) Paradigm shifts - What we believe

    • 1) Go in a different direction

  • P167. We had many earnest discussions on the topic of "implementation," by which we meant "how to get managers, mayors, and agency heads to follow our advice." The truth was, we didn't even follow our own advice. We gave learned lectures about the dynamics of eroding goals and eroded our own jogging programs. We warned against the traps of escalation and shifting the burden, and then created them in our own marriages. Social systems are the external manifestations of cultural thinking patterns and of profound human needs, emotions, strengths, and weaknesses. Changing them is not as simple as saying "now all change," or of trusting that he who knows the good shall do the good. We ran into another problem. Our systems insights helped us understand many things we hadn't understood before, but they didn't help us understand everything. In fact, they raised at least as many questions as they answered. Like all the other lenses humanity has developed with which to peer into macrocosms and microcosms, this one too revealed wondrous new things, many of which were wondrous new mysteries. The mystery our new tool revealed lay especially within the human mind, heart, and soul. Here are just a few of the questions that were prompted by our insights into how systems work. A systems insight... can raise more questions!

  • P180. The thing to do when you don't know is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. "Stay the course" is only a good idea if you're sure you're on course. Pretending you're in control even when you aren't is a recipe not only for mistakes, but for not learning from mistakes. What's appropriate when you're learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it's leading.



© 2025 by Lars Christensen
