
Thinking in Systems – A Primer: Donella H. Meadows

Chapter 1 – Basics

A system is made up of three parts: components, interactions between the components, and a purpose. Take a tree as an example: the components include its branches, leaves, trunk, roots and cell structure; the interactions involve nutrient transport, water conservation, the leaves being connected to the branches and the tree shedding its leaves for winter. The purpose is to stay alive: to feed off the sun, to perpetuate itself and to reproduce. Systems can have multiple functions.

A key feature of many systems is information transfer. Examples include players yelling to each other during a soccer game, the chemical signals our stomach sends to our salivary glands, and the volume of shares traded each day on a stock exchange.

The effects of a system cannot be predicted just by looking at the elements that make up the system and their interactions – it is simply too complicated. The best way to understand a system is to watch what it does over a period of time (not what it says it does, as with a political system). Prediction of systems is fiendishly difficult – just ask weather forecasters and market predictors.

Systems can be nested within systems. A key reason the behaviour of a system is hard to predict is that the subsystems often have very different incentives from the bigger system. Take a university system, for example. The motive of the students is to study and get good grades; that of the professors is to publish papers; that of the vice chancellor is to make sure the university is renowned for its research. Yet these agents are misaligned: the professor has no incentive to teach well, and the vice chancellor has no incentive to make the professor teach well, or to have him focus on anything other than maximising his citation count. So the net result isn't a system that adds to the overall body of knowledge as much as it should; instead, it's a bit of a mess.

The components in a system can be switched out fairly easily, and the system still more or less functions as before. But if the relationships in a system change, the system itself will change. And if the function or purpose of the system changes, the system will very likely be completely different. (Think of universities, trees, soccer teams.)

A stock is the quantity of a resource: money, the growth of a tree, self-confidence. It goes up with inflows and down with outflows. We tend to focus on increasing inflows to raise the level of a stock more than on decreasing outflows. There is a delay between inflow/outflow rates and stock levels – things don't change instantly even when the rates change. It follows that understanding the rates of change in a system lets us calibrate how quickly we should expect the system to change.
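As a toy illustration of the stock/flow/delay idea (the numbers are invented, not from the book):

```python
# A minimal stock-and-flow sketch with made-up numbers: a bank balance
# (the stock) with constant monthly inflow and outflow rates.
def simulate_stock(initial, inflow, outflow, steps):
    """Return the stock level after each step under constant flows."""
    levels = []
    stock = initial
    for _ in range(steps):
        stock += inflow - outflow  # the net flow changes the stock each step
        levels.append(stock)
    return levels

# Even when the outflow exceeds the inflow, the stock drains gradually:
# the level lags behind the rates, which is the delay described above.
print(simulate_stock(initial=1000, inflow=100, outflow=120, steps=6))
# → [980, 960, 940, 920, 900, 880]
```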

Systems have feedback loops. A hot coffee interacts with the environment and becomes cooler; money grows in a bank account. The second is a self-perpetuating feedback loop: the more money you have, the faster it grows, and so the more money you have.

Feedback loops are everywhere. Thinking in systems means you don't look at things as simple cause and effect, but as an interplay of different elements of the system. If a bank lowers interest rates to control the economy, the economy also controlled the bank's decision to do so. The more a stock goes down, the more it continues to go down, in a self-perpetuating feedback loop. If A causes B, ask – does B also cause A?

Chapter 2 – The Systems Zoo

Systems have loops: reinforcing loops and balancing loops. A reinforcing loop makes the stock change faster and faster over time (like compound interest at a bank, which adds more to the stock as it grows). A balancing loop pulls the stock towards a constant – like depreciation pulling the value of assets towards zero, or the ambient temperature of a room pulling the temperature of a hot coffee down or a cold coffee up.
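The two loop types can be sketched in a few lines of Python (the parameters are invented for illustration):

```python
def reinforcing(stock, rate, steps):
    """Reinforcing loop: the change is proportional to the stock itself,
    so the stock grows faster as it gets bigger (compound interest)."""
    for _ in range(steps):
        stock += stock * rate
    return stock

def balancing(stock, target, rate, steps):
    """Balancing loop: each step closes part of the gap to a constant
    target (a hot coffee cooling towards room temperature)."""
    for _ in range(steps):
        stock += (target - stock) * rate
    return stock

savings = reinforcing(1000, 0.05, steps=10)               # grows ever faster
coffee = balancing(90.0, target=20.0, rate=0.2, steps=10)  # approaches 20°C
```

Note the structural difference: in the reinforcing loop the change depends on the stock itself, while in the balancing loop it depends on the gap between the stock and the target.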

When you hear a prediction about a system (business forecast, stock forecast, weather forecast, political change, population forecast) here are three questions you should think about:

  1. Will the system flows that would produce this output change the way the prediction says? For a population forecast, is it likely that the birth rate will decrease and the mortality rate will also decrease?
  2. If the system flows do change, will that necessarily cause the output to change as predicted, or is there a range of outcomes? Take a weather system – if there is increased rainfall and wind speed, does that necessarily mean a cyclone is coming, or could it be something less destructive?
  3. What factors are causing the system flows to change? For a stock forecast that relies on investor sentiment changing, what would drive an increase or decrease in investor fear or bullishness?

The behaviour of a system will change as the feedback loops governing it change. Often, unexpected behaviour can be anticipated by looking at shifts in the feedback loops. Systems with similar feedback-loop structures can be grouped together and used to learn from each other. Capital assets and world population share the same loop structure – each has a balancing loop (depreciation, mortality) and a reinforcing loop (investment, birth rate).
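The shared structure can be made concrete: both population and capital reduce to a stock with a proportional inflow (reinforcing) and a proportional outflow (balancing). The rates below are invented for illustration:

```python
def grow(stock, in_rate, out_rate, steps):
    """One structure for both systems: an inflow proportional to the stock
    (births / investment) and an outflow proportional to the stock
    (deaths / depreciation)."""
    for _ in range(steps):
        stock += stock * in_rate - stock * out_rate
    return stock

# Same loop structure, same behaviour: the two trajectories are identical.
population = grow(1000, in_rate=0.03, out_rate=0.01, steps=20)
capital = grow(1000, in_rate=0.03, out_rate=0.01, steps=20)
```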

Delays in a system make the level of a stock oscillate. A system can have many delays, and changing the length of a delay can produce counterintuitive results. It is difficult to predict the effect of changing a delay period.
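A small sketch (invented parameters) of how a delay turns a well-behaved balancing loop into an oscillator: the loop corrects towards a target, but acts on a stock reading that is several steps old.

```python
from collections import deque

def delayed_balancing(stock, target, rate, delay, steps):
    """Balancing loop that corrects based on a delayed reading of the stock."""
    history = deque([stock] * delay, maxlen=delay)
    levels = []
    for _ in range(steps):
        perceived = history[0]   # the reading from `delay` steps ago
        history.append(stock)
        stock += (target - perceived) * rate
        levels.append(stock)
    return levels

# Acting on stale information, the loop overshoots the target of 100,
# then swings back past it in the other direction.
print(delayed_balancing(stock=0.0, target=100.0, rate=0.5, delay=3, steps=12))
```

With a three-step delay and a strong correction rate, the stock climbs well past the target before the loop even notices, then over-corrects on the way back down – oscillation from nothing but a delay.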

Systems which are constrained by a non-renewable resource display a pattern. These systems have the following properties:

What happens:

Systems constrained by a renewable resource are a little different. Things are the same as above, except:

Chapter 3 – Why systems work so well

Systems have three traits that serve them well: resilience, self-organisation and hierarchy.


Some systems are resilient, meaning they have the ability to heal themselves. Resilient systems are built of many feedback loops, which means that even if one feedback loop fails, there are others to take its place – the system has redundancy. The system can handle many different states with ease, and if it is damaged it is able to restore itself. Examples: trees in Australian habitats (can survive bushfires, can reproduce quickly if needed, high(?) rate of reproduction), or the human body (can survive famine/feast, protects against disease, able to learn to function without some missing parts).

A static system is not necessarily a resilient one (link to fragility and antifragility). Systems are often manipulated to increase productivity or stability, and this often greatly harms their resilience. Examples: cows injected with hormones to increase milk production at the expense of their immune systems and lifespans; European forests replaced with a single species of tree, depleting soil nutrients and making them vulnerable to air pollution.


Some systems can adjust themselves and become more complicated over time – self-organisation. Think of trees adapting to become bushfire-proof, humans adapting to a vegetarian diet, evolutionary systems and strategies. This complex behaviour is very beneficial to the system.


Systems are often composed of smaller systems making up a whole. You can affect a smaller system in isolation, but the follow-on effects may reach the bigger systems it is part of. Hierarchies are useful because the individual components of the system don't have to keep track of all the system's information; they can delegate that to other parts of the hierarchy. Hierarchical systems are also less fragile and more decomposable – you can substitute parts for each other, or take a unit and put it somewhere else. Relationships inside a part of the hierarchy are stronger than relationships between parts – like someone talking to their own team at work versus talking to other business units.

When individual units of a system pursue goals different from those of the overall system, bad things happen – like researchers at uni not teaching the students who are there to learn.

Chapter 4 – Why systems surprise

There are a few reasons why systems don’t work like we expect.


Bounded Rationality


Beguiling events

Linear minds in a non-linear world


Chapter 5 – System traps

There are patterns of how systems don’t react the way that we expect them to.

Tragedy of the commons

Trend towards low performance

Policy change

Seeking the wrong goal

Gaming the rules


Shifting the burden to the intervenor

Success to the successful

Chapter 6 – Finding leverage points in systems

How do you fix a system? From least effective to most effective:

12 – Changing numbers – constants and parameters

11 – Buffers – the sizes of stabilising stocks relative to their flows

10 – Changing the structure of the system

9 – Delays – lengths of time relative to the rate of system changes

8 – Balancing feedback loops

7 – Reinforcing feedback loops

6 – Information flows – who does and who doesn't have information

5 – Changing the rules – incentives, punishments and constraints

Systems practices

Get the beat of the system

Expose your mental models to the light of day

Honour, respect and distribute information

Watch what language you use

Pay attention to what is important, not just what is quantifiable

Make feedback policies for feedback systems

Listen to the wisdom of the system

Locate responsibility within the system

Celebrate complexity

Expand time horizons

Defy the disciplines

Don’t erode the goal of goodness
