Chapter 1 – Basics
A system is made up of three parts: components, interactions between the components and a purpose. Take a tree as an example: the components include its branches, leaves, trunk, roots and cell structure; the interactions involve nutrient passing, water conservation, the leaves being connected to the branches and the tree shedding leaves for winter. The purpose is to stay alive; to feed off the sun; to perpetuate and to reproduce. Systems can have multiple functions.
A key feature of many systems is information transfer. Examples include players yelling to each other during a soccer game, the chemical signals our stomach passes to our saliva, and the volume of shares traded that day on a stock exchange.
The effects of a system cannot be predicted just by looking at the elements that make up the system and their interactions – the whole is simply too complicated. The best way to understand a system is to look at what it does over a period of time (not what it says it does, as with a political system). Prediction of systems is fiendishly difficult – just ask weather forecasters and market predictors.
Systems can be nested within systems. A key reason that the behaviour of a system is hard to predict is that the sub-systems often have very different incentives from the bigger system. Take a university system for example. The motive of the students is to study and get good grades; that of the professors is to publish papers; that of the vice chancellor is to make sure the university is renowned for its research. Yet these agents are misaligned; the professor has no incentive to teach well, and the vice chancellor has no incentive to make the professor teach well, or to have him focussed on anything other than maximising his citation count. So the net overall impact isn’t a system which adds to the overall body of knowledge as much as it should; instead, it’s a bit of an overall mess.
The components in a system can be switched out pretty easily; the system still more or less functions as is. But, if the relationships in a system change, the system itself will change. And, if the function or purpose of the system changes, it is extremely likely that the system will be completely different. (university, trees, soccer teams)
A stock is the quantity of a resource: money, tree growth, self-confidence. It goes up with inflows and down with outflows. We have a tendency to focus on increasing inflows to increase a stock more than we focus on decreasing the outflows. There is a delay between stock levels and inflow/outflow rates – things don’t change instantly even if the inflow/outflow rates change. It follows that understanding the rate of change of the system allows us to calibrate how quickly we should expect the system to change.
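The stock-and-flow idea above can be sketched in a few lines of Python (the numbers are invented for illustration): the stock only moves by its net flow each step, so the level lags behind any change in the rates.

```python
def simulate_stock(initial, inflow, outflow, steps):
    """A stock changes only by (inflow - outflow) each time step."""
    level = initial
    history = [level]
    for _ in range(steps):
        level += inflow - outflow
        history.append(level)
    return history

# A 100-litre tub filling at 5 L/min and draining at 3 L/min: the net
# flow is positive from minute one, but the level only creeps upward --
# the stock responds gradually to its flow rates.
levels = simulate_stock(initial=100, inflow=5, outflow=3, steps=10)
print(levels)  # 100, 102, 104, ... 120
```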
Systems have feedback loops. A hot coffee will interact with the environment and become cooler; money will grow in a bank account. The second example is an example of a self-perpetuating feedback loop; the more money you have, the quicker your money grows, and the more money you have.
Feedback loops are everywhere. Thinking in systems means that you don’t look at things in terms of cause and effect, but rather as an interplay of different elements of the system. If a bank lowers interest rates to control the economy, we see that the economy also controlled the decision of the bank to do that. The more a stock goes down, the more it continues to go down in a self-perpetuating feedback loop. If A causes B, think about it – does B also cause A?
Chapter 2 – The Systems Zoo
Systems have loops: reinforcing loops and balancing loops. A reinforcing loop causes a stock to grow faster and faster over time (like compound interest at a bank, which adds more to the stock the higher it gets). A balancing loop will pull the amount of the stock towards a constant – like depreciation pulling the value of assets towards zero, or the ambient temperature of a room pulling the temperature of a cup of hot coffee down or a cup of cold coffee up.
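A quick sketch of the two loop types in Python (toy numbers, not from the book): the reinforcing loop's change is proportional to the stock itself, while the balancing loop's change is proportional to the gap from a target.

```python
def reinforcing(stock, rate, steps):
    """Reinforcing loop: growth proportional to the stock (compound interest)."""
    history = [stock]
    for _ in range(steps):
        stock += stock * rate          # the bigger the stock, the bigger the step
        history.append(stock)
    return history

def balancing(stock, target, rate, steps):
    """Balancing loop: change proportional to the gap from a target (cooling coffee)."""
    history = [stock]
    for _ in range(steps):
        stock += (target - stock) * rate   # the gap shrinks every step
        history.append(stock)
    return history

savings = reinforcing(1000, 0.05, 20)   # grows faster the bigger it gets
coffee = balancing(90, 20, 0.2, 20)     # 90C coffee drifts toward a 20C room
```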
When you hear a prediction about a system (business forecast, stock forecast, weather forecast, political change, population forecast) here are three questions you should think about:
- Will the system flows that would produce this output change like the prediction says? For a population forecast, is it likely that the birth rate will decrease and the mortality rate will also decrease?
- If the system flows do change, will that necessarily cause the output to change as predicted, or is there a range of outcomes? Take a weather system – if there is increased rainfall and wind speed, does that necessarily mean a cyclone is coming, or could it be something less destructive?
- What factors are causing the system flows to change? For a stock forecast relying on investor sentiment to change, what would drive an increase or decrease in investor fear or bullishness?
The behaviour of a system will change as the feedback loops governing it change. Often unexpected behaviour in a system can be anticipated by looking at shifts in its feedback loops. Systems with similar structures of feedback loops can be grouped together to learn from each other. The systems of capital assets and world population have their loop structure in common – each has a balancing loop (mortality rate, depreciation) and a reinforcing loop (birth rate, investment).
Delays in a system have the impact of making the level of stock oscillate. A system can have many delays and changing the length of its delays can cause results that are counterintuitive. It is difficult to predict the effect of changing the delay period.
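The oscillation effect is easy to reproduce. Below is a toy balancing loop (all numbers assumed) that corrects towards a target using a delayed reading of the stock: with no delay it settles smoothly, but with a delay the corrections arrive out of phase and the stock overshoots and swings.

```python
def delayed_balancing(target, initial, gain, delay, steps):
    """Balancing loop that acts on a stale reading of the stock."""
    history = [initial] * (delay + 1)
    for _ in range(steps):
        perceived = history[-1 - delay]            # delayed information
        history.append(history[-1] + gain * (target - perceived))
    return history

smooth = delayed_balancing(target=100, initial=0, gain=0.5, delay=0, steps=30)
wobbly = delayed_balancing(target=100, initial=0, gain=0.5, delay=4, steps=30)
# smooth approaches 100 without ever overshooting; wobbly shoots well
# past 100 and then swings back below it
```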
Systems which are constrained by a non-renewable resource display a pattern. These systems have the following properties:
- they have a reinforcing loop in investment (investment grows exponentially). This means that their rate of extraction gets higher over time
- they have a finite resource that they are extracting from (say an oil well with “200 years” worth of oil). The rate of extraction of the finite resource follows an exponential decay curve – it is much easier to get the resource at the start than at the end.
- initially investment is low. The rate of extraction is low. There is still almost 100% of the resource.
- There is a reinforcing loop around investment so that it grows exponentially. This increases the rate of extraction.
- The rate of extraction also faces a declining force, in that it gets harder to extract the resource. Investment has to stay ahead of this declining force to maintain a constant extraction rate.
- If investment grows quickly, the overall rate of return will increase, and the resource will decline quicker.
- If the resource is substituted for a bigger resource (e.g. 400 years of oil vs 100 years of oil) the time difference is much smaller than you would expect. This is because of the exponential curve of investment – it has the chance to grow higher, pushing the overall extraction rate much higher, and as a result the bigger resource is still depleted remarkably quickly.
- There comes a point where overall production drops off because the last bit of oil is too hard to get out of the ground. At this point the system stops very quickly.
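Those bullet points can be combined into a toy model (all parameters invented): investment capital compounds in a reinforcing loop, while the yield per unit of capital falls as the resource runs down. Extraction rises, peaks, then crashes no matter how large capital grows.

```python
def nonrenewable(resource, capital, growth, efficiency, steps):
    """Reinforcing investment loop drawing down a finite resource."""
    res_history, ext_history = [], []
    initial = resource
    for _ in range(steps):
        # yield per unit of capital shrinks with the fraction of resource left
        extraction = min(resource, capital * efficiency * resource / initial)
        resource -= extraction
        capital *= 1 + growth          # profits reinvested: reinforcing loop
        res_history.append(resource)
        ext_history.append(extraction)
    return res_history, ext_history

res, ext = nonrenewable(resource=10_000, capital=1, growth=0.1,
                        efficiency=5, steps=80)
peak = ext.index(max(ext))
# extraction climbs for decades, peaks around mid-run, then collapses as
# the last of the resource becomes too hard to get out of the ground
```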
Systems constrained by a renewable resource are a little different. Things are the same as above, except;
- The renewable resource has a replenishment rate. Sometimes this rate depends on the levels of the stock, like the reproduction rate for fish. Other times it is a constant factor, like with sunlight.
- If there is an equilibrium between the rate of return and the rate of replenishment then the system is sustainable – it can go on perpetually. If the rate of return is greater than the rate of replenishment then the system can oscillate, or if it is much greater then the system is essentially turned into the above scenario.
- If the levels of a resource (like fish in the sea) get too low, other factors can start to affect the replenishment rate. Low numbers of fish can be affected by pollution, storms or predators where high numbers of fish wouldn’t be. This can effectively wipe out the resource.
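The renewable case can be sketched the same way (made-up parameters): the fish stock regrows logistically, and the catch is proportional to harvesting effort. A modest harvest rate settles into a sustainable equilibrium; a rate well above replenishment turns the system into the non-renewable scenario above and the stock collapses.

```python
def fishery(stock, capacity, regen, harvest_rate, steps):
    """Logistic regrowth vs a proportional harvest."""
    history = [stock]
    for _ in range(steps):
        regrowth = regen * stock * (1 - stock / capacity)  # fastest at mid levels
        catch = harvest_rate * stock
        stock = max(0.0, stock + regrowth - catch)
        history.append(stock)
    return history

sustainable = fishery(stock=500, capacity=1000, regen=0.3, harvest_rate=0.1, steps=100)
collapsed = fishery(stock=500, capacity=1000, regen=0.3, harvest_rate=0.4, steps=100)
# sustainable settles near a steady level; collapsed decays towards zero
```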
Chapter 3 – Why systems work so well
Systems have three traits that are self-beneficial: hierarchy, self-organisation and resilience.
Some systems are resilient, meaning that they have the ability to heal themselves. Resilient systems are built out of many feedback loops. This can mean that even if one feedback loop fails, there are others to take its place – the system has redundancy. The system can handle many different states with ease, and if it is damaged it is able to restore itself. Example: trees in Australian habitat (can survive bushfires, can reproduce quickly if needed, high(?) rate of reproduction), or the human body (can survive famine/feast, protection against diseases, able to learn to function without some missing parts).
Static systems do not mean resilient systems (link to fragility and antifragility). Systems are often manipulated to increase productivity or stability, and this often greatly harms their resilience. Examples: cows get injected with hormones to increase milk production at the expense of immune systems and lifespan; European forests are replaced with single species of tree, reducing soil nutrients and making them vulnerable to air pollution.
Some systems can adjust themselves to become more complicated over time. Think trees adapting to become bushfire proof, humans adapting to vegetarian diet, evolutionary systems and strategies. This complex behaviour is very beneficial to the system.
Systems are often comprised of smaller systems making up a whole. You can affect a smaller system in isolation, but the follow-on effects may affect the bigger systems it is a part of. Hierarchies are useful because they mean that the individual components of the system don’t have to keep track of all the system information; they can delegate that to other parts of the hierarchy. Hierarchical systems are also less fragile and more decomposable – you can substitute parts for each other, or take a unit and put it somewhere else. Relationships inside a sub-system are stronger than relationships between sub-systems – like someone talking to their team at work vs them talking to other business units.
When individual units of the system behave with different goals to the overall system, bad things happen. Like researchers at uni not teaching the students who are there to learn.
Chapter 4 – Why systems surprise
There are a few reasons why systems don’t work like we expect.
- Changes need time to take effect. Often we change the flow of a system and we expect the results instantly. But life doesn’t work like that.
- Delays in one part of the system can cause unexpected behaviour in other parts of the system, like oscillations.
- Foresight is critical when there are large system delays. To act only when a problem is obvious is usually too late.
- Delay between being infected with a disease and showing symptoms
- Delay between the earth heating up from global warming and the effects on the environment
- What makes sense to an individual in the system doesn’t always make sense for the system. Examples: overfishing, universities, tourists in Hawaii, corporations causing the GFC.
- The individuals in the system often don’t have the information to change their behaviour effectively. They may have incomplete information – the fisherman doesn’t know what his competitors are doing, or how many fish are left in the sea. The academic is often unaware of the full impact of their poor teaching.
- We are also biased. We discount the future in favour of maximising the present. We don’t measure the effect of risks correctly. We suffer from confirmation bias.
- We can try to mitigate the effects of bounded rationality by putting ourselves in someone else’s shoes. By seeing things from their perspective, we can broaden our minds.
- Systems eventually hit limits. They can be imposed (rate of extraction of fish limited by equipment) or environmental (amount of fish in the sea)
- The key limit stopping a system from growing or progressing further is the most important input to the system. Remove that limit to progress the system further; that will be fine, until a second limit comes up.
- Perpetual growth is impossible. “Ultimately, the choice is not to grow forever but to decide which limits to live within.”
- We see systems in terms of their events. Stock market crashes in the stock system. Landslides in the mountains. Events are spectacular and easy to remember; what is not easy is to look beneath the events and see them as manifestations of a system structure.
- If we look at the pattern of events over time, we are less surprised by them. This is looking at the behaviour of the system and means that we can better understand why events unfold the way they do.
- Be careful of models that emphasise behaviour flows and not stocks; often behaviour flows are dependent on stocks in a feedback loop. Likewise, because of this dependence it will also appear as if behaviour flows are often correlated with each other.
- Behaviour based models are good for predicting short term but not for long term. They rely on the structure of the system not changing, and when it does the models fail. The only way to get great prediction is to dig into the structure of the system.
Linear minds in a non-linear world
- Our brains evolved to deal with things that happened in a linear fashion. If I work twice as hard, I get twice as much output. Non linear things confuse us; if I work twice as hard, I get four times the output, or perhaps half the output. It is hard for us to think in these terms.
- Example: the worms attacking the trees. The worms bred in high numbers when there was a lot of their favourite food available (a type of tree), and that same type of tree was the one humans wanted for timber. So there were heaps of worms and only one type of tree, and the pesticides didn’t work as expected because they killed off a lot of the worms’ natural predators, so the worm population spiralled out of control.
- We abstract away the boundaries of the system. In real life, there are no boundaries and the world is one big system.
- Sometimes it makes sense to abstract away bits of the system, in order to reduce the complexity of our mental models. It’s a necessity; if we don’t our system diagrams get cluttered and super complicated.
- On the other hand, we have to be really careful when we do this. We often can’t abstract away the system so easily. There are things hiding in the clouds that will sting us.
There are patterns of how systems don’t react the way that we expect them to.
Tragedy of the commons
- A common resource exists, like land available for pasture to people with sheep. It makes sense for each individual person to have as many sheep as possible (bounded rationality), but the common ground will suffer and there will be nothing left for anyone in time.
- Three approaches to dealing with this: enforce moral codes (like shaming litterers), privatisation (people are allocated specific amounts of the resource, like a data plan) or legislation (traffic lights, public parks etc)
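The commons dynamic can be sketched as a toy simulation (numbers invented): each herder keeps growing their flock while there is grass left, which is locally rational, but total grazing outruns regrowth and the pasture is destroyed for everyone.

```python
def commons(pasture, regen, capacity, herders, greed, steps):
    """Shared pasture: individually rational growth, collectively ruinous."""
    flocks = [1.0] * herders
    grass_history = [pasture]
    for _ in range(steps):
        grazing = sum(flocks)                       # each animal eats one unit
        regrowth = regen * pasture * (1 - pasture / capacity)
        pasture = max(0.0, pasture + regrowth - grazing)
        if pasture > 0:                             # grass left? add more animals
            flocks = [f * (1 + greed) for f in flocks]
        grass_history.append(pasture)
    return grass_history, flocks

grass, flocks = commons(pasture=1000, regen=0.25, capacity=1000,
                        herders=5, greed=0.1, steps=60)
# the pasture ends up grazed to nothing even though no single herder
# ever made an obviously bad decision
```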
Trend towards low performance
- This occurs when there is more focus on and belief in the negatives than the positives. A stock market crash is one example. Another is a weight loss goal; people are sometimes more disheartened by the goal, focusing on its negative bits, than encouraged by it.
- Sometimes goals get lower over time, and we justify them by saying things like “we’re doing relatively well considering XX” or “we’re not that far behind where we were last year”. The counter to this is to have goal invariance – goals not changing – or to trigger a positive spiral (focus on the positives over negatives and have increasing goals over time).
- When many agents are pulling a stock in different directions, it’s hard to pull the resource the way you want. Example: the war on drugs – the police try to deplete the stock (drugs), but every move they make is countered (sometimes harder) by the suppliers. A better strategy (if counterintuitive) would be to stop pulling on the stock and focus your energy somewhere else. The counterforces on the stock will also stop pulling, and equilibrium remains without much effort.
- If you can direct the motive of all parties in the system towards the same goal, the benefit is greatest. Example: Sweden’s child policy vs Romania vs Hungary, or our pay having a component of value saved inbuilt in it.
Seeking the wrong goal
- When we seek to maximise a metric as a substitute for a goal, we often find that the end result is not what we hoped. For example, I track hours worked, whereas what I should be tracking is results gained. If education effectiveness is measured by money spent per student, then that metric will be maximised, but it’s likely education won’t become more effective. Doctors who have diagnosis targets will incorrectly diagnose people. Gross national product can be maximised for an economy while actually undermining the economy itself.
- The way out: “Specify indicators and goals that reflect the real welfare of the system. Be careful not to confuse effort with result else you will be designing a system that produces effort, not result.”
Gaming the rules
- Where there are rules, there are ways to exploit the rules. Think of a ban on grain imports being circumvented by importing roots instead. Departments do pointless spending to exhaust their budgets at the end of the year, because otherwise they will get less next year. A mate’s old company enforced strict clock-on times, so everyone shared passwords and logged on for each other.
- Avoid this by redesigning the rules, or by explaining their intention, or by getting everyone to align in the same direction.
Escalation
- When two parties retaliate against each other, they often retaliate in a larger and larger fashion. Soon it is very costly to both parties. For example: the cold war, Harry vs Snape, conflicts between rivals, price wars, political parties smearing each other.
- The best policy here is not to escalate, which may mean doing less than the opponent. If you can bear it, this is a good policy. Otherwise, some sort of agreement between the parties can also work; both sides won’t like it, but it is better than the alternative.
Shifting the burden to the intervenor
- This is a reliance on the intervenor at the expense of the actual system. Countries grow to rely on foreign aid, leaving their own economy in shambles. People rely on welfare payments, losing their skills over time. Modern society shifts burden of health from the individual to doctors/medicine.
- Fix this by strengthening the ability of the system to deal with its own burdens.
Success to the successful
- This occurs when an initial success brings the ability to make future success easier, creating a feedback loop so that it gets harder and harder to break the cycle. For example, very rich people often have the lowest taxes, making them even richer, making their taxes even lower. Monopoly follows this pattern. Companies often do as well, without the presence of a regulator. Money gets interest, making more money in a loop. The poorest children often have the worst opportunities.
- The way out; diversification (the poor play a different game to the rich, companies move into a different niche); anti-competitive laws limiting how much benefit the rich get; level the playing field (restart); rewards that do not influence the next round of competition.
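A minimal sketch of the loop (parameters are illustrative): each round, a fixed pool of resources is re-divided in proportion to each player's current share boosted by an advantage factor, so a tiny initial lead compounds into near-total dominance.

```python
def success_to_successful(shares, advantage, steps):
    """Redistribute a fixed pie each round, favouring larger shares."""
    for _ in range(steps):
        weighted = [s * (1 + advantage * s) for s in shares]  # success buys advantage
        total = sum(weighted)
        shares = [w / total for w in weighted]                # renormalise the pie
    return shares

# two players start almost even; the small lead snowballs until the
# early leader holds nearly the whole pie
final = success_to_successful(shares=[0.52, 0.48], advantage=1.0, steps=50)
```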
Chapter 6 – Finding leverage points in Systems
How do you fix a system? From least effective to most effective:
12 – Changing numbers – constants and parameters
- These things are often easy to change but provide little real benefit. The system is still going to produce the same output because the feedback loops and flows are still there.
- Parameters are things like the amount of land set aside for conservation each year, the minimum wage, how much we spend on military or aids research, the allocation of funding, hiring and firing people. Parameters affect the rate of flow in a system.
- Changing the rate of flow often doesn’t change the system much. It’s the equivalent of changing prime ministers and expecting a lot to change, or “arranging deck chairs on the Titanic”.
- The vast majority of our attention goes towards tweaking parameters – perhaps 90 – 99%, but there’s not a lot of leverage in them. Parameters are important, but often just locally, and changing them often does not change the overall behaviour of the system.
- The exception is when changing a number changes the feedback loops as well, or some of the other items on this list. These are high-leverage parameters – things like interest rates or birth rates.
11 – Buffers – The sizes of stabilising stocks relative to their flows.
- Low buffers for stocks can be a big problem in the system. Soil becomes so eroded that it cannot support any new plants. Populations become too low to be self-sustaining. Increasing the levels of these stocks would change system behaviour.
- Stocks that are big relative to their flows are more stable. A lake has small relative flows, a river has large relative flows, and a river overflows or goes dry much more readily than a lake does.
- Staying close to the minimum level of a stock is dangerous and fraught with risk. Increasing the buffer can give system stability but also can come with costs. Think continuous inventory in a shop vs having a backroom with stock.
- It is often hard, expensive, or outright impossible to change the level of a stock. It can have big effects, but that’s why this is low down the list.
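The lake-vs-river point is easy to see numerically (a hypothetical sketch): subject a big stock and a small stock to the exact same noisy net flows, and compare the swing relative to each stock's size.

```python
import random

def noisy_stock(volume, noise, steps, seed=0):
    """A stock buffeted by random net flows that average out to zero."""
    rng = random.Random(seed)
    level = volume
    levels = [level]
    for _ in range(steps):
        level += rng.uniform(-noise, noise)   # same disturbances for any volume
        levels.append(level)
    return levels

lake = noisy_stock(volume=10_000, noise=50, steps=100)   # big buffer
river = noisy_stock(volume=100, noise=50, steps=100)     # small buffer

# identical absolute disturbances, wildly different relative swings:
# the lake barely notices, while the river can run dry or flood
lake_swing = (max(lake) - min(lake)) / lake[0]
river_swing = (max(river) - min(river)) / river[0]
```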
10 – Changing the structure of the system
- Bad system structure can guarantee poor performance in a system. Just look at Sydney Trains and how the system is set up to run everything off a few train lines.
- Rebuilding the system can give vastly superior results. Rebuilding is expensive and complicated, takes a long time and is often infeasible.
9 – Delays – Lengths of time relative to rate of system changes.
- Delays are everywhere. The time between dumping trash in the ocean and the effect it has. Between building desalination plants and getting some water flow. Between ordering food and having it arrive. Between 5G network testing and its rollout across Australia.
- Delays that are too long mean that it takes a long time to react to information, like underfitting. Oscillations or really inefficient systems can result. If the system is fragile, or a system with dangerous boundaries, this could cause the collapse of the system. We might not be able to stop global warming before it is all too late.
- Delays that are too short mean that we overreact and shoot around wildly. This is like overfitting.
- Delays are not easily changed, which is why they are low down on the list. It often makes more difference to slow the system’s rate of change, so that the delays cause less trouble.
- Where they can be changed they can have big effects – but make sure the change is in the right direction. For example it can be argued that increasing the speed of communication (email, text) has had an overall negative impact on the world.
8 – Balancing feedback loops
- don’t remove balancing feedback loops, even if it looks like they’re not needed. An example is the safety system on a nuclear power plant. If we remove it then the range of conditions that the system can survive under is greatly reduced
By strengthening balancing feedback loops we can increase the resilience of the system. Some examples of doing this:
- increasing our body’s natural ability to fight off germs and disease
- whistleblower protection
- reinforcing predators in an ecosystem to keep the pests down sustainably
- not price fixing and enabling the market to provide fair and balanced prices
7 – Reinforcing feedback loops
- Exponential systems get bigger quickly. By increasing the good ones and culling the bad ones we can keep the system in check.
- The flu will infect more and more people (reinforcing loop) unless we decrease its strength (more people take precautions)
- Big firms with data will take over more and more of the market (reinforcing loop) unless we legislate against it
- Soil erodes away to bedrock (reinforcing loop – it erodes quicker the less soil there is) unless we take preventative measures like planting trees and building dams.
Any kind of ‘success to the successful’ loop is a prime candidate to be checked.
6 – Information flows – who does and who doesn’t have information
- Adding information to the system is usually quick and cheap, especially compared to what it would take to rebuild the system. You have to think closely about the form of the feedback though. If you tell everyone how many fish are left, the fishers will be incentivised to get as much as possible as quickly as possible before it runs out.
Adding responsibility to the actions is a good way of adding missing feedback, a form of information flows. Examples:
- politicians who declare war have to go on the front lines
- a town’s water pipe must be downstream of its sewerage pipe if they both use the same water
- people who agree to report with PowerPoint slides twice a week must make the slides themselves
5 – Changing the rules – incentives, punishments and constraints
- The behaviour of a system can completely change with new rules (eg Calvinball). If you are in a position to be able to alter the rules this is a great way to change a system.
Take a uni class and consider how its behaviour would change under different rules:
- suppose that the students got a collective mark instead of individual mark
- the teacher is also graded by the students (and can fail)
- the bottom 15% of the class are removed at every test
- there are no lectures, only workshops
- the final exam is oral and there can be a discussion about any part of the course
Get the beat of the system
- Get diagrams of the system’s behaviour over time, and if there aren’t any, make them. This will help you prove and disprove hypotheses and find out who is speaking truth and who isn’t.
Expose your mental models to the light of day
- Draw up the system, with all its inflows and outflows. Invite criticism of it from others – it gives you skin in the game.
- Your model should be able to change when the system changes, otherwise it is not flexible enough. Likewise you should be able to update your beliefs.
Honour, respect and distribute information
- Information asymmetries are responsible for many of the problems in systems. Spread information around the system and make sure it is unbiased, accurate and you are not withholding anything.
Watch what language you use
- The terms you use are what you focus on. This can be for good (links to Grit and the examples of the coaches instilling particular phrases about working hard) or for bad (if a system always talks about productivity but not resilience, it will be productive but not resilient).
- This definitely applies to your own life. The terms you think to yourself in will be the things that shape and guide your life.
Pay attention to what is important, not just what is quantifiable
- It is easy to only keep track of the things you measure and assume that they are everything that is important within a system. The things you don’t track are assumed not to matter, and the system will optimise itself so that they are not present.
- How do you make a metric for trustworthiness? For fun? For happiness? For quality? Metrics will be gamed, but at least they are present for quantifiable things. It is much easier to track hours worked than quality of work produced, or effort exerted, or things learned, but the latter are much more important than the former.
Make feedback policies for feedback systems
- A system should have the results of its actions clearly visible and fed back in as an input. This means it can change its behaviour over time. Otherwise it will just repeat the same behaviours over and over again.
Listen to the wisdom of the system
- The system often knows what is wrong with it. Your stomach might instinctively point you away from certain foods that are bad for it. Third world people know what they need to do to get out of poverty. The employees within a company know what to do to turn around its fortunes. Neglecting the wisdom internally and bringing in external parties is often a terrible move.
Locate responsibility within the system
- Make sure the decision makers have enough power to make changes to the system. They also have to face the consequences of their actions. Rulers who go to war should be on the front lines. MPs responsible for tougher prison laws should have to live through those prisons. The employees of companies should be judged on their performance. Towns should drink downstream of their waste pipe. Medicare should not cover cases where seat belts are not worn. Tony Abbott should have to go somewhere by boat if he turns away refugees.
- The world is complicated and we like to ignore it and think in straight lines. But don’t do that! Celebrate the complexity!
Expand time horizons
- Consider the actions and effects of the system over the long term – the next 5, 10, 20, 30, 50, 100 years. We are still paying the price of decisions made eons ago – global warming, overpopulation, the centralisation of power – and we, and future generations, will bear the burden of what we do today.
Defy the disciplines
- Systems are not neatly within a single discipline. To get the true nature of the system we need to look across all disciplines, and we need to learn and come with an open mind. We must throw away the desire to be correct and to look correct. Admit when you are wrong and take responsibility.
Don’t erode the goal of goodness
- So many leaders throw away their morals (Trump, many executives who use being ‘busy’ as an excuse for treating others like trash), and the drift to low performance means that we hear about these examples a lot more than we hear about the positive ones.
- Maintain your morals and your principles. But don’t forget to question which morals and principles you have generated and adopted yourself and which ones society has ingrained within you.