Thinking in Systems Part 2

October 03, 2024 · 8 mins read

This post is a continuation of our discussion on systems thinking. If you haven’t read the first part, I highly recommend doing so before diving into this one. In this post, we’ll explore common system traps and how to avoid them, along with methods to create systemic changes. Let’s dive in.

Common System Traps

System traps—like addiction, overproduction, or the tragedy of the commons—are well understood but difficult to avoid. Bounded rationality, non-linearities, delays, and the other factors we discussed when evaluating systems in the previous post make the outcomes of these archetypes seem almost inevitable.

Here are some of the most common system traps:

  1. Policy Resistance - Bounded rationality often produces policies that solve a short-term issue but create larger problems later, because the actors in the system push back against them. A classic example is Romania’s ban on abortion, intended to increase the population. It worked initially, but it led to long-term socio-economic damage: more illegal abortions, higher mortality among pregnant women, and a surge in abandoned children. These cascading effects eventually contributed to the revolution that overturned the government.
  2. Tragedy of the Commons - This occurs when individuals, acting in their own self-interest, deplete a shared resource. It’s a reinforcing dynamic in which short-term gains for individuals lead to long-term losses for everyone. Overfishing, deforestation, and pollution are all examples. Left unchecked, the system collapses because the resource is exhausted (a minimal simulation sketch follows this list).
  3. Drift to Low Performance - When we measure performance against poor recent outcomes, standards erode in a reinforcing downward drift: expectations keep falling, and people forget that better performance was once possible. To avoid this, set absolute performance benchmarks or compare against the best historical outcomes, not just the most recent ones.
  4. Escalation - Escalation results from reinforcing loops that push competing entities to outdo each other. In some cases, like industrial competition, this is beneficial. However, in others—such as arms races—it can be devastating. A softer example is in marketing: brands escalate the intensity of their messaging to stand out, but eventually, consumers become numb to all messaging.
  5. Success to the Successful - When those who succeed gain more resources to continue succeeding, it leads to a reinforcing loop. Monopoly in markets is a classic example: successful companies grow stronger, while smaller competitors struggle to survive. However, this dynamic eventually creates opportunities for new challengers to disrupt the system.
  6. Addiction - Addiction in systems doesn’t only apply to drugs—it can occur with subsidies, fuel, fertilizers, and other resources. When a system becomes overly dependent on an external input, it loses resilience. Over time, the system’s actors become less capable of functioning without the crutch.
  7. Rule Beating - This happens when people follow the letter of a rule but not its spirit. For instance, departments often spend leftover budget wastefully to avoid receiving a reduced budget next year. Rule beating can erode trust, destroy systems, and lead to unintended consequences.
  8. Poorly Defined Goals - When system goals aren’t clearly defined, the system operates on poorly aligned incentives. For instance, if national security is measured by the percentage of GDP spent, the system will focus on spending, rather than actually improving security. Clear, thoughtful goals are crucial to avoid misaligned behavior.
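
To make the feedback structure behind a trap like the tragedy of the commons concrete, here is a minimal simulation sketch in Python. The numbers (ten harvesters, a 5% regrowth rate, a fixed take per harvester) are illustrative assumptions, not figures from the book or this post; the point is simply that once the combined harvest exceeds regeneration, the shared stock collapses.

```python
# Illustrative sketch of the tragedy of the commons.
# Assumed numbers: 10 harvesters, 5% regrowth per step, fixed take per harvester.

def simulate_commons(steps=30, stock=1000.0, harvesters=10,
                     take_per_harvester=12.0, regrowth_rate=0.05):
    """Self-interested harvesting of a shared resource: individual
    short-term gains exhaust the common stock over time."""
    history = []
    for _ in range(steps):
        stock += stock * regrowth_rate                        # the commons regenerates
        harvest = min(stock, harvesters * take_per_harvester)
        stock -= harvest                                      # everyone takes their share
        history.append(round(stock, 1))
    return history

print(simulate_commons())  # the stock declines toward zero
```

If regrowth outpaces the total harvest, the same loop stabilizes instead of collapsing, which is exactly the gap between individual incentives and the shared outcome that this trap describes.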

Creating Systemic Changes

Systemic changes happen when you find and act on the right leverage points. Below are some of the most common leverage points that can influence the behavior of a system.

  1. Growth - Systems thinker Jay Forrester identified growth as one of the key leverage points in any system. Growth brings costs as well as benefits, and understanding both is essential to using it effectively.
  2. Buffers - Systems with large buffers are more stable. A large stock relative to its flows provides resilience. For instance, you hear about river floods but rarely about lake floods because lakes have large buffers. Buffers are hard to change but critical in ensuring system stability, such as maintaining a minimum population for endangered species.
  3. Physical Structure of Stocks and Flows - The structure of a system’s stocks and flows determines its behavior. Poorly designed systems require rebuilding to improve. The layout of physical structures in production, supply chains, or ecosystems becomes difficult to change after the fact, so thoughtful design is crucial from the start.
  4. Delays - The timing of information and actions can make or break a system. Delays between action and feedback create inefficiencies. Central planning in the Soviet Union, for example, struggled with long delays between decisions and their effects, which made the system inefficient. Addressing delays can significantly improve responsiveness and performance.
  5. Balancing and Reinforcing Feedback Loops - Balancing loops keep systems stable. A thermostat is the classic example; in our bodies, sweating when we overheat plays the same role. In societal systems, democratic elections act as a balancing loop, keeping governments in check, and the presence of these loops allows for self-correction. Reinforcing loops, by contrast, drive growth or decline: a flu epidemic spreads because one person infects others, who in turn infect more people. Reinforcing loops can cause exponential growth or collapse, depending on whether the loop is feeding a positive or negative outcome (see the sketch after this list).
  6. Information Flows - Access to information affects system behavior. In a study of Dutch households, those with electric meters placed visibly in their homes consumed less electricity than those whose meters were hidden in basements. Simple changes in the flow of information can alter system outcomes.
  7. Self-Organization - Self-organizing systems are resilient because they adapt without central control. In ecosystems, for example, species co-evolve and balance each other out. Encouraging self-organization in human systems promotes flexibility and innovation.
  8. Goals - The system’s purpose, or goal, is one of the most critical leverage points. Every part of the system works in service of that goal. Clear, well-defined goals ensure that the system functions effectively.
  9. Paradigms - A system’s rules, goals, and behavior are based on its paradigm—the overarching mindset. For example, ancient Egyptians built pyramids because they believed in the afterlife. Today, we build skyscrapers because we believe in maximizing urban space. Paradigms shape what systems do and why.
  10. Transcending Paradigms - The most powerful leverage point is the ability to transcend paradigms. Recognizing that no single worldview is absolute allows us to stay flexible and adopt the most helpful paradigm for a given situation. This agility can lead to greater creativity and systemic change.
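
As a companion to item 5, here is a minimal sketch contrasting a balancing loop with a reinforcing one. The numbers (target temperature, adjustment gain, infection growth rate) are illustrative assumptions; the behavior, not the values, is the point.

```python
# Illustrative contrast of balancing vs. reinforcing feedback loops.
# All numbers are assumed for demonstration only.

def balancing_loop(temp=10.0, target=20.0, gain=0.3, steps=15):
    """Thermostat-style loop: each step closes part of the gap to the goal,
    so the system settles toward the target."""
    trace = []
    for _ in range(steps):
        temp += gain * (target - temp)   # correction proportional to the gap
        trace.append(round(temp, 1))
    return trace

def reinforcing_loop(infected=1.0, growth=0.4, steps=15):
    """Epidemic-style loop: each infected person infects more people,
    so the level compounds until something else limits it."""
    trace = []
    for _ in range(steps):
        infected *= 1 + growth           # growth proportional to the current level
        trace.append(round(infected, 1))
    return trace

print(balancing_loop())    # converges toward the target of 20
print(reinforcing_loop())  # grows exponentially
```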

In summary, systems thinking helps us see the broader dynamics at play and how to influence change effectively. By understanding common system traps and the key leverage points, we can design better systems and anticipate unintended consequences.


I run a startup called Harmonize. We are hiring and if you’re looking for an exciting startup journey, please write to jobs@harmonizehq.com. Apart from this blog, I tweet about startup life and practical wisdom in books.