The Beer Game was invented at the MIT Sloan School of Management in the 1960s and has since been used in academic and corporate settings alike to demonstrate, on a practical level, what it means to think in systems.
You start out as a beer retailer who orders from a wholesaler, who in turn orders from the brewery. There is one boutique brand in particular that you aim to keep at a minimum of 20 cases in stock. Demand for this beer is a steady 4 cases per week. You obviously try to keep costs as low as possible, and the opportunity cost of being out of stock is twice the price of a normal order. An order from the wholesaler takes 2 weeks to arrive, so you keep your orders consistent at 4 cases every week.

In week 4 you suddenly sell 8 cases instead of the fairly consistent 4. Happy with the sale, you don’t question the reason but instead order 8 cases rather than 4, to be delivered in 2 weeks’ time. In week 5, another 8 cases sell. Curious about the sudden increase, a quick investigation reveals that the brand was popularized by a well-known band who used it in one of their music videos. That satisfies your curiosity, but you are now much more alert to the fact that only 12 cases remain in stock. Not panicking yet, you simply add 4 cases to the 8 you already have on order. In week 6, you are told that the order from 2 weeks prior isn’t ready yet. Mildly irritated by the wholesaler’s lack of beer, you order another 4 cases. By week 7, the first additional order shows up, but demand has now dropped your stock to 4 cases.

By week 12 you have back orders amounting to approximately 20 cases of beer, with no sign of delivery or easing of the pain. In the frenzy, you forget that your orders are still running at 16 cases per week. You get the wholesaler on the phone to find out what is wrong; it seems they too are struggling to keep up, as the brewery itself has fallen far behind on orders. By week 20, your back-order costs amount to 5 times your usual ordering costs, and a pile of beer arrives on your doorstep. You hardly have room for it and confront the delivery guy.
He promptly points you to your orders, which make perfect sense. That very same day, the beer rep from the brewery pops in to see how you are. With this heavy issue weighing on your mind, you inform him of the massive backlog and oversupply. His face turns slightly pale as you both quickly realize that the mounting back orders created a false signal of demand, prompting the brewery to brew more. Upon investigation you learn that actual demand has remained steady at 8 cases per week from week 4 to week 20, which leaves you with very little hope of selling this surplus anytime soon.
The ‘bullwhip’ effect has claimed another victim: you. You overreacted to imperfect information and delays in the supply chain. In turn, the wholesaler overreacted, as did the brewery, because of the steadily increasing orders. Not only are you left with a huge pile of beer to sell, but the brewery rep has to inform his MD and production team that the demand behind the massive production run never existed.
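The mechanism behind the story can be sketched in a few lines of code. The following is a simplified illustration, not the actual Beer Game rules: a hypothetical retailer with the 2-week shipping delay and 20-case target from the story reorders to cover this week’s demand plus any shortfall against the target, while ignoring what is already in the pipeline. A single step in customer demand (4 to 8 cases per week) is enough to make the retailer’s own orders swing far wider than demand ever did.

```python
# Minimal bullwhip-effect sketch (illustrative assumptions, not the real game):
# 2-week delivery delay, 20-case stock target, naive reordering policy.
from collections import deque

TARGET_STOCK = 20   # desired cases on hand
LEAD_TIME = 2       # weeks between placing and receiving an order

stock = 20
pipeline = deque([4] * LEAD_TIME)  # orders in transit, earliest arrival first
orders_placed = []

for week in range(1, 13):
    demand = 4 if week < 4 else 8    # demand steps up permanently in week 4
    stock += pipeline.popleft()      # this week's delivery arrives
    stock -= demand                  # sell; negative stock means back orders

    # Naive policy: cover this week's demand plus the gap to the target,
    # forgetting the cases already on order.
    order = max(0, demand + (TARGET_STOCK - stock))
    pipeline.append(order)
    orders_placed.append(order)

print(orders_placed)  # orders oscillate between 0 and 16 while demand never exceeds 8
```

Because the policy never accounts for stock already in transit, each delayed delivery is effectively ordered twice, and the over-correction then swings the other way, producing the oscillation the retailer experiences in the story.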
Your first instinct is to blame the communication channels and the people involved. You realize that better communication and investigation would have helped. Next in line for a beating is the ‘system’. But after pondering it long and hard on your way home over a few nights, you realize that you make up part of the system, and although you don’t have full control, you do have influence. The system, it seems, is not to blame either. Who is to blame then?
Systems thinking as a formal field of study dates back to the 1930s and has its roots in the work of Ludwig von Bertalanffy, an Austrian-born biologist. It is now widely used in practically all strategic learning programmes. In starting to understand the theory behind it, initial logic brings home the first principle: you have to remove yourself and think about the bigger picture. This often means that no action is taken yet, no reactionary measures are put in place and no irrational bias is indulged. It is a very deliberate act of thinking about the parts of the system as well as the interactions between those parts.
The real question for us is: if we are all functioning within a system, how many of our ‘remedies’ at work are simply false, founded not on understanding the system but on merely reacting as things go wrong along the way? How many reports do we file each month whose core is a caseload of superfluous data, because our systems consist of nothing but actions and reactions?
We would do well to try to understand the bigger picture before we plan and implement. If the system allows it.