Symptom:
Heroic Effort Required
Moving from high drama to a smooth flow allowed the margin to grow from 63% to 93%!
The Situation:
Our founder, James Brooks, worked with a team tasked with taking historical price indices from the US government and other sources and using them to forecast future commodity and labor prices for use in SaaS cost modeling software. The turnaround from the publication of the government data to the publication of the forecasts was two weeks, and this process was repeated quarterly.
This release process was viewed as the most difficult challenge in the company. Every night of Release Week, the team worked until at least midnight, and on the night of the release itself until at least 3 AM. Photos of teammates who had fallen asleep at their desks were part of the all-company email that praised this team's heroics every quarter.

Commodity price forecasts drive item price forecasts.
Understanding the Whole:
No one had ever mapped out the whole release process. If you named a step of the release, members of the team could tell you what its prerequisites were, but no one could hold the entire release in mind at once. So during the release the team would realize, at the last minute, that they hadn't yet done a prerequisite step, one that could have been done weeks earlier if anyone had thought to do it then.
James led the team through two process mapping sessions to lay out a Gantt chart of the entire process. In the third session, the team reflected on the process with this new visibility and made improvements:
- A few steps had been needed the first time a release happened, but did not have to be repeated every quarter. These were removed.
- Long critical path chains could be shortened by challenging whether all the listed prerequisites really were strict prerequisites, or whether certain ones could be decoupled (see the sketch after this list).
- Some tasks without prerequisites could be scheduled three weeks before the release, while things were calm.
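To make the critical path point concrete, here is a minimal sketch in Python; the task names and durations are hypothetical stand-ins, not the team's actual release steps. Each task's earliest finish follows from its prerequisites, the maximum over all tasks is the length of the release, and dropping a false prerequisite shortens the chain.

    def earliest_finish(tasks):
        """tasks: {name: (duration_days, [prerequisite names])}.
        Returns each task's earliest finish day; the maximum is the
        critical-path length of the whole release."""
        finish = {}
        def visit(name):
            if name not in finish:
                duration, prereqs = tasks[name]
                start = max((visit(p) for p in prereqs), default=0)
                finish[name] = start + duration
            return finish[name]
        for name in tasks:
            visit(name)
        return finish

    tasks = {
        "pull_indices":         (1, []),
        "map_replacements":     (2, ["pull_indices"]),
        "forecast_commodities": (3, ["map_replacements"]),
        "forecast_items":       (2, ["forecast_commodities"]),
        "write_commentary":     (3, ["forecast_items"]),
        "publish":              (1, ["write_commentary"]),
    }
    print(max(earliest_finish(tasks).values()))  # 12 days end to end

    # Challenging one prerequisite (say commentary only truly needs the
    # commodity forecasts, not the item forecasts) shortens the chain:
    tasks["write_commentary"] = (3, ["forecast_commodities"])
    print(max(earliest_finish(tasks).values()))  # 10 days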
Single Piece Flow:
James noticed that during the week when the team was working until midnight, they often weren't busy all day; instead, most team members were waiting on someone else to finish one step before they could start their next. That previous step might have started half a day behind schedule, and now everything was running progressively later.
The most common cause of the initial delay was the government agency discontinuing an index that was part of the commodity library; when this happened, someone had to find a similar index to use going forward. That search was the first delay, but the entire system was prone to delays because of the batch process it followed. The computation was implemented in spreadsheets designed on the assumption that all raw data would be available at the same time, so no forecasts could be generated until every historical index was available. Even if an item's cost model used only five commodities, and all five were ready, it had to wait until all 4000 commodity forecasts were done. And because no price forecast could be generated, the analysts couldn't start the commentary they had to write; they knew a major amount of work lay ahead of them, but they couldn't begin.
Further complicating the process, there were several different spreadsheets, each depending on the complete data from the previous spreadsheet being available. If an error was found later, the entire batch had to start again.
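As a minimal sketch of that batch constraint (the names and the toy forecast below are hypothetical; the real logic lived in the linked spreadsheets), notice that a single missing history stalls every downstream step:

    def toy_forecast(history):
        # Toy forecast: extend the series by its last observed change.
        return history[-1] + (history[-1] - history[-2])

    def run_batch_release(histories, items):
        """histories: {commodity: price list, or None if discontinued};
        items: {item: [commodity drivers]}."""
        if any(h is None for h in histories.values()):
            # One missing index out of thousands stalls the whole release.
            raise RuntimeError("waiting: not every commodity history is in")
        commodity_fc = {c: toy_forecast(h) for c, h in histories.items()}
        # Only now can item forecasts, and then commentary, begin.
        return {item: sum(commodity_fc[c] for c in drivers) / len(drivers)
                for item, drivers in items.items()}

    histories = {"steel": [100, 102, 104], "copper": [50, 51, 52], "labor": None}
    items = {"widget": ["steel", "copper"]}
    try:
        run_batch_release(histories, items)
    except RuntimeError as err:
        print(err)  # "labor" being discontinued blocks "widget" too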
Initial State: All Data Moves Together (Batch Process)
In Lean Manufacturing, the goal is “single piece flow”: the ability to produce a single widget without batching it with other widgets. James envisioned this for the data: if a particular commodity's history was available, its forecast should be computed; if an item was modeled by five commodity drivers, its forecast should be computed as soon as those five commodity forecasts were available. The first tool that got the team there was a simple Access database replacing the multiple spreadsheets. Logic that had been embedded in the various spreadsheets was brought into the database and set up to run on a single index at a time, allowing progress on any cost model whose components were ready.
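Here is the same toy release reworked for single piece flow, again with hypothetical names standing in for the Access database's per-index logic: each commodity is forecast the moment its history arrives, and each item proceeds as soon as its own drivers are ready.

    def toy_forecast(history):
        return history[-1] + (history[-1] - history[-2])

    def run_flow_release(histories, items):
        # Forecast each commodity the moment its history is available.
        commodity_fc = {c: toy_forecast(h)
                        for c, h in histories.items() if h is not None}
        done, blocked = {}, []
        for item, drivers in items.items():
            if all(c in commodity_fc for c in drivers):
                # All of THIS item's drivers are ready; no need to wait
                # on the thousands of unrelated indices.
                done[item] = sum(commodity_fc[c] for c in drivers) / len(drivers)
            else:
                blocked.append(item)  # expedite once a replacement is chosen
        return done, blocked

    histories = {"steel": [100, 102, 104], "copper": [50, 51, 52], "labor": None}
    items = {"widget": ["steel", "copper"], "service": ["labor"]}
    done, blocked = run_flow_release(histories, items)
    print(done)     # {'widget': 79.5}: widget proceeds on schedule
    print(blocked)  # ['service']: only the labor-dependent item waits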
Improved State: Data Moves When Ready (Single Piece Flow)
The government continued to discontinue indices every quarter, but this no longer caused a noticeable delay: if five out of 4000 commodity indices were discontinued, everything that wasn't dependent on those five could progress on schedule. Out of the 200 reports that had to be written, maybe one would be delayed, and once the replacement indices were decided upon, it was easy to expedite that single report.
The Outcome:
Within two releases, the overtime was eliminated. No more working until midnight. Importantly, the economists who used to worry only about getting data (any data!) out of the system now had enough time to do the job they were hired for: thinking about how the price forecasts should be influenced by the economic factors at play in the world.
James turned the release management over to other team members on a rotating basis. Soon the simplified release allowed the cost modeling software to transition from quarterly to monthly data releases, an offering that brought in increased revenue. The simplified process also allowed the offering to expand to a new industry vertical with very little extra human effort.
The margin for this product rose from 63% to 93%.
Importantly, the team learned that the working process handed down to them was not set in stone; they could find better ways to do their work, and they continued the improvements after James had moved on.
Take-Aways:
- Accepting that heroics are needed just to get the job done is a sure sign that processes are broken.
- The human mind cannot hold even a moderately complex process in the needed detail. Processes must be explicitly mapped to be understood and improved.
- Best practices established to keep physical products flowing consistently through a factory can be applied just as well to data flowing through a transformation process.