As a consultant, I have the privilege of working with leading organizations in a variety of industries around the world. Despite the wide diversity in geography, size, and industry of our client base, I can confirm that there is one thing that every single organization has in common – each is unique. In fact, at some point in the first 15 minutes of every sales call I have made, the prospective client gave me some form of warning along the lines of, “You have to understand, things around here are different than in other organizations.” And of course they are right – each organization is distinctive and requires tailored solutions.
So, if leaders recognize that their organizations have unique strengths and challenges, why do so many of them carry out major initiatives in ways that emphasize implementing generic “best practices” and “solutions” that worked for other organizations?
The reason cannot be that this approach works. Studies show most large-scale organizational programs fail to achieve their objectives. No, the reason is that it feels logical and comfortable. When tackling a significant opportunity, it certainly makes sense to research the issue and see what has worked in similar situations. The problem is that many leaders evaluate what worked for other organizations and then assume these ideas will be effective in their own organizations. And in most cases, success is measured by the completed implementation of ideas, not achievement of the outcomes the ideas were meant to deliver.
Here is an example: A technology organization was looking to grow sales of its newer products. Best practices gathered by the marketing group suggested that product “placemats” – one-page laminated sheets that summarize key information related to the product – would be needed to educate customers regarding each product’s purpose, functionality and features, costs, and value. Months of effort were put into creating a placemat for each new product. They were attractive documents with accurate information, but unfortunately they did not bring about any great top-line sales growth.
Let me suggest an alternative approach to driving major initiatives. Start with the undeniable truth that your organization will have to figure out what works for your unique circumstances. Then, embrace the notion that planned experimentation must be the method that drives success. In other words, accept that the only way to identify what will work for you is to try out various ideas and carefully monitor the results – and based on the results, expand what works, modify ideas that do not move the needle, and continually try out new ideas until the required outcomes are achieved. It is only through this method that you will discover what works for your organization.
To be sure, disciplined testing of ideas can be difficult and time consuming in a large organization with many divisions, customers, products, processes, and so on. Therefore, it is important to use three techniques to make rigorous experimentation work in complex environments: narrow scope, ambitious results goals, and short timeframes.
Here is an illustrative example related to sales growth: rather than attempt to grow revenues through all products in all geographies, select one product line in two cities. Then set a measurable, challenging goal aimed directly at the desired result – revenue, sales volume, or market share might be used in this case. Lastly, give the team 100 days or fewer to achieve their goal.
The results-based goal forces the team to go beyond simple activities like creating marketing material, calling on potential customers, or adding product features they believe customers will value. Instead, they are accountable for results and are empowered to “do what is necessary” and determine what it takes. The narrow scope condenses their landscape and allows them to try out lots of innovations quickly to see what really works. And, the short timeframe keeps the pressure on and makes sure the initiative stays top of mind for them.
Several medical centers in New York City launched a series of these Rapid Results projects with remarkable outcomes:
- Reduced average recovery time from ambulatory surgery by 50 minutes, or about 27%.
- Reduced Operating Room (OR) turnover time by 30%.
- Discharged 80% of elective post-caths within 6 hours, instead of 8 hours or longer.
- Achieved a turnaround time of 1 hour or less for 65% of ICU patients.
In each of these cases, leaders commissioned small, cross-functional teams to deliver the actual outcomes required. They recognized that cookie-cutter solutions from other hospitals would not necessarily work for them. Now, the teams did put known best practices, successes from outside case studies, and expertise from consultants into the catalogue of ideas to try out. But each team was on the hook for making sure it achieved the required gains. So if these ideas did not deliver, the team either modified them and tried again or discarded them and tried other innovations. To be clear, these projects stimulate experimentation and innovation, but they are not scientific experiments set up simply to confirm or refute a hypothesis of what might work. Instead, teams are committed to achieving the necessary outcomes and figuring out in real circumstances and real time what it takes through fast cycles of trying, measuring, learning, and trying again, which unfold over a period of roughly three months. There are no “failed experiments” or “unsuccessful pilots” of predetermined solutions.
For instance, the team charged with reducing OR turnover time did research on the issue and found numerous best practices that other hospitals had used to make improvements, such as implementing a consistent paperwork flow through the pre-surgical evaluation area. The team began implementing some of these best practices but, because it was concerned with actual results rather than with implementing ideas, saw that improvements were not forthcoming. It turned out that in this hospital, anesthesiologists, OR nurses, and surgeons had different understandings of surgery start and turnover times. The team abandoned its best-practice implementation plans and quickly put together a workshop during which these stakeholders came to consensus on key definitions, brainstormed ways to work differently in order to bring turnover time down, and selected the most promising ideas to try out immediately. Over the next few weeks, these ideas were put into place and turnover times were carefully measured and reviewed. Some ideas were retained; others were adjusted or discarded. By the end of the project, the team had achieved its turnover time reduction goal and developed a new model for OR operation. What made the difference was the simple mindset shift from implementing “proven” best practices to discovering what actually works for this hospital.
Rapid Results projects have been used by organizations all over the world to make progress on their most important initiatives, including organic growth, performance improvement, cost reduction, post-merger integration, and strategy execution. They have a nearly 100% success rate in delivering outcomes – because they forgo implementation of “best practices” and make achievement of results in the organization’s own context the success criterion right from the start.
Many thought leaders are beginning to recognize the drawbacks to implementing best practices developed in other organizations. See the excerpt below from “One surgical checklist does not fit all, experts say” in Modern Healthcare.
[Dr. Peter J.] Pronovost said that in working on quality improvement with hospitals in Michigan, he discouraged hospitals from using the checklist developed by Johns Hopkins. Instead, he encouraged staff to customize the list to ensure that it addressed the specific needs of their system. For example, each hospital team should make sure that the roles outlined in the checklist are consistent with their current staff structure and that the checklist does not conflict with other hospital procedures. “Now every one of them thinks that their checklist is the best,” he said. “And it is, for their culture.”