One of the most frequent questions any skilled Operational Excellence/continuous improvement practitioner fields is the perennial one of <span style="color:#4a9f6e;">**why these programs consistently fail to take root**</span> after an almost circus-like initial round of excitement. Despite the obvious importance of this discussion, it is far too common to hear experienced professionals attribute every single failure to "leadership" without first reviewing the circumstantial evidence that points to actionable causes. <span style="color:#4a9f6e;">This is how we lose credibility as a profession.</span>

It is therefore worth considering several common root causes in depth, examining the mechanisms that drive them as well as countermeasures to overcome these barriers. This blog series will consider a range of root causes spanning strategic acumen, execution and sustainment.

---

## Part I: Core Strategic Acumen

### <span style="color:#e8c547;">**Failure Mode**: *No Strategic Doctrine*</span>

Strategy is fundamentally the <span style="color:#4a9f6e;">*allocation of resources towards a defined policy objective*</span>. While this sounds like a simple concept to articulate, the harder work of defining a doctrine (a framework in which policies are evaluated, prioritized and subsequently managed in the first place) is rarely treated with the rigor and care it deserves. It is one thing to have a coherent vision.
It is quite another to employ analytical techniques over prior <span style="color:#4a9f6e;">successes we may not wish to dig too deeply into, and prior failures we may wish to simply move on from.</span> Failing to reflect as an organization on core competencies (<span style="color:#4a9f6e;">what we do well</span>), weaknesses (<span style="color:#4a9f6e;">what we don't do so well</span>) and the <span style="color:#4a9f6e;">means by which we translate weaknesses into competencies</span> almost universally results in "lossy" strategies that cannot possibly account for the reality experienced on the ground. It is likewise common for leadership teams consumed by pressure to "outsource" this phase of collective cognition to third parties whose incentives *rarely* align with those of their customers.

<span style="color:#9f4a4a;font-family:Consolas;">Reflecting on prior success does **NOT** mean celebrating. Likewise, reflecting on failure does not mean self-immolation. Any cultural tendency towards over-personalization of wins and losses must be addressed without ceremony.</span>

<span style="color:#9f4a4a;font-family:Consolas;">Failing to take a step back and consider the ideas, beliefs and egos that must be let go also presents a significant risk.</span>

While it is absolutely the case that *Respect for People* is a pillar of every contemporary improvement system, the notion that we must accommodate every idea envisioned by influential leaders is <span style="color:#4a9f6e;">actively doing harm to our ability to introduce stable and reliable frameworks into strategic planning.</span> We operate in dynamic and competitive environments that demand reliable judgement calls on behalf of the people we employ. There is <span style="color:#4a9f6e;">**NO**</span> reason to retain unsound ideas, assumptions and influences that detract from the core purpose of establishing these systems.
### <span style="color:#e8c547;">Countermeasure:</span>

It is vitally important to define, communicate, codify, integrate and update an organization's decision framework according to an equally reliable and repeatable process. It is likewise important to stress that this process <span style="color:#4a9f6e;">must be executed routinely</span> so that stale beliefs cannot influence future actions.

<span style="color:#e8c547;">•</span> Individual beliefs must be affinitized and documented against the organization's fundamental vision. For this, a <span style="color:#4a9f6e;">SWOT analysis</span> provides the necessary surface area and logic. Whether or not the beliefs in the SWOT are entirely accurate does not matter, and it isn't worth debating in the moment. The *point* is to put our collective cards on the table, not to descend into debate or semantic pontification. Nor is the point to swing the room toward any particular belief; expressions of dominance often undermine the argument or the individual advancing it. Nobody appreciates a loudmouth with nothing to say.

<span style="color:#e8c547;">•</span> The organization must define and weight the critical variables through which decisions are evaluated. This **MUST** be done in <span style="color:#4a9f6e;">rank fashion</span> to force a prioritization order over the set of variables (e.g., on-time delivery, labor effectiveness to budget). These variables **MUST** either exist within the current measurement system, or a plan to reliably capture performance must be defined and managed.

<span style="color:#e8c547;">•</span> Leaders must be honest about (and document) where tradeoffs can, should and must be accepted. If progress cannot be made in one area, or if resources are too heavily constrained to proceed, then clear contingency plans must be present and accessible for the individuals tasked with executing the strategy outputs.
<span style="color:#e8c547;">•</span> The tactics through which the strategy will be carried out (i.e., the specific operations employed to achieve the outcome) must similarly be documented and fully understood in terms of resource commitment, risk and confidence in execution. This goes far beyond declaring a five-day agenda for Kaizen events or internalizing a belief that the engineering team will figure out line-side tolerance testing. If the tactical methods are poorly understood, then <span style="color:#4a9f6e;">it won't matter how accurately a strategy reflects an organization's true goals</span>: the translation layer from intent to execution must exist before strategy can be deployed with any degree of confidence. While it is commonly held that such "meta standards" offer little in the way of impact, organizations familiar with costly engineering mistakes, misallocated capital and consultancy fever dreams understand the visceral embarrassment of missing first deliveries because no one thought through what the equipment installation would look like.

<span style="color:#e8c547;">•</span> The specific methods employed to analyze data, derive conclusions and communicate judgements must also be defined and understood. Almost every organization now includes at least one "Excel superhero" or BI whiz tasked with performing complex analyses and delivering "insights" up the chain of command. These individuals often possess a delightful combination of intellect and creativity that all but ensures no two analyses look alike until IT gets around to writing a new report.

#### <span style="color:#4a9f6e;">Key Insight:</span>

It does not take an elegant or over-engineered system to produce a reliable framework. Managing for Daily Improvement (MDI) offers a near-zero-cost alternative to highly-integrated operational analytics that requires nothing of leadership beyond a presence in the *Gemba*.
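For teams that do want a lightweight digital companion to the forced-prioritization countermeasure above, the weighted-rank evaluation can be sketched in a few lines. The variable names, rank weights and scores below are illustrative assumptions, not a prescription; the point is only that ranking forces an explicit, repeatable priority order:

```python
# Minimal sketch of a forced-prioritization scorecard. Variable names,
# weights and scores are illustrative assumptions; rank-reciprocal
# weights simply make the forced priority order explicit.

# Decision variables in forced rank order (first = highest priority).
variables = ["on_time_delivery", "labor_effectiveness_to_budget", "quality_escapes"]
weights = {v: 1 / (rank + 1) for rank, v in enumerate(variables)}

# Hypothetical candidate initiatives scored 1-5 against each variable.
initiatives = {
    "cell_redesign":   {"on_time_delivery": 4, "labor_effectiveness_to_budget": 2, "quality_escapes": 3},
    "setup_reduction": {"on_time_delivery": 5, "labor_effectiveness_to_budget": 3, "quality_escapes": 1},
}

def weighted_score(scores: dict) -> float:
    """Collapse per-variable scores into one comparable number."""
    return sum(weights[v] * scores[v] for v in variables)

ranked = sorted(initiatives, key=lambda k: weighted_score(initiatives[k]), reverse=True)
for name in ranked:
    print(f"{name}: {weighted_score(initiatives[name]):.2f}")
```

The same table fits on a whiteboard just as easily; nothing about the method requires software.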
<span style="color:#4a9f6e;">Paper charts, whiteboards, handwritten standards and direct human-to-human interaction are in fact superior in *most* cases to technology-dependent analytical frameworks that anesthetize pain over months of averages.</span>

---

### <span style="color:#e8c547;">**Failure Mode**: *Unsustainable Intent*</span>

Corporate leaders are tasked with <span style="color:#4a9f6e;">designing, executing and overseeing company policy</span>, as well as the translation of that policy into executable operations throughout the organization. In practice, this should reflect the <span style="color:#4a9f6e;">intentional allocation of resources toward well-defined and measurable objectives within known capabilities and capacity.</span> In reality, strategic plans more often reflect a mosaic of opinions folded into a singularity of conceptually agreeable ideas that rarely survive first contact with a politically charged environment.

If the absence of doctrine results in a loss of judgement, the absence of sustainable intent results in the introduction of unsound assumptions into an otherwise sound framework. If assumptions are never checked going in, or if we defer too confidently to data whose source process we can't explain, then the application of resources may follow a reliable arc, but *where* we deploy those resources may actually undermine our intent.

Organizations are simply groups of people, assets and ideas with aligned roles and goals who execute processes to deliver the products and services customers demand. Those resources are *constrained* by factors such as time, innate capacity and a shared need for survival that transcend the "breakthrough" fantasies in which someone else does the work to realize the benefit. It is incredibly difficult work to fully understand the current-state capabilities within an organization. <span style="color:#4a9f6e;">It is even more costly for the system as a whole to forego this analysis completely.
</span>

### <span style="color:#e8c547;">Countermeasure:</span>

Before any serious discussion of strategic planning can be had, <span style="color:#4a9f6e;">the current-state constraints, capabilities and known risks must be quantitatively understood</span> and documented.

<span style="color:#9f4a4a;font-family:Consolas;">It must be pointed out that friction encountered at this stage often indicates the absence or insufficiency of current-state measurement systems. This gap must be resolved prior to proceeding.</span>

#### <span style="color:#4a9f6e;">Phase I: Doctrinal Refresh</span>

The organization must first review and refresh the core doctrine to be used. As described above, this means defining priorities in terms of unambiguous measurables, along with the tradeoffs to accept when the environment requires priorities to shift.

#### <span style="color:#4a9f6e;">Phase II: Organizational MSA</span>

An organizational Measurement Systems Analysis (MSA) must be performed. This does **NOT** mean performing several thousand Gage R&R studies over sensors and devices. It means assessing the reliability of the inputs to produce an accurate reflection of reality over the prioritized measurables. Often, simple exploratory data analysis (EDA) is sufficient to highlight unrealistic values, distribution bias and collinearity that otherwise degrade measurement system reliability. Bias can be corrected at the source, unrealistic values can be removed and collinearity can be demystified.

It should be stressed once more that the precise methods employed to analyze and judge based on available data **MUST** be standard and known. Instability here all but ensures assumptions are made that cannot stand up to reality.

#### <span style="color:#4a9f6e;">Phase III: Test Beliefs Against Reality</span>

In his 1994 defense of OJ Simpson, the legal scholar Alan Dershowitz asserted that "only one in a thousand abusive husbands eventually murder their wives." Had Mr.
Dershowitz updated that *prior belief* with the relevant fact that in this case the wife had *already* been murdered, the likelihood of innocence would have appeared decidedly less favorable to Mr. Simpson and his ill-fitting gloves: a swing from 99.9% innocence to roughly 97% confidence in his guilt.

Once beliefs are established, they must be removed from their hosts, exposed to the elements and selected for their *resilience* to the conditions they must stand up to. That is, individual claims from SWOT analyses, from affinity exercises and from passing comments taken up by the collective mind **MUST** be subjected to disproof with cold and calculating efficiency. Assumptions about behavioral patterns, the tradeoffs individuals are willing to make and unverifiable fantasies **MUST** be rejected without ceremony or apology. Leaders **MUST NEVER** be left so unconstrained that they are free to inject conspiratorial sludge, unchallenged, into collective judgement.

Assumptions regarding performance, capacity and current state must be filtered through a repeatable and reliable measurement system before their introduction as fact into a strategic plan. That system must also carry sufficient rigor to ensure "lossy" claims can be verified. Any claim that cannot be verified must either be translated into a measurement system or rejected.

#### <span style="color:#4a9f6e;">Key Insight:</span>

Strategic intent most often fails at the point beliefs are formed, and again when those beliefs are accepted and folded into the methods, risk profiles and decisions that define how an organization operates. Stabilizing inputs, exposing beliefs to reality and maintaining a dynamic, written doctrine that guides the strategic planning process ensures that hypotheses about the future state carry minimal error.
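The arithmetic behind the Simpson swing in Phase III can be made explicit with a two-line Bayes update. The quoted 1-in-1,000 figure supplies the prior; the non-partner murder base rate below is an illustrative assumption (published reconstructions of this calculation use varying rates), chosen only to show the mechanics of conditioning on the fact of the murder:

```python
# Sketch of the belief update in the Dershowitz example.
# p_murder_by_abuser comes from the quote; p_murder_by_other is an
# assumed base rate for an abused woman being murdered by anyone else.
p_murder_by_abuser = 1 / 1000
p_murder_by_other = 1 / 30000  # illustrative assumption, not sourced data

# Condition on the known fact: the wife *was* murdered. Of the
# scenarios in which that happens, what share involve the abuser?
p_guilt_given_murder = p_murder_by_abuser / (p_murder_by_abuser + p_murder_by_other)

print(f"{p_guilt_given_murder:.1%}")  # → 96.8%, roughly the ~97% cited above
```

The lesson is not the exact number: any prior that ignores what has already happened will quietly anchor the plan, exactly as unexamined current-state beliefs anchor a strategy.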
---

### <span style="color:#e8c547;">**TL;DR**</span>

Strategy is the <span style="color:#4a9f6e;">allocation of resources towards a well-defined policy objective</span>. To have a shot at producing a sound strategy, three things **MUST** be present:

1. A doctrine, or decision framework
2. A reliable measurement system that reflects the true current state in near real time
3. A standard means by which to evaluate beliefs against reality

The specific method used to formulate the strategy is rarely a failure mode, unless the method itself introduces confusion. While this is not intended as an exhaustive list of *every* failure mode to have ever crippled a transformation program, these two strategic failures carry the potential for catastrophic results if ambiguity and unsustainability are not addressed in execution. Indeed, applying personal confidence to professional strategy without the evidence and exposure to back it up risks silently introducing a new belief that will, at the very least, follow the plan throughout its lifecycle.