Lean Execution

OEE – A Race Against Time

[Image: Benjamin Franklin, via Wikipedia]

Background

If “Time is Money”, is it reasonable for us to consider that “Wasting Time is Wasting Money?”

Whether we are discussing customer service, health care, government services, or manufacturing – waste is often identified as one of the top concerns that must be addressed and ultimately eliminated.  In most organizations, the next step is an attempt to define waste.  Although they are not the focus of our discussion, the commonly known “wastes” from a lean perspective are:

  • Over-Production
  • Inventory
  • Correction (Non-Conformance  – Quality)
  • Transportation
  • Motion
  • Over Processing
  • Waiting

Underutilized talent is another form of waste often added to this list and occurs when people and resources are not utilized to their full potential.

Where did the Time go?

As a lean practitioner, I acknowledge these wastes exist but there must have been an underlying element of concern or thinking process that caused this list to be created.  In other words, lists don’t just appear, they are created for a reason.

As I pondered this list, I realized that the greatest single common denominator of each waste is TIME.  Again, from a lean perspective, TIME is the basis for measuring throughput.  As such, our Lean Journey is ultimately founded on our ability to reduce or eliminate the TIME required to produce a part or deliver a service.

As a non-renewable resource, we must learn to value time and use it effectively.  Again, as we review the list above, we can see that lost time is an inherent trait of each waste.  We can also see how this list extends beyond the realm of manufacturing.  TIME is a constant constraint that is indeed a challenge to manage even in our personal lives.

To efficiently do what is not required is NOT effective.

I consider Overall Equipment Effectiveness (OEE) to be a key metric in manufacturing.  While it is possible to consider the three factors Availability, Performance, and Quality separately, in the context of this discussion, we can see that any impediment to throughput can be directly correlated to lost time.
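
As a rough illustration of how these three factors roll up into a single time-based metric, here is a minimal Python sketch; the planned time, cycle time, and part counts below are hypothetical values invented for the example, not data from any real process.

```python
# Hypothetical shift data (illustrative values only)
planned_time_min = 450        # planned production time for the shift
downtime_min = 45             # recorded equipment downtime
ideal_cycle_time_min = 0.5    # ideal minutes per part
total_parts = 700             # total parts produced
good_parts = 665              # parts that passed inspection

run_time_min = planned_time_min - downtime_min

availability = run_time_min / planned_time_min
performance = (ideal_cycle_time_min * total_parts) / run_time_min
quality = good_parts / total_parts
oee = availability * performance * quality

print(f"Availability: {availability:.1%}")   # downtime is lost time
print(f"Performance:  {performance:.1%}")    # slow cycles are lost time
print(f"Quality:      {quality:.1%}")        # scrap and rework are lost time
print(f"OEE:          {oee:.1%}")
```

Note that each factor ultimately expresses lost time: time the equipment was down, time lost to slow or inconsistent cycles, and time spent producing parts that could not be shipped.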

To extend the concept in a more general sense, our objective is to provide our customers with a quality product or service in the shortest amount of time.  Waste is any impediment or roadblock that prevents us from achieving this objective.

Indirect Waste and Effectiveness

Indirect Waste (time) is best explained by way of example.  How many times have we heard, “I don’t understand this – we just finished training everybody!”  It is common for companies to provide training to teach new skills.  Similarly, when a problem occurs, one of the – too often used – corrective actions is “re-trained employee(s).”  Unfortunately, the results are not always what we expect.

Many companies seem content to use class test scores and instructor feedback to determine whether the training was effective, while little consideration is given to developing skill competency.  If an employee cannot execute or demonstrate the skill successfully or competently, how effective was the training?  Recognizing that a learning curve may exist, some companies are inclined to overlook a lack of competency, but only for a limited time.

The company must discern between employee capability and the quality of training.  In other words, the company must ensure that the training provided will adequately prepare the employee to successfully perform the required tasks.  If it does not, either the training and / or method of delivery is not effective, or the employee may simply lack the capability.  Let me qualify this last statement by saying that “playing the piano is not for everyone.”

Training effectiveness can only be measured by an employee’s demonstrated ability to apply their new knowledge or skill.

Time – Friend or Foe?

Lean tools are without doubt very useful and play a significant role in helping to carve out a lean strategy.  However, I am concerned that the tendency of many lean initiatives is to follow a prescribed strategy or formula.  This approach essentially creates a new box that in time will not be much different from the one we are trying to break out of.

An extension of this is the classification of wastes.  As identified here, the true waste is time.  Efforts to reduce or eliminate the time element from any process will undoubtedly result in cost savings.  However, the immediate focus of lean is not on cost reduction alone.

Global sourcing has assured that “TIME” can be purchased at reduced rates from low-cost labour countries.  While this practice may result in a “cost savings”, it does nothing to promote the cause of lean – we have simply outsourced our inefficiencies at reduced prices.  Numerous Canadian and US facilities continue to be closed as workers witness the exodus of jobs to foreign countries due to lower labour and operating costs, as happened when Electrolux closed its facility in Webster City, Iowa.

I don’t know the origins of multi-tasking, but the very mention of it suggests that someone had “time on their hands.”  So remember, when you’re put on hold, driving to work, stuck in traffic, stopped at a light, sorting parts, waiting in line, sitting in the doctor’s office, watching commercials, or just looking for lost or misplaced items – your time is running out.

Is time a friend or foe?  I suggest the answer is both, as long as we spend it wisely (spelled effectively).  Be effective, be Lean, and stop wasting time.

Let the race begin:  Ready … Set … Go …

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics

Lean – OEE and Pareto’s Law

Typical Application to Analyze Quality Defects

The Premise:  Pareto’s Law

The late Joseph Juran introduced the world to Pareto’s Law, aptly named after Italian economist Vilfredo Pareto.  Many business and quality professionals are familiar with Pareto’s Law and often refer to it as the 80 / 20 rule.  In simple terms, Pareto’s Law is based on the premise that 80% of the effects stem from 20% of the causes.

As an example, consider that Pareto’s Law is often used by quality staff to determine the cause(s) responsible for the highest number of defects as depicted in the chart to the right.  From this analysis, teams will focus their efforts on the top 1 or 2 causes and resolve to eliminate or substantially reduce their effect.

In this case, the chart suggests that the highest number of defects is due to shrink, followed by porosity.  At this point a problem solving strategy is established using one of the many available tools (8 Discipline Report, 5 Why, A3) to resolve the root cause and eliminate the defect.  Over time and with continued focus, the result is a robust process that yields 100% quality, defect-free products.
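
For readers who want to reproduce this type of analysis, here is a minimal Python sketch of a Pareto calculation; the defect categories and counts are hypothetical values chosen only to mirror the example, not actual inspection data.

```python
# Hypothetical defect counts by cause (illustrative values only)
defects = {
    "Shrink": 120,
    "Porosity": 65,
    "Cold shut": 30,
    "Flash": 18,
    "Inclusion": 9,
}

total = sum(defects.values())
ranked = sorted(defects.items(), key=lambda item: item[1], reverse=True)

cumulative = 0
print(f"{'Cause':<12}{'Count':>8}{'% of total':>12}{'Cumulative %':>14}")
for cause, count in ranked:
    cumulative += count
    print(f"{cause:<12}{count:>8}{count / total:>12.1%}{cumulative / total:>14.1%}")
```

Ranking the causes by count and tracking the cumulative percentage is all a basic Pareto chart does; the first one or two rows typically account for the bulk of the defects.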

In practice, this approach seems logical and has proven to be effective in many instances.  However, we need to be cognizant of a potential side effect that may be one of the reasons why new initiatives quickly wane to become “the program of the day.”

The Side Effects:  Burnout and Apathy

Winning the team’s confidence is often one of the greatest challenges for any improvement initiative.  A common strategy is to select a project where success can be reasonably assured.  If we apply Pareto’s Law to project selection, we are inclined to select a project that is either relatively easy to solve, offers the greatest savings, or both.

In keeping with the example presented in the graphic, resolving the “shrink” concern presents the greatest opportunity.  However, we can readily see that, once resolved, the next project presents a significantly lower return and the same is true for each subsequent project thereafter.

Clearly, as each problem is resolved, the return is diminished.  To compound matters, problems with lower rates of recurrence are often more difficult to solve and the monies required to resolve them cannot be justified due to the reduced return on investment.  In other words, we approach a point where the solution is as elusive as “the needle in a haystack” and once found, it simply isn’t cost effective to resolve it.

The desire to resolve the concern is significantly reduced with each subsequent challenge as the return on investment in time and money diminishes while the team continues to expend more energy.  Over extended periods of time, the continued pursuit of excellence can lead to apathy and even burnout.  As alluded to earlier, adding to the frustration is the inability to achieve the same level of success offered by the preceding opportunities.

The Solution

One of the problems with the approach as presented here is the focus on resolving the concern or defect that is associated with the greatest cost savings.  To be clear, Pareto Analysis is a very effective tool to identify improvement opportunities and is not restricted to just quality defects.  A similar Pareto chart could be created just as easily to analyze process down time.

Perhaps the real problem is that we’re sending the wrong message:  Improvements must have an immediate and significant financial return.  In other words, team successes are typically recognized and rewarded in terms of absolute cost savings.  Not all improvements will have a measurable or immediate return on investment.  If a condition can be improved or a problem can be circumvented, employees should be empowered to take the required actions regardless of where they fall on the Pareto chart.

To assure sustainability, we need to focus on the improvement opportunities that are before us with a different definition of success, one with less emphasis on cost savings alone.  Is it possible to make improvements for improvement’s sake?  We need to take care of the “low hanging fruit”, and that likely doesn’t require a Pareto analysis to find it.

Finally, not all improvement strategies require a formal infrastructure to assure improvements occur.  In this regard, the ability to solve problems at the employee level is one of the defining characteristics that distinguishes companies like Toyota from others that are trying to be like them.  Toyota and the principles of lean are not reliant on tools alone to identify opportunities to improve.

As suggested earlier, Pareto Analysis is useful to resolve availability, performance, and quality concerns that will most certainly improve Overall Equipment Effectiveness (OEE) and your bottom line.

Until Next Time – STAY lean!

Vergence Analytics

Scorecards and Dashboards

[Image: interior of the 2008 Cadillac CTS, via Wikipedia]

I recently published Urgent -> The Cost of Things Gone Wrong, where I expressed concern about dashboards that attempt to do too much.  When they do, they become more of a distraction than a tool that serves the intended purpose of helping you manage your business or processes.  To be fair, there are at least two (2) levels of data management that are perhaps best differentiated by where and how they are used:  Scorecards and Dashboards.

I prefer to think of Dashboards as working with Dynamic Data – data that changes in real time and influences our behaviors, much as the dashboard in our cars communicates with us as we are driving.  The fuel gauge, odometer, two trip meters, tachometer, speedometer, digital fuel consumption (L/100 km), and km remaining are just a few examples of the instrumentation available to me in my Mazda 3.

While I appreciate the extra instrumentation, the two that matter first and foremost are the speedometer and the tachometer (since I have a 5 speed manual transmission).  The other bells and whistles do serve a purpose but they don’t necessarily cause me to change my driving behavior.  Of note here is that all of the gauges are dynamic – reporting data in real time – while I’m driving.

A Scorecard, on the other hand, is a periodic view of summary data and, from our example, may include Average Fuel Consumption, Average Speed, Maximum Speed, Average Trip, Maximum Trip, Total Miles Traveled, and so on.  The scorecard may also include driving record and vehicle performance data such as Parking Tickets, Speeding Tickets, Oil Changes, Flat Tires, and Emergency and Preventive Maintenance.
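
To make the distinction concrete in data terms, here is a minimal Python sketch; the trip readings are invented for illustration.  The raw readings stand in for dashboard (real-time) data, while the periodic summary built from them is the scorecard.

```python
from statistics import mean

# Hypothetical per-trip readings (the "dashboard" data; illustrative values only)
trips = [
    {"distance_km": 18.2, "fuel_l": 1.4, "max_speed_kmh": 92},
    {"distance_km": 45.0, "fuel_l": 3.1, "max_speed_kmh": 118},
    {"distance_km": 7.5,  "fuel_l": 0.7, "max_speed_kmh": 60},
]

# The "scorecard" is a periodic summary of the same data
scorecard = {
    "total_distance_km": sum(t["distance_km"] for t in trips),
    "avg_fuel_per_100km": 100 * sum(t["fuel_l"] for t in trips)
                          / sum(t["distance_km"] for t in trips),
    "max_speed_kmh": max(t["max_speed_kmh"] for t in trips),
    "avg_trip_km": mean(t["distance_km"] for t in trips),
}
print(scorecard)
```

The dashboard values change while you drive; the scorecard is only as current as the last time the summary was run.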

One of my twitter connections, Bob Champagne (@BobChampagne), published an article titled, Dashboards Versus Scorecards- Its all about the decisions it facilitates…, that provides some great insights into Scorecards and Dashboards.  This article doesn’t require any further embellishment on my part so I encourage you to click here or paste the following link into your browser:  http://wp.me/p1j0mz-6o.  I trust you will find the article both informative and engaging.

Next Steps:

Take some time to review your current metrics.  What metrics are truly influencing your behaviors and actions?  How are you using your metrics to manage your business?  Are you reacting to trends or setting them?

It’s been said that, “What gets measured gets managed.”  I would add – “to a point.”  It simply isn’t practical or even feasible to measure everything.  I say, “Measure to manage what matters most”.

Remember to get your free Excel Templates for OEE by visiting our downloads page or the orange widget in the sidebar.  You can follow us on twitter as well @Versalytics.

Until Next Time – STAY lean!

Vergence Analytics

Toyota’s Culture – Inside Out


As discussed on our Lean Roadmap page, the culture that exists inside your company will determine the success or failure of your lean initiatives in the long-term.  So, how do we cultivate and nurture this culture that we desire to achieve?

Fortunately, I found a great article,  How to implement “Lean Thinking” in a Business: Pathway to creating a “Lean Culture”, written by one of my recent twitter connections (lean practitioner and former Toyota employee) that briefly describes the process embraced by Toyota.

I will not paraphrase the content of the article if only to preserve the essence of the presentation and passion that is conveyed in its writing.

As an aside, it is interesting to note that Toyota does not typically refer to its methods as lean.  Lean is not a set of tools but rather a manner of thinking and a focus on a seemingly elusive target: one-piece flow.

The spirit of Lean, like synergy, cannot be taught – only experienced.

An innate ability exists and continues to evolve where team members operate with a high level of synergy and are able to identify and respond to concerns in real-time.  Steven Spear also discusses various characteristics or attributes of high performance teams from a different perspective and much wider range of industries in his book “The High Velocity Edge“.

Toyota Recall – Update

Following the release of the NHTSA investigation, Bloomberg Businessweek published an article titled “Toyota, The Media Owe You an Apology”.  The article clarifies a number of allegations against Toyota; however, I am reminded that the government’s investigation did not completely exonerate Toyota of all responsibility.

Whether the failure is mechanical or electronic is moot considering the tragic results that ensued for some.  I think the real concern is whether the problem itself has been identified and resolved regardless of fault.

Since we are on the topic of culture, consider the media’s role in reporting the events surrounding the recall.  What was your overall sense of the media’s reporting and perspective on this issue?

As you ponder this question, your answer will reveal how quickly events and people of influence can shape our culture.  On a much larger scale, consider the current events in Egypt or the last Presidential election in the United States.

As always, I appreciate your feedback – leave a comment or send us an e-mail.

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics

Lean Is …


What is lean?  The following definition is from the Oregon Manufacturing Extension Partnership website, http://www.omep.org:

Lean Is

A systematic approach for delivering the highest quality, lowest cost products with the shortest lead-times through the relentless elimination of waste.

The eight wastes that accompanied this definition include:

  1. Overproduction
  2. Waiting
  3. Transportation
  4. Non-Value-Added Processing
  5. Excess Inventory
  6. Defects
  7. Excess Motion
  8. Underutilized People

It is very easy to become overwhelmed by the incredible amount of information on the subject of Lean.  I always like to refer back to the basic tenets of lean to keep things in perspective.

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics


What are we Changing?

[Image: N(0,1) normal distribution curve, via Wikipedia]

Our process improvement strategy is founded on the Theory of Constraints where improvement initiatives are supported by lean and six sigma tools.  Process disruptions affecting flow and task execution all contribute to variance and the efforts to eliminate or reduce them are evidenced by increased stability, increased throughput over time, and increased profits.

So, our main goal in production is to improve flow by focusing our efforts to reduce and eliminate variation in our processes.  This is also the message behind our previous two posts, OEE in an Imperfect World and Variation:  OEE’s Silent Partner.  The effects of our actions will be reflected by the metrics we have chosen to measure our performance.

The following videos further the cause for the Theory of Constraints and Improving Flow:

Standing on the Shoulders of Giants by Dr. Eliyahu M. Goldratt

http://www.youtube.com/watch?v=C3RPFUh3ePQ

The following video discusses “What to Change?”

http://www.youtube.com/watch?v=prrA-onO0Nc&NR=1

Stories can be the best teachers and when the topic is manufacturing, production, or operations, I highly recommend “The Goal”, an industry standard, and the recently released “Velocity“.  Both novels present an all too common manufacturing dilemma – resource capacity and scheduling constraints – to teach the Theory of Constraints.  Velocity is a continuation of The Goal and expands the discussion to include Lean and Six Sigma.

For additional resources and reading recommendations, visit our Book Page.

The message is simple:  Change drives Change.  What are your thoughts?

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics


OEE in an imperfect world

[Image: normal distribution curves with differing means and standard deviations, via Wikipedia]

Background: This is a more general presentation of “Variation:  OEE’s Silent Partner” published on January 31, 2011.

In a perfect world we can produce quality parts at rate, on time, every time.  In reality, however, all aspects of our processes are subject to variation that affects each factor of Overall Equipment Effectiveness:  Availability, Performance, and Quality.

Our ability to effectively implement Preventive Maintenance programs and Quality Management Systems is reflected in our ability to control and improve our processes, eliminate or reduce variation, and increase throughput.

The Variance Factor

Every process and measurement is subject to variation and error.  It is only reasonable to expect that metrics such as Overall Equipment Effectiveness and Labour Efficiency will also exhibit variance.  The normal distributions for four (4) different data sets are represented by the graphic that accompanies this post.  You will note that the average for 3 of the curves (Blue, Red, and Yellow) is common (μ = 0), yet the shapes of the curves are radically different.  The green curve shows a normal distribution that is shifted to the left, with an average (μ) of -2, although we can see that the standard deviation for this distribution is smaller than that of the yellow and red curves.

The graphic also allows us to see the relationship between the Standard Deviation and the shape of the curve.  As the Standard Deviation increases, the height of the curve decreases and its width increases.  From these simple representations, we can see that our objective is to reduce the standard deviation.  The only way to do this is to reduce or eliminate variation in our processes.

We can use a variety of statistical measurements to help us determine or describe the amount of variation we may expect to see.  Although we are not expected to become experts in statistics, most of us should already be familiar with the normal distribution or “bell curve” and terms such as Average, Range, Standard Deviation, Variance, Skewness, and Kurtosis.  In the absence of an actual graphic, these terms help us to picture what the distribution of data may look like in our mind’s eye.

Run Time Data

The simplest common denominator and most readily available measurement for production is the quantity of good parts produced.  Many companies have real-time displays that show quantity produced and in some cases go so far as to display Overall Equipment Effectiveness (OEE) and its factors – Availability, Performance, and Quality.  While the expense of live streaming data displays can be difficult to justify, there is no reason to abandon the intent that such systems bring to the shop floor.  Equivalent means of reporting can be achieved using “whiteboards” or other forms of data collection.

I am concerned with any system that is based solely on cumulative shift or run data and does not include run time history.  An often overlooked opportunity for improvement is the lack of stability in productivity or throughput over the course of the run.  Systems with run time data allow us to identify production patterns and significant swings in throughput, and to correlate this data with down time history.  This production story board allows us to analyze sources of instability, identify root causes, and implement timely and effective corrective actions.  For processes where throughput is highly unstable, I recommend a direct hands-on review on the shop floor in lieu of post production data analysis.
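
A “whiteboard equivalent” can be as simple as an hourly log of good parts with a note for any event.  The sketch below is a minimal illustration of that idea; the counts, target rate, and threshold are hypothetical values, not data from a real process.

```python
# Hypothetical hourly log of good parts: (hour, count, note) -- illustrative only
hourly_log = [
    ("07:00", 118, ""),
    ("08:00", 121, ""),
    ("09:00", 74,  "die change ran long"),
    ("10:00", 119, ""),
    ("11:00", 88,  "sensor fault, 12 min down"),
    ("12:00", 120, ""),
]

target_per_hour = 120      # assumed planned rate
swing_threshold = 0.15     # flag hour-to-hour swings greater than 15% of target

previous = None
for hour, count, note in hourly_log:
    flag = ""
    if previous is not None and abs(count - previous) > swing_threshold * target_per_hour:
        flag = "  << swing"
    print(f"{hour}  {count:>4} / {target_per_hour}  {note}{flag}")
    previous = count
```

Even this simple storyboard makes the pattern visible: the swings line up with the noted events, which is exactly the correlation a cumulative shift total would hide.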

Overall Equipment Effectiveness

Overall Equipment Effectiveness and the factors Availability, Performance, and Quality do not adequately or fully describe the capability of the production process.  Reporting on the change in standard deviation as well as OEE provides a more meaningful understanding of the process  and its inherent capability.

Improved capability also improves our ability to predict process throughput.  Your materials / production control team will certainly appreciate any improvements to stabilize process throughput as we strive to be more responsive to customer demand and reduce inventories.

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics


Variance – OEE’s Silent Partner (Killer)

[Image: two distributions with the same average but different spreads, via Wikipedia]

I was recently involved in a discussion regarding the value of Overall Equipment Effectiveness (OEE).  Of course, I fully supported OEE and confirmed that it can bring tremendous value to any organization that is prepared to embrace it as a key metric.  I also qualified my response by stating that OEE cannot be managed in isolation:

OEE and its intrinsic factors, Availability, Performance, and Quality, are summary level indices and do not measure or provide any indication of process stability or capability.

As a top level metric, OEE does not describe or provide a sense of actual run-time performance.  For example, when reviewing Availability, we have no sense of duration or frequency of down time events, only the net result.  In other words we can’t discern whether downtime was the result of a single event or the cumulative result of more frequent down time events over the course of the run.  Similarly, when reviewing Performance, we cannot accurately determine the actual cycle time or run rate, only the net result.

As shown in the accompanying graphic, two data sets (represented by Red and Blue) having the same average can present very different distributions, as reflected by the range of the data, the height and peakedness of the curve (kurtosis), the asymmetry of the curve (skewness), and significantly different standard deviations.

Clearly, any conclusions regarding the process simply based on averages would be very misleading.  In this same context, it is also clear that we must exercise caution when attempting to compare or analyse OEE results without first considering a statistical analysis or representation of the raw process data itself.

The Missing Metrics

Fortunately, we can use statistical tools to analyse run-time performance and determine whether our process is capable of producing parts consistently, just as Quality Assurance personnel use statistical analysis tools to determine whether a process is capable of producing parts that consistently conform to specification.

One of the greatest opportunities for improving OEE is to use statistical tools to identify opportunities to reduce throughput variance during the production run.

Run-Time or throughput variance is OEE’s silent partner as it is an often overlooked aspect of production data analysis.  Striving to achieve consistent part to part cycle times and consistent hour to hour throughput rates is the most fundamental strategy to successfully improve OEE.  You will note that increasing throughput requires a focus on the same factors as OEE: Availability, Performance, and Quality.  In essence, efforts to improve throughput will yield corresponding improvements in OEE.

Simple throughput variance can readily be measured using Planned versus Actual Quantities produced – either over fixed periods of time (which is preferred) or cumulatively.  Some of the benefits of using quantity based measurement are as follows:

  1. Everyone on the shop floor understands quantity or units produced,
  2. This information is usually readily available at the work station,
  3. Everyone can understand or appreciate its value in tangible terms,
  4. Quantity measurements are less prone to error, and
  5. Quantities can be verified (Inventory) after the fact.

For the sake of simplicity, consider measuring hourly process throughput and calculating the average, range, and standard deviation of this hourly data.  With reference to the graphic above, even this fundamental data can provide a much more comprehensive and improved perspective of process stability or capability than would otherwise be afforded by a simple OEE index.
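
Continuing in the same vein, here is a minimal Python sketch of the calculation described above, using hypothetical hourly counts (the numbers are invented for illustration).

```python
from statistics import mean, pstdev

# Hypothetical good-part counts per hour (illustrative values only)
hourly_counts = [118, 121, 74, 119, 88, 120, 117, 122]

avg = mean(hourly_counts)
rng = max(hourly_counts) - min(hourly_counts)
std_dev = pstdev(hourly_counts)   # population standard deviation

print(f"Average per hour:   {avg:.1f}")
print(f"Range:              {rng}")
print(f"Standard deviation: {std_dev:.1f}")
```

Two shifts can post the same average and still look nothing alike once the range and standard deviation are placed beside it.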

Using this data, our objective is to identify those times where the greatest throughput changes occurred and to determine what improvements or changes can be implemented to achieve consistent throughput.  We can then focus our efforts on improvements to achieve a more predictable and stable process, in turn improving our capability.

In OEE terms, we are focusing our efforts to eliminate or reduce variation in throughput by improving:

  1. Availability by eliminating or minimizing equipment downtime,
  2. Performance through consistent cycle to cycle task execution, and
  3. Quality by eliminating the potential for defects at the source.

Measuring Capability

To make sure we’re on the same page, let’s take a look at the basic formulas that may be used to calculate Process Capability.  In the automotive industry, suppliers may be required to demonstrate process capability for certain customer designated product characteristics or features.  When analyzing this data, two sets of capability formulas are commonly used:

  1. Preliminary (Pp) or Long Term (Cp) Capability:  Determines whether the product can be produced within the required tolerance range,
    • Pp or Cp = (Upper Specification Limit – Lower Specification Limit) / (6 x Standard Deviation)
  2. Preliminary (Ppk) or Long Term (Cpk) Capability:  Determines whether product can be produced at the target dimension and within the required tolerance range:
    • Capability = Minimum of Either:
      • Capability Upper = (Upper Specification Limit – Average) / (3 x Standard Deviation)
      • Capability Lower = (Average – Lower Specification Limit) / (3 x Standard Deviation)

When Pp = Ppk or Cp = Cpk, we can conclude that the process is centered on the target or nominal dimension.  Typically, the minimum acceptable Capability Index (Cpk) is 1.67 and implies that the process is capable of producing parts that conform to customer requirements.

In our case we are measuring quantities or throughput data, not physical part dimensions, so we can calculate the standard deviation of the collected data to determine our own “natural” limits (6 x Standard Deviation). Regardless of how we choose to present the data, our primary concern is to improve or reduce the standard deviation over time and from run to run.
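
As a rough sketch of these calculations (not a substitute for a formal capability study), the following Python code computes the natural limits and the capability indices from hypothetical hourly throughput data; the counts and the upper / lower throughput targets are invented for the example.

```python
from statistics import mean, pstdev

# Hypothetical hourly good-part counts and assumed throughput "specification" limits
hourly_counts = [118, 121, 115, 119, 122, 120, 117, 123]
lsl, usl = 110, 130            # assumed lower / upper throughput targets (illustrative)

avg = mean(hourly_counts)
sigma = pstdev(hourly_counts)  # population standard deviation

lower_natural, upper_natural = avg - 3 * sigma, avg + 3 * sigma

cp = (usl - lsl) / (6 * sigma)
cpu = (usl - avg) / (3 * sigma)
cpl = (avg - lsl) / (3 * sigma)
cpk = min(cpu, cpl)

print(f"Natural limits (avg ± 3σ): {lower_natural:.1f} .. {upper_natural:.1f}")
print(f"Cp  = {cp:.2f}")
print(f"Cpk = {cpk:.2f}")
```

Whether or not the indices themselves are reported, the number to watch from run to run is the standard deviation.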

Once we have a statistical model of our process, control charts can be created that in turn are used to monitor future production runs.  This provides the shop floor with a visual base line using historical data (average / limits) on which improvement targets can be made and measured in real-time.
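
One simple way to build such a baseline, sketched here with the same hypothetical counts, is to set control limits at the historical average plus or minus three standard deviations and compare each new hour against them (formal control-chart constants differ slightly; this is a simplification for illustration).

```python
from statistics import mean, pstdev

# Historical hourly counts from previous runs (hypothetical values)
history = [118, 121, 115, 119, 122, 120, 117, 123]
center = mean(history)
sigma = pstdev(history)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # simplified 3-sigma limits

# Monitor a new run against the historical baseline
new_run = [119, 124, 108, 121]
for hour, count in enumerate(new_run, start=1):
    status = "OK" if lcl <= count <= ucl else "INVESTIGATE"
    print(f"Hour {hour}: {count:>3}  (limits {lcl:.1f} .. {ucl:.1f})  {status}")
```

Posting the center line and limits at the work station gives the shop floor the visual, real-time baseline described above.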

Run-Time Variance Review

I recall using this strategy to achieve literally monumental gains – a three shift operation with considerable instability became an extremely capable and stable two shift production operation coupled with a one shift preventive maintenance / change over team.  Month over month improvements were noted by significantly improved capability data (substantially reduced Standard Deviation) and marked increases in OEE.

Process run-time charts with statistical controls were implemented for quantities produced just as the Quality department maintains SPC charts on the floor for product data.  The shop floor personnel understood the relationship between quantity of good parts produced and how this would ultimately affect the department OEE as well.

Monitoring quantities of good parts produced over shorter fixed time intervals is more effective than a cumulative counter that tracks performance over the course of the shift.  In this specific case, the quantity was “reset” for each hour of production essentially creating hourly in lieu of shift targets or goals.

Recording / plotting production quantities at fixed time intervals combined with notes to document specific process events creates a running production story board that can be used to identify patterns and other process anomalies that would otherwise be obscured.

Conclusion

I am hopeful that this post has heightened your awareness regarding the data that is represented by our chosen metrics.  In the boardroom, metrics are often viewed as absolute values coupled with a definitive sense of sterility.

Run-Time Variance also introduces a new perspective when attempting to compare OEE between shifts, departments, and factories.  From the context of this post, having OEE indices of the same value does not imply equality.  As we can see, metrics are not pure and perhaps even less so when managed in isolation.

Variance is indeed OEE’s Silent Partner but left unattended, Variance is also OEE’s Silent Killer.

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics


Achieve Sustainability Through Integration


It’s no secret that lean is much more than a set of tools and best practices designed to eliminate waste and reduce variance in our operations.  I contend that lean is defined by a culture that embraces the principles on which lean is founded.  An engaged lean culture is evidenced by the continuing development and integration of improved systems, methods, technologies, best practices, and better practices.  When the principles of lean are clearly understood, the strategy and creative solutions that are deployed become a signature trait of the company itself.

Unfortunately, to offset the effects of the recession, many lean initiatives have either diminished or disappeared as companies downsized and restructured to reduce costs.  People who once entered data, prepared reports, or updated charts could no longer be supported and their positions were eliminated.  Eventually, other initiatives also lost momentum as further staffing cuts were made.  In my opinion, companies that adopted this approach simply attempted to implement lean by surrounding existing systems with lean tools.

Some companies have simply returned to a “back to basics” strategy that embraces the most fundamental principles of lean.  Is it enough to be driven by a mission, a few metrics, and simple policy statements or slogans such as “Zero Downtime”, “Zero Defects”, and “Eliminate Waste?”  How do we measure our ability to safely produce a quality part at rate, delivered on time and in full, at the lowest possible cost?  Regardless of what we measure internally, our stakeholders are only concerned with two simple metrics – Profit and Return on Investment.  The cold hard fact is that banks and investors really don’t care what tools you use to get the job done.  From their perspective the best thing you can do is make them money!  I agree that we are in business to make money.

What does it mean to be lean?  I ask this question on the premise that, in many cases, sustainability appears to be dependent on the resources that are available to support lean versus those who are actually running the process itself.  As such, “sustainability” is becoming a much greater concern today than perhaps most of us are likely willing to admit.  I have always encouraged companies to implement systems where events, data, and key metrics are managed in real-time at the source such that the data, events, and metrics form an integral part of the whole process.

Processing data for weekly or monthly reports may be necessary; however, such reports are only meaningful if they are an extension of ongoing efforts at the shop floor / process level itself.  To do otherwise is simply pretending to be lean.  It is imperative that the data being recorded, the metrics being measured, and the corrective actions taken are meaningful, effective, and influence our actions and behaviors.

To illustrate the difference between Culture and Tools consider this final thought:  A carpenter is still a carpenter with or without hammer and nails.

Until Next Time – STAY lean!

Vergence Analytics

Twitter:  @Versalytics


Lean Paralysis

Lean – Breaking Through Paralysis

Significant initiatives, including lean, can reach a level of stagnation that eventually cause the project to either lose focus or disappear altogether.  Hundreds of books have already been written that reinforce the concept that the company culture will ultimately determine the success or failure of any initiative.  A sustainable culture of innovation, entrepreneurial spirit, and continual improvement requires effective leadership to cultivate and develop an environment that supports these attributes.

When launching any new initiative, we tend to focus on the many positive aspects that will result.  Failure is seldom placed on the list of possible outcomes for a new initiative.  We are all quite familiar with the typical pros and cons, advantages versus disadvantages, and other comparative analysis techniques such as SWOT (Strengths, Weaknesses, Opportunities, Threats).

A well defined initiative should address both the benefits of implementation AND the risks to the operation if it is NOT.

Back on Track

The Vision statement is one starting point to re-energize the team.  Of course, this assumes that the team actually understands and truly embraces the vision.

Overcoming Road Blocks

The Charter: Challenge the team to create and sign up to a charter that clearly defines the scope and expectations of the project.  The team should have clearly defined goals followed by an effective implementation / integration plan.  The charter should not only describe the “Achievements” but also the consequences of failure.  Be clear with the expectations:  Annual Savings of $xxx,xxx by Eliminating “Task A – B – C”, Reducing Inventory by “xx” days, and by  reducing lead times by “xx” days.

Defining Consequences:  Competitive pricing will be compromised, leading to a loss of business.  This could be rephrased using the model expression:  We must do “THIS” or else “THIS”.  It has been said that the pain of change must be less than the pain of remaining the same.  If not, the program will surely fail.

The Plan: An effective implementation strategy requires a time line that includes reporting gates, key milestones, and the actual events or activities required.  The time line should be such that momentum is sustained.  If progress suggests that the program is ahead of schedule, revise timings for subsequent events where possible.  Extended “voids” or lags in event timing can reduce momentum and cause the team to disengage.

Focus: Often times, we are presented with multiple options to achieve the desired results.  An effective decision making process is required to reduce choices or to create a hybrid solution that encompasses several options.  The decision process must result in a single final solution.

Consequences: As mentioned earlier, a list of consequences should become part of the Charter process as well.  Failure suggests that a desired expectation will not be realized.  It is not enough to simply return to “the way it was”.  The indirect implication is that every failure becomes a learning experience for the next attempt.  In other words, we learn from our failures and stay committed to the course of the charter.

Example:

Almost all software programs are challenged to sort data.  We don’t really think about the “method” that is used; we just wait for the program to do its task and for the results to appear.  At some point, the software development team must have chosen a certain method, also known as an algorithm, to sort the data.

We were recently challenged in a similar situation to decide which sort method would be best suited for the application.  You may be surprised to learn that there are many different sorting algorithms available such as:

  1. Bubble Sort
  2. Quick Sort
  3. Heap Sort
  4. Comb Sort
  5. Insertion Sort
  6. Merge Sort
  7. Shaker Sort
  8. Flash Sort
  9. Postman Sort
  10. Radix Sort
  11. Shell Sort

This is certainly quite a selection, and more methods are certain to exist.  Each method has its advantages and disadvantages.  Some sorting methods require more computer memory; some are stable, others are not.  Our goal was to create a sorted list without duplicates.  We considered adding elements and maintaining a sorted, “duplicate free” list in real-time.  We also considered reading all the data first and sorting the data after the fact.

The point is that of the many available options, one solution will eventually be adopted by the team.  Using the “wrong” sorting method could result in extremely slow performance and frustrated users.  In this case the users of the system may abandon a solution that they themselves are not a part of creating.  While a bubble sort may produce the intended result, it is usually not the most efficient.
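
To make the trade-off concrete, here is a minimal Python sketch (a toy illustration, not the actual system we built) contrasting the two approaches we considered: maintaining a sorted, duplicate-free list as each element arrives versus collecting everything first and sorting once at the end.

```python
import bisect

def insert_unique_sorted(sorted_list, value):
    """Maintain a sorted, duplicate-free list as each element arrives."""
    index = bisect.bisect_left(sorted_list, value)
    if index == len(sorted_list) or sorted_list[index] != value:
        sorted_list.insert(index, value)

incoming = [5, 3, 9, 3, 1, 9, 7, 5]   # sample data for illustration

# Approach 1: keep the list sorted and unique in real time
live = []
for value in incoming:
    insert_unique_sorted(live, value)

# Approach 2: read everything first, then de-duplicate and sort once
batch = sorted(set(incoming))

print(live)    # [1, 3, 5, 7, 9]
print(batch)   # [1, 3, 5, 7, 9]
```

Both approaches produce the same result; which performs better depends on data volume and on whether intermediate results are needed while the data is still arriving – exactly the kind of comparison that should be measured and documented.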

Another aspect of effective development is to document the analysis process that was used to arrive at the final solution.  In our example, we could run comparative timing and computer resource requirements to determine which solution is most suitable to the application.  Some algorithms work better on “nearly sorted” lists versus others that work better with “randomly ordered” data.

Engage the Team: The team should be represented by multiple disciplines or departments within the organization.  Using the simple example from above, the development team may create a working solution that is later abandoned by the ultimate users of the system due to its poor performance.  The charter should be very clear on the desired expectations and performance criteria of the final solution.

Creating a model or prototype to represent the solution is commonplace.  This minimizes the time and resources expended before arriving at the final solution for implementation.

Vision: Leadership must continue to focus beyond the current steps.  A project or program is not the means to an end.  Rather it should be viewed as the foundation for the next step of the journey.  Lean, like any other initiative, is an evolutionary process.  Lean is not defined by a series of prescriptions and formulas.  The pursuit and elimination of waste is a mission that can be achieved in many different ways.

Management / Review

Regular management reviews should be part of the overall strategy to monitor progress and more so to determine whether there are any impediments to a successful outcome.  The role of leadership is to provide direction to eliminate or resolve the road blocks and to keep the team on track.

Breaking Through Paralysis

The objective is clear – we need to keep the initiative moving and also learn to identify when and why the initiative may have stopped.  Running a business is more than just having good intentions.  We must be prudent in our execution to efficiently and effectively achieve the desired results.

Until Next Time – STAY Lean!

 
