Metrics for Diagnostic Purposes. Steven Kaplan Global MES Administrator


1 Metrics for Diagnostic Purposes Steven Kaplan Global MES Administrator North American Plant-to-Enterprise Conference September 21-23, Orlando, FL

2 The following Strategic Initiatives of MESA International are associated with this presentation: Lean Manufacturing Quality & Regulatory Compliance Real-Time Enterprise

3 Using Metrics to Diagnose Issues = Lower Costs [Chart: Cost of Resolution (Low to High) vs. Discovery Time (Immediate to Extended); escalating events along the curve: Trend, NCR, CAPA, Complaint, Recall] NCR: Nonconformance Report; CAPA: Corrective and Preventive Action

4 Study Background MESA Metrics sub-group: several companies, different disciplines and manufacturing methods. Examined: what was successful; common metrics or current baseline metrics; unique metrics; why they were utilized, their drivers, and how each metric was developed. Key: what these companies do with the metrics.

5 Contributing Companies Murata Power Solutions - electronics National Starch - starch RobMax BiWMetrics - robotics Ablestik - adhesives Teknikum Group Ltd. - rubber hoses Camstar - software Note: others participated in discussions & review

6 Performance Management in Manufacturing [Process diagram elements: Corporate Goals, Manufacturing-ranked KPIs, Benchmark Data, Data Collection, Support Systems, Frequency of Reporting, Additional Data Input, Pattern Detection, Presentation of Data, Check Results, Corrective Actions, Root Cause Analysis, Trends, Preventive Actions, Alarm System, Deviations]

7 Definitions A metric is a quantitative value measuring or assessing a given process (derived from Wikipedia). Some people simply call metrics Performance Indicators. A KPI is a specialized metric assessing performance against a corporate goal (thus the word Key in KPI). KPIs aggregate data from multiple systems, are set by departments to meet the goal, and are prioritized by importance.

8 OEE: A Proven Aggregate Metric Overall Equipment Effectiveness OEE = Availability x Performance x Quality. Example: 86.7% Availability x 93.0% Performance x 95.0% Quality = 76.6% OEE
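The OEE product above is simple to compute; a minimal sketch (the 86.7% availability figure is an example value consistent with the slide's 76.6% result):

```python
def oee(availability: float, performance: float, quality: float) -> float:
    """Overall Equipment Effectiveness: the product of the three factors,
    each expressed as a fraction between 0 and 1."""
    return availability * performance * quality

# Example figures from the slide (availability assumed for illustration).
print(f"{oee(0.867, 0.930, 0.950):.1%}")  # -> 76.6%
```

Because the three factors multiply, a modest loss in each compounds: even 95% on every factor yields only about 86% OEE.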

9 Corporate Corporate goals are not only financial. Balanced Scorecard: customer; internal business processes; learning and growth; financial. Promote a non-penalizing culture. Processes must have owners in each area; plant managers should own the process. Everyone needs to be looking at the same metrics: definition, calculation, input variables.

10 Support Systems Defined as: systems to collect, measure, evaluate and report data on corporate processes and process controls. Should include: web-based tools, access systems, access points, calculation tables. Achieve greater results with machine-, device- or tool-integrated visual detection / bar code systems.

11 Frequency of Reporting: Considerations 1. Real time varies by plant: once per day, or by usage. 2. Question: is manual input needed for data generation or processing? 3. Frequency is a valid criterion only if the data is truly evaluated and studied on the same timescale. 4. The frequency of reporting can be driven higher if data is available and evaluation will leverage the addition.

12 Benchmark Data Categorize by: departmental, corporate, industry, and world. Utilize single measurements that encompass key KPIs: throughput, score carding, averaging metrics. Use historic data and industry benchmarks to set goals; set higher expectations as progress is made. Incorporate Lean Manufacturing principles. Establish Six Sigma methodology to identify areas of concern.

13 Data Collection From production line / machine / man Error messages by department Quality assurance data: electronic & manual Transactional data from MES systems, LIMS, ERP, etc.

14 Alarms A method of signaling the occurrence of some undesirable event Text Messages Visual Alerts Audio Alerts Automated Actions

15 Alarm Triggers: Ability to Proactively Prevent Problems Review of OEE analytics and root cause analysis. Example: a trigger is set to notify when downtime exceeds 30 minutes in an 8-hour period. With the trigger, we could have had the notification 2 days sooner. Note the downtime occurred for 3 days without a trigger for action; the trigger would have reduced downtime by almost 2.5 hours.
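The 30-minutes-in-8-hours trigger described above can be sketched as a rolling-window check; the event structure and times here are illustrative assumptions, not from the slide:

```python
from datetime import datetime, timedelta

DOWNTIME_LIMIT = timedelta(minutes=30)  # threshold from the slide
WINDOW = timedelta(hours=8)             # rolling evaluation window

def trigger_fired(downtime_events, now):
    """True when cumulative downtime within the last 8 hours exceeds 30 minutes.

    downtime_events: list of (start, end) datetime pairs.
    """
    window_start = now - WINDOW
    total = timedelta()
    for start, end in downtime_events:
        # Clip each event to the rolling window before summing.
        overlap = min(end, now) - max(start, window_start)
        if overlap > timedelta():
            total += overlap
    return total > DOWNTIME_LIMIT

# Hypothetical shift: a 20-minute and a 25-minute stoppage -> 45 min, alarm fires.
now = datetime(2009, 9, 21, 16, 0)
events = [(datetime(2009, 9, 21, 9, 0), datetime(2009, 9, 21, 9, 20)),
          (datetime(2009, 9, 21, 14, 0), datetime(2009, 9, 21, 14, 25))]
print(trigger_fired(events, now))  # -> True
```

In practice the notification itself would be wired to the text, visual, or audio alert channels listed on the Alarms slide.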

16 Setting The Goal Normalizing a corporate-driven goal to a percentage that manufacturing can act on is an obstacle to overcome. Example: REDUCE SCRAP BY 10%

17 Presenting the Data

18 Presenting the Data

19 Visual Score Card KPIs This takes all the KPIs in a plant and sets real expectations against them. With this, numerical scores are determined per KPI. In a multi-plant environment these scores determine who is 'best', generating improvement via competition.
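One possible per-KPI scoring and plant-ranking scheme (the slide does not specify the formula; the percent-of-target scoring, cap at 100, and the plant data below are all assumptions for illustration):

```python
def score(actual: float, target: float, higher_is_better: bool = True) -> float:
    """Score a KPI as percent of target, capped at 100.
    For lower-is-better KPIs (e.g. scrap %), invert the ratio (actual must be nonzero)."""
    ratio = actual / target if higher_is_better else target / actual
    return min(ratio, 1.0) * 100

def rank_plants(plants: dict) -> list:
    """plants: {name: [(actual, target, higher_is_better), ...]}.
    Returns plant names ordered best-first by average KPI score."""
    totals = {name: sum(score(*kpi) for kpi in kpis) / len(kpis)
              for name, kpis in plants.items()}
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical scorecard: RFT % (higher is better) and scrap % (lower is better).
plants = {
    "Plant A": [(97.0, 98.0, True), (2.1, 2.0, False)],
    "Plant B": [(99.0, 98.0, True), (1.8, 2.0, False)],
}
print(rank_plants(plants))  # -> ['Plant B', 'Plant A']
```

Capping each score at 100 keeps one over-achieved KPI from masking a missed one, which matters if the ranking is meant to drive competition fairly.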

20 Daily Dashboard Metrics Why: to develop daily, forward-looking metrics that directly correlate with OTS goals: demand forecast, production efficiency, schedule adherence. Expectations: the Daily Dashboard is updated each morning, posted daily in each manufacturing department, and reviewed once daily at production planning meetings.


22 Root Cause Analysis & Pattern Detection Root Cause Analysis: strong portals with good BI tools can drill down to the cause of a trend; asking "Why?" five times; a strong failure-analysis team, empowered to take corrective measures. Pattern Detection: weekly, monthly and quarterly multi-discipline team reviews; productivity by shift, with training implied for those with low efficiency; comparison against past database info on deviations and root causes.

23 Corrective Action Drive negative trends back to the area of cause, then take corrective action. Nominate a responsible person and give deadlines. A tiger team selects actions based on impact and speed of implementation. Provide on-line help. Auto-trigger with a manual intervention option to confirm "corrective action and follow up", etc. Review corrective actions in one plant in detail prior to implementing them in other plants.

24 Check Results Compare the results achieved with benchmarks from before the project was implemented. Review benchmarks in a global manner. Revisit KPI and metric definitions to improve accuracy and ensure they represent the process. Performance management is a continuous cycle: the Deming Cycle (Plan, Do, Check, Act).

25 KPIs In Action The 3 P's Predictive: automatically and manually analyze patterns and events in real time; continuously monitor manufacturing and quality information. Proactive: allow automatic and manual intervention and adjustment of processes; provide actionable intelligence. Preventive: elevate information to knowledge; best practices and processes based on process understanding; closing the loop.

26 Your KPIs Affect Your Customers' Results Ablestik supplies Murata-PS with epoxies & pre-forms; both companies participated in this study. [Charts: Ablestik 2008 Right First Time (Viscosity) by region (China, Japan, Korea, UK, USA, Global), Nov 2007 - Apr 2008, 88-100% scale; Murata-PS Hybrid First Inspection PPM, monthly and YTD] Note: To improve RFT in the USA, adjustment data is being tracked by product and batch, enabling the development of a Pareto analysis to determine the worst offending products.
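The Pareto analysis of adjustment data mentioned in the note can be sketched as a count-and-cumulate over an adjustment log; the product IDs and counts below are hypothetical, not from the study:

```python
from collections import Counter

def pareto(adjustments):
    """Rank products by adjustment count, worst first, with cumulative share.

    adjustments: iterable of product ids, one entry per adjustment event.
    Returns a list of (product, count, cumulative_percent) tuples.
    """
    counts = Counter(adjustments)
    total = sum(counts.values())
    running = 0
    rows = []
    for product, n in counts.most_common():
        running += n
        rows.append((product, n, round(100 * running / total, 1)))
    return rows

# Hypothetical adjustment log: one product id per batch adjustment.
log = ["EP-101"] * 6 + ["EP-204"] * 3 + ["EP-330"]
for product, n, cum in pareto(log):
    print(product, n, f"{cum}%")
```

Reading the cumulative column off such a table shows which few products account for most adjustments, which is exactly where the slide says improvement effort should be targeted.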

27 Ablestik Your KPIs Affect Your Customers' Results Right First Time (Viscosity): [Charts: Ablestik 2008 Right First Time (Viscosity) by region alongside Murata-PS Hybrid First Inspection PPM] Coincidence?

28 Challenges People & Process: the operation that finds a fault is not always the operation causing it; queue and lead times can prevent quick resolution of a problem; set-up times are not measured. IT Systems: multiple systems are very fragmented; a separate quality control system for corrective action; some machines do not have an access point to automatically collect data; no communication between the reporting systems.

29 Comments Performance management = processes, not a tool or the ability to see the KPI. Actionable intelligence is the goal. Defining and extracting metrics is the toughest part; automatic presentation of the data is the 'minor' part of the battle. Junk wrapped in a pretty facade is still junk. Trends matter more than actual values; focus on the whole context, not the metric. Companies missing these aspects will fall short.

30 Metrics for Diagnostic Purposes Working Group Company Profiles & Submissions: Steve Kaplan, Murata Power Solutions Jian Xu, National Starch Nicolaus von Baillou, RobMax_BiWMetrics Greg Agnew, Ablestik Vesa Vihavainen, Teknikum Group Ltd Gilad Langer, Camstar Metrics Co-Chairs: Julie Fraser, Principal Analyst, Cambashi Jonathan Siudut, Exec. Software Proj Mgr, IBM Steven Kaplan, MES Administrator, Murata Power Solutions MESA Contact: Brandy Richardson

31 Next Up For Metrics Metrics for Diagnostic Purposes White Paper Soon to be available at: MESA Member Contact: Steven Kaplan, Murata Power Solutions Operational Metrics Ties to Financial Metrics & Outcomes MESA Member Contact: Darren Riley, Rockwell Automation Plant-Warehouse Metrics for End-to-End Execution Success MESA Member Contact: Julie Fraser, Cambashi
