LeanTest key: Test coverage analysis powered by traceability


1 LeanTest key: Test coverage analysis powered by traceability. Christophe LOTZ, ASTER Technologies. IEEE 11th International Board Test Workshop.

2 Targets: Key objectives

Our targets are to provide tools that:
- Create an effective environment to continually improve the delivered quality of manufacturing processes.
- Assist in reducing the costs of assembly, test, rework, scrap and warranty.
- Help improve line utilization and reduce cycle time.
- Allow manufacturers to better prioritize the deployment of constrained resources.
- Allow manufacturers to benchmark their DPMO rates against others in the industry, regardless of board complexity.

3 Test coverage and traceability

- Good products must be defect-free and cheap.
- How can we detect or prevent all faults on the product (defect prevention, defect detection) so that only good products are shipped?
- Test coverage is a key metric: it is the quality warranty and the main driving factor for LeanTest.
- This paper describes how traceability tools should be used to improve the understanding of test coverage.

4 Defect Universe

Identify the faults that can occur. [Diagram: the defect universe, listing defect types such as insufficient, excess, cold or marginal solder joints, gross shorts, voids, missing, lifted or bent leads, extra part, bridging, tombstone, misaligned placement, polarity, shorts, opens, inverted/wrong/dead/bad part, and short/open on PCB, mapped to detection techniques: X-ray (unpowered), AOI (unpowered), in-circuit, in-system programming, JTAG, at-speed memory and interconnect tests, fault insertion, gate-level diagnosis.]

5 Defect Universe

Typical manufacturing defects: missing components, misalignment, wrong value, open circuits, broken components, incorrect polarity, insufficient solder, excessive solder, tombstone, short circuits.

We need to group defects into categories, Material (supply chain), Placement and Solder, to understand which defects can be captured by a particular test strategy.

6 Test coverage

The ability to detect defects can be expressed as a number: coverage. Each defect category maps onto the common coverage models:

  MPSF [1]   | PPVSF              | PCOLA/SOQ/FAM
  Material   | Value              | Correct, Live
  Placement  | Presence, Polarity | Presence, Alignment, Orientation
  Solder     | Solder             | Short, Open, Quality
  Function   | Function           | Feature, At-Speed, Measure

7 Test coverage by defect category

For each category of defects (D): Material, Placement, Solder and Function, we associate the corresponding coverage (C):

  Effectiveness = (Σ D_M·C_M + Σ D_P·C_P + Σ D_S·C_S + Σ D_F·C_F) / (Σ D_M + Σ D_P + Σ D_S + Σ D_F)

The test efficiency is based on a coverage balanced by the defect opportunities (DPMO): we need better coverage where there are more defect opportunities.
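The effectiveness formula above can be sketched in code; all defect-opportunity and coverage figures below are illustrative, not real line data.

```python
# Sketch: weighted test effectiveness per the formula above.
# Opportunity counts (D) and coverages (C) are illustrative.
def effectiveness(categories):
    """categories: {name: (defect_opportunities, coverage)}."""
    weighted = sum(d * c for d, c in categories.values())
    total = sum(d for d, _ in categories.values())
    return weighted / total

line = {
    "Material":  (1200, 0.90),  # D_M, C_M
    "Placement": (800,  0.85),  # D_P, C_P
    "Solder":    (3000, 0.70),  # D_S, C_S
    "Function":  (400,  0.60),  # D_F, C_F
}
print(round(effectiveness(line), 3))  # -> 0.759
```

Note how the large Solder opportunity count pulls the overall figure toward the Solder coverage, which is exactly the "better coverage where there are more defect opportunities" point.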

8 Test coverage by defect category

- Each test technique brings a certain ability to detect the defects defined within the defect universe.
- No single solution is capable of detecting all the defects.
- Good coverage = a combination of tests, each contributing to the M, P, S and F categories.
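Under the simplifying assumption that test steps detect defects independently (an assumption, not a claim from the slides), the combined coverage of a test sequence can be sketched as:

```python
# Sketch: combined coverage of several test steps for one defect
# category, assuming independent detection (an assumption, not a
# claim from the slides).
def combined_coverage(coverages):
    escape = 1.0
    for c in coverages:
        escape *= 1.0 - c  # fraction of defects still escaping
    return 1.0 - escape

# e.g. a first step at 80% followed by a second at 90%
print(round(combined_coverage([0.80, 0.90]), 2))  # -> 0.98
```

In practice the steps overlap (they catch many of the same defects), which is why the later slides measure the real contribution from traceability data instead of assuming independence.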

9 Case 1: Faulty boards at system level

Electronic plants in charge of board integration often discover a significant number of defective boards at system test, even after AOI, ICT, BST and FT. How is it possible to get failures at system level if we only buy good boards?
- The defect appears during packing and transportation (vibration, extreme temperatures, moisture).
- The defect is a dynamic problem which is revealed by the integration of the board in the complete system.
- The reality is usually simpler...

10 Case 1: Faulty boards at system level

If the board is failing at system test, it is usually because the escape rate (or slip) is higher than expected. At each test step, good boards that pass make up the first-pass yield (FPY); bad boards that pass are the slip (bad products shipped); boards that fail are either false rejects (good boards, the FOR) or genuinely bad boards that go to repair.

There are only two possibilities:
- The combined coverage is lower than optimal.
- The DPMO figures are higher than expected.
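The slip at a single step can be sketched numerically; the incoming defect rate and the coverage figure below are invented for illustration:

```python
# Sketch: slip (escapes) at one test step. The incoming defect
# rate per board and the coverage figure are illustrative.
def step_outcome(defects_per_board, coverage):
    caught = defects_per_board * coverage   # sent to diagnosis/repair
    slipped = defects_per_board - caught    # escapes to the next step
    return caught, slipped

caught, slipped = step_outcome(0.05, 0.90)
print(round(slipped, 4))  # -> 0.005
```

Either a higher defect rate or a lower coverage inflates the slip, which is the two-possibility conclusion above in one line of arithmetic.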

11 Case 1: Faulty boards at system level

The auditing conclusions were:
- Wrong or inadequate coverage metrics are produced: for example, confusion between accessibility and testability; coverage by component only, without incorporating solder joint figures; over-optimistic (marketing-driven) reports.
- Wrong DPMO figures due to limited traceability or incorrect root cause analysis (for example, confusion between the fault message and the root cause/defect).

[Diagram: coverage estimation versus coverage measurement feeding the selected strategies.]

12 Case 2: DPMO estimation

Going beyond solving surface issues. [Diagram: the defect universe grouped by category (Material, Placement, Solder), covering shorts, opens, polarity, tombstone, misalignment, insufficient solder, missing components, broken leads and wrong value.]

13 Case 2: DPMO estimation

- The coverage weighted with DPMO is a key factor in estimating the production model.
- We need an accurate DPMO value if we want realistic production models.
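One common way to turn DPMO figures into a production model is a Poisson yield estimate; this is a standard assumption, not a formula taken from the slides, and the figures below are illustrative.

```python
import math

# Sketch: first-pass yield from DPMO using a Poisson defect model
# (a standard assumption, not a formula from the slides).
def first_pass_yield(opportunities, dpmo):
    expected_defects = opportunities * dpmo / 1e6
    return math.exp(-expected_defects)

# e.g. 2500 defect opportunities on the board at 120 DPMO
print(round(first_pass_yield(2500, 120), 3))  # -> 0.741
```

The sensitivity of this estimate to DPMO is why the next slides focus on extracting accurate, stratified DPMO figures from the traceability database rather than using averages.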

14 Case 2: DPMO estimation

Production model in a test line: AOI, ICT, FT.

15 Case 2: DPMO estimation

A basic analysis uses average numbers coming from iNEMI or the PPM-Monitoring.com web site. This does not reflect reality, but it is much better than considering all defects as equally probable!

The best approach is to use the traceability database to extract a table including parameters such as part number, shape, mounting technology, pitch, number of pins, function/class, and DPMO per category (MPSF). [Diagram: assembly machines, test, inspection and other machines, and repair stations all feeding the traceability database.]

16 Case 2: DPMO estimation

Define data collection methods around existing IPC standards:
- IPC-9261, In-Process DPMO and Estimated Yield.
- IPC-7912, Calculation of DPMO and Manufacturing Indices for PCBAs.

Define data stratification and classification methods, then combine the data into a single database:
- DPMO for Material (part number).
- DPMO for Placement (package type).
- DPMO for Soldering (reflow and wave).

This requires good cooperation between the test and quality services.
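Combining confirmed repair records into such a DPMO database might look like the following sketch, in the spirit of the IPC-9261/IPC-7912 definitions; the record fields and counts are illustrative.

```python
from collections import defaultdict

# Sketch: aggregating confirmed repair records into DPMO per
# (category, key). Record fields and counts are illustrative.
def dpmo_table(records, opportunities):
    """records: iterable of (category, key) per confirmed defect.
    opportunities: {(category, key): opportunity count}."""
    defects = defaultdict(int)
    for category, key in records:
        defects[(category, key)] += 1
    return {k: 1e6 * defects[k] / n for k, n in opportunities.items()}

records = [("Placement", "0402")] * 3 + [("Material", "R1234")]
opportunities = {("Placement", "0402"): 20000, ("Material", "R1234"): 5000}
table = dpmo_table(records, opportunities)
print(round(table[("Placement", "0402")]))  # -> 150
```

The key design point is that the denominator is opportunities, not boards: DPMO only compares fairly across packages and sites when each category has its own opportunity count.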

17 Case 2: DPMO estimation

- Range and standard deviation for any DPMO statistic.
- Compare actual yield to estimated yield, by test step and for the full test line.
- Correlation of test coverage/strategy to DPMO rates.

18 Case 2: DPMO estimation

During test coverage analysis, TestWay uses various algorithms to estimate the DPMO:
- Same part number.
- Same shape, for placement DPMO.
- Same pitch and number of pins.

With an accurate DPMO representation, it is possible to:
- Estimate the yield and the escape rate, two key factors used to select the best contract manufacturer or EMS (DPMO figures per EMS site).
- Identify the real overlap for test/inspection optimization. DfT becomes one of the principal contributors to cost reduction.
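The fallback idea listed above (same part number, then same shape, then same pitch and pin count) can be sketched as follows; the lookup tables, keys and figures are invented, and TestWay's actual algorithms are not reproduced here.

```python
# Sketch of the fallback idea above: estimate DPMO from the most
# specific attribute available. Tables and figures are invented;
# TestWay's actual algorithms are not reproduced here.
def estimate_dpmo(part, by_part_number, by_shape, by_pitch_pins,
                  default=100.0):
    if part["pn"] in by_part_number:       # 1) same part number
        return by_part_number[part["pn"]]
    if part["shape"] in by_shape:          # 2) same shape
        return by_shape[part["shape"]]
    key = (part["pitch"], part["pins"])    # 3) same pitch & pin count
    return by_pitch_pins.get(key, default)

part = {"pn": "X999", "shape": "QFP100", "pitch": 0.5, "pins": 100}
print(estimate_dpmo(part, {}, {"QFP100": 85.0}, {}))  # -> 85.0
```

Each fallback level trades specificity for sample size: a part number seen millions of times gives the best estimate, while a new part must borrow the statistics of its package family.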

19 Case 3: Test Repeatability

During the first production run, we selected a set of boards on which an SPC analysis was conducted in a GR&R context.

Gage R&R (Gage Repeatability and Reproducibility) is the amount of measurement variation introduced by a measurement system, which consists of the measuring instrument itself and the individuals using it. A Gage R&R study is a critical step in manufacturing Six Sigma projects, and it quantifies:
- Repeatability: variation from the measurement instrument.
- Reproducibility: variation from the individuals using the instrument.

20 Case 3: Test Repeatability

- Quality and traceability analysis helps to compute the classic Cp, Cpk and CmC.
- CmC means Calibration and Measurement Capability: CmC = Tolerance / (k·λ), with k = 6 for critical components.
- A test which is not repeatable cannot claim to qualify a component, so CmC is used to weight the Correctness coverage.
- In addition, the Failure Mode and Effects Analysis (FMEA) gives the criticality per component, which limits oversized test tolerances.
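A minimal sketch of the Cp/Cpk/CmC computation, assuming the λ in CmC = Tolerance / (k·λ) is the sample standard deviation; the sample values and limits are illustrative.

```python
import statistics

# Sketch: Cp, Cpk and CmC from repeated measurements, assuming
# lambda in CmC = Tolerance / (k * lambda) is the sample standard
# deviation. Sample values and limits are illustrative.
def capability(samples, lsl, usl, k=6):
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    cmc = (usl - lsl) / (k * sigma)  # numerically equals Cp when k = 6
    return cp, cpk, cmc

cp, cpk, cmc = capability([9.9, 10.0, 10.1, 10.0, 9.95, 10.05],
                          lsl=9.5, usl=10.5)
print(round(cp, 2))  # -> 2.36
```

Cpk falls below Cp as soon as the measurements are off-center, which is the repeatability signal that ends up de-weighting the Correctness coverage.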

21 Case 3: Test Repeatability

Passive measurements:
- Correct value: value is tested at 100%.
- Minor deviation: value is tested at 95%.
- Medium deviation: value is tested at 50%.
- Major deviation: value is not tested.
- Incorrect value: component is not tested.

For more accuracy, compare the CAD value and tolerances against the minimum and maximum tested values.
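The deviation bands above can be expressed as a coverage weight; the bands mirror the slide, but the percentage thresholds separating them are illustrative assumptions.

```python
# Sketch: weighting the Correctness coverage of a passive
# measurement by deviation band. The bands mirror the slide above;
# the percentage thresholds are illustrative assumptions.
def correctness_weight(deviation_pct):
    if deviation_pct == 0:
        return 1.00   # correct value: tested at 100%
    if deviation_pct <= 5:
        return 0.95   # minor deviation
    if deviation_pct <= 20:
        return 0.50   # medium deviation
    return 0.00       # major deviation / incorrect value: not tested

print(correctness_weight(3))  # -> 0.95
```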

22 Case 4: Real contribution of AOI/AXI

- AOI and AXI are inspection techniques that check for deviations.
- When a deviation is big enough, it becomes a defect.
- When a test line includes an inspection machine and an electrical tester (ICT, BST), it is difficult to agree on each step's test contribution. [Diagram: coverage split across AXI, BST and FT.]

23 Case 4: Real contribution of AOI/AXI

With a traceability system that collects defect/repair information in real time, we are able to record that a fault has been detected and how it has been diagnosed (i.e. root cause analysis).

We compare the defects that have been detected by ICT and FT against the defects detected by AOI in order to adjust the real coverage. [Diagram: AOI, ICT and FT results plus diagnosis/repair data feeding the database used for test coverage analysis.]
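Adjusting the real contribution per step from the traceability database can be sketched as a set comparison over confirmed defects; the defect ids and step names below are invented for illustration.

```python
# Sketch: measuring each step's real contribution by comparing
# confirmed defects per step from the traceability database.
# Defect ids and step names are invented for illustration.
def real_contribution(detected_by):
    """detected_by: {step: set of confirmed defect ids}."""
    all_defects = set().union(*detected_by.values())
    return {step: len(found) / len(all_defects)
            for step, found in detected_by.items()}

detected = {
    "AOI": {"d1", "d2", "d3"},
    "ICT": {"d2", "d4"},
    "FT":  {"d5"},
}
shares = real_contribution(detected)
print(round(shares["AOI"], 2))  # -> 0.6
```

Defects confirmed by several steps (like "d2" here) expose the overlap between inspection and electrical test, which is what the coverage adjustment is after.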

24 Zero-defect road

- Continual reassessment of capability metrics.
- Improved accuracy of quality estimations.
- Enhanced defect detection rate by increasing the understanding of test coverage.
- Reduced escape rate (fewer bad boards reaching the customer).
