
Statistical Methods in the Search for a Dominant Cause of Variation


Date

2023-12-15

Authors

Panahi, Mahsa

Publisher

University of Waterloo

Abstract

Excessive variation in critical-to-quality characteristics, referred to as process outputs in this thesis, is a common issue in manufacturing industries. Most variation reduction frameworks first investigate the process to identify the cause(s) of output variation and then seek a solution to eliminate the effect of the identified cause(s). However, among all causes, usually only a few contribute substantially to the overall variability; the literature refers to these as the dominant cause(s). Identifying the dominant cause(s) is an effective and recommended initial step in reducing output variation, but it is often not straightforward. An effective strategy is the method of elimination: start with a large number of suspect causes and, after each investigation, progressively eliminate groups of suspects, thereby homing in on the identity of the actual dominant cause(s). Once the dominant cause(s) is identified, verifying it before proceeding with corrective actions is crucial. Although identifying and verifying a dominant cause is the recommended first step in variation reduction projects, we believe some of the statistical tools employed for these purposes are not the most efficient and have not received a thorough scientific analysis. This thesis aims to bridge this gap by proposing study designs and analysis methods that retain valuable ideas from the existing literature but are better suited to the goal of identifying or verifying the dominant cause(s). Our objective is to contribute to the enrichment of the field of statistics in problem solving and variation reduction, which needs further development. The thesis is structured in an integrated format, comprising five chapters: the introduction, three papers, and the conclusion.

Chapter 1 is devoted to a literature review and provides background on several important variation reduction approaches, including the Taguchi method, Six Sigma, the Shainin System, and the Statistical Engineering algorithm. The main focus of this thesis is on the Statistical Engineering algorithm and the Shainin System.

Chapter 2 is devoted to a critical examination of group comparison, an investigation type often used in the method of elimination to help identify the dominant cause(s). In a group comparison, we select two groups of six or more parts, one group consisting of parts with large output characteristic values and the other of parts with small output characteristic values. For these selected parts, we measure as many input characteristics as possible that are still suspect dominant causes (and that can still be determined after the output has been observed). If an input is a dominant cause, its values must differ substantially between the two groups. The existing analysis procedures frame the group comparison investigation as a hypothesis test, which we demonstrate is unreliable and inefficient. Instead, we frame the question as an estimation problem based on maximum likelihood. A critical evaluation reveals that our proposed method is superior.
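As a rough illustration of the two framings discussed in Chapter 2, the following Python sketch compares one suspect input measured on a high-output and a low-output group. It is a minimal sketch, not the thesis's procedure: the normal mean-shift model, the simulated measurements, and the names x_high and x_low are assumptions made only for this example.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical measurements of one suspect input on the two selected groups
# of six parts each: one group with large output values, one with small.
x_high = rng.normal(10.4, 0.3, size=6)
x_low = rng.normal(10.0, 0.3, size=6)

# Existing framing: a two-sample hypothesis test on the input values.
t_stat, p_value = stats.ttest_ind(x_high, x_low, equal_var=True)
print(f"hypothesis-test framing: t = {t_stat:.2f}, p = {p_value:.3f}")

# Estimation framing (in the spirit of the thesis; details assumed): under a
# normal model with a common variance, the maximum likelihood estimates are
# the difference in group means and the pooled standard deviation, giving an
# estimated standardized shift rather than only a p-value.
delta_hat = x_high.mean() - x_low.mean()
sigma_hat = np.sqrt((((x_high - x_high.mean()) ** 2).sum()
                     + ((x_low - x_low.mean()) ** 2).sum())
                    / (x_high.size + x_low.size))
print(f"estimation framing: shift = {delta_hat:.2f}, "
      f"standardized shift = {delta_hat / sigma_hat:.2f}")

The intended contrast is that the estimation framing reports the magnitude of the input's apparent effect, which is what matters when judging whether a suspect could be a dominant cause, rather than only a verdict on a null hypothesis.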
Chapter 3 is devoted to a critical assessment of component swapping, another investigation type often used in the method of elimination. Component swapping is applicable when assembled products can be disassembled and reassembled without significant damage. It consists of a series of studies to determine whether the dominant cause(s) acts within the assembly process or within one or more of the components. The investigation selects two products, one with a high and one with a low output value, and then proceeds in two phases to identify the home of the dominant cause. Although the investigation plan is valuable, we demonstrate that the existing analysis procedures are unreliable. This chapter explores an improved plan and analysis procedure: we propose a reliable alternative analysis procedure based on maximum likelihood. The proposed procedure also addresses a critical gap in the existing literature by effectively alerting users to possible important interactions, either between the assembly and the components or among individual components.

Chapter 4 investigates how to verify a dominant cause effectively. All existing analysis procedures use only a randomized controlled experiment for the verification study. However, we demonstrate that experimental studies alone cannot provide all the required information; observational studies are also needed. This chapter lists several viable composite study designs, assesses their relative merits, and recommends appropriate sample sizes. It also investigates how to conduct a verification study systematically in the era of smart manufacturing.

In Chapter 5, we conclude and present some directions for future research.
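For the component swapping investigation of Chapter 3, the sketch below captures only the basic swap logic: if the high/low output pattern follows a swapped component, that component is the likely home of the dominant cause. The decision rule, the threshold, the numbers, and the helper classify_swap are hypothetical and do not reproduce the analysis procedures examined or proposed in the thesis.

def classify_swap(y_high_before, y_low_before, y_high_after, y_low_after):
    """Crude illustrative rule: how much of the original output gap between
    the two products moved when a single component was swapped?"""
    gap_before = y_high_before - y_low_before
    gap_after = y_high_after - y_low_after
    if gap_before == 0:
        return "no initial difference to explain"
    moved = 1 - gap_after / gap_before  # 0 = nothing moved, 2 = fully swapped
    if moved > 0.5:
        return "dominant cause likely resides in the swapped component"
    return "dominant cause likely resides elsewhere (other components or assembly)"

# Hypothetical outputs: before the swap the two products differ by 4 units;
# after swapping one component, the high/low pattern essentially reverses.
print(classify_swap(y_high_before=12.0, y_low_before=8.0,
                    y_high_after=8.5, y_low_after=11.5))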
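For the verification question of Chapter 4, the following sketch shows one way to see why an experiment alone cannot settle dominance: the experiment estimates how strongly the suspect cause moves the output, but only observation of regular production shows how much the cause actually varies there. The two-level design, the simulated data, and all variable names are assumptions for this illustration; the composite designs and sample sizes recommended in the thesis are not reproduced here.

import numpy as np

rng = np.random.default_rng(1)

# Experimental phase (assumed design): the suspect cause x is set to two
# levels in randomized order and the output y is measured; the fitted slope
# estimates the causal effect of x on y.
x_exp = rng.permutation([-1.0, 1.0] * 5)
y_exp = 2.0 * x_exp + rng.normal(0.0, 0.5, size=x_exp.size)
slope_hat = np.polyfit(x_exp, y_exp, deg=1)[0]

# Observational phase: x and y are simply recorded while the process runs
# under regular conditions, revealing how much x actually varies.
x_obs = rng.normal(0.0, 0.4, size=50)
y_obs = 2.0 * x_obs + rng.normal(0.0, 0.5, size=50)

# Combining the two phases: the share of output variance attributable to x
# needs both the causal effect (experiment) and the variation of x in
# regular production (observational study).
explained_share = (slope_hat * x_obs.std()) ** 2 / y_obs.var()
print(f"estimated causal effect (slope): {slope_hat:.2f}")
print(f"share of output variance attributed to x: {explained_share:.0%}")

In this toy example the suspect cause accounts for most of the output variance only because it both has a large effect and varies appreciably in regular production; the experiment alone establishes the first fact, the observational study the second.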

Description

Keywords

variation reduction, process improvement, statistical engineering, applied statistics, group comparison, component swapping, verification study

LC Keywords

Citation