INFORMS Transactions on Education
Vol. 10, No. 2, January 2010, pp. 90–94
ISSN 1532-0545 | DOI 10.1287/ited.1090.0041tn-a
© 2010 INFORMS
Additional information, including supplemental material and rights and permission policies, is available at http://ite.pubs.informs.org.
Teaching Note
The MotoTech Manufacturing Company:
Process Control and Improvement
Prakash Mirchandani
Katz Graduate School of Business, University of Pittsburgh, 358 Mervis Hall, Pittsburgh, Pennsylvania 15260,
[email protected]
Distribution: To maintain the integrity and usefulness of cases published in ITE, distribution of these teaching notes to any other party is prohibited. Please refer interested instructors to ITE for access to the teaching notes.
Key words: process control charts; X-bar chart; capability indices; six sigma
History: Received: June 2009; accepted: November 2009.
Part A
Should John Tagole have "waited out the storm," hoping that Semicon will relax its new, more stringent, specifications? Would this strategy have worked in the short term? In the long term?
Business environments are becoming increasingly competitive and global in every industry. Even
in industries typically thought of as being high technology, such as electronics, medical diagnostics, and
computers, in which U.S. companies had an advantage, the center of production is shifting to China and
other Far-Eastern nations. Semicon, therefore, is likely
facing competitive pressures of its own to improve
its product offerings. It is therefore highly improbable that Semicon would be willing or able to relax its
proposed more stringent requirements.
Even if we assume that Semicon’s threat is only
partially warranted at this stage, the future may not
bode well for MotoTech Manufacturing (MM) unless
it is proactive in responding to Semicon’s concerns.
Semicon might give preference to suppliers with more
advanced processes when it introduces its next generation of products. Given that electronics products
have a short technological life cycle, it might not be
long before this happens.
In any case, it makes sense for John Tagole to investigate and find out exactly what Semicon is looking
for in its proposed specifications and what changes
are needed at MM for meeting those specifications.
Even if the situation does not warrant immediate process changes, John Tagole's proactive approach will help develop readiness plans for use whenever the need arises.
Part B
John Tagole provides your group, The Famous Five, with
the data set in the worksheet “Pre_Improvement.” The data
in the worksheet were taken from the diffusion process.
Samples of three wafers were randomly selected from each
batch of 200 wafers, and the thickness of the silicon dioxide
deposition was measured on the wafers. The data are from
72 batches.
Answer the following questions for this data set.
1. For this data set, construct the X-bar and range control charts. Attach a printout showing your X-bar and
range control charts.
See the attached Excel file, MotoTech (PCI) Solution.xls, for the control charts.
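For instructors who prefer a scripting tool to the Excel workbook, the following is a minimal Python sketch of how the range-based control limits could be computed. The input file name and wafer column names are illustrative assumptions; only the worksheet name "Pre_Improvement" and the subgroup size of three come from the case.

import pandas as pd

# Standard tabulated control-chart constants for subgroups of size n = 3.
A2, D3, D4 = 1.023, 0.0, 2.574

# Hypothetical file and column names; the case workbook may differ.
data = pd.read_excel("MotoTech.xls", sheet_name="Pre_Improvement")
wafers = data[["Wafer1", "Wafer2", "Wafer3"]]

xbar = wafers.mean(axis=1)                      # sample mean of each batch
rng = wafers.max(axis=1) - wafers.min(axis=1)   # sample range of each batch

xbar_bar, r_bar = xbar.mean(), rng.mean()

# X-bar chart limits are built from the mean range.
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
# Range chart limits.
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

print(f"X-bar chart: LCL={lcl_x:.1f}, CL={xbar_bar:.1f}, UCL={ucl_x:.1f}")
print(f"R chart:     LCL={lcl_r:.1f}, CL={r_bar:.1f}, UCL={ucl_r:.1f}")

Note that the X-bar limits depend on the mean range, which is the point made in Part B.4 below: if the range chart is out of control, those limits are not meaningful.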
2. Use the range chart to determine if the process variation is in control.
The process variation is not in control. In particular, the sample ranges for samples 8 and 40 exceed the upper control limit of the range chart. Because we have already determined that the range chart is out of control, it is not necessary to check whether the sample data violate any of the other rules that may indicate that the process variation is out of control.
3. Use the X-bar chart to determine if the process mean
is in control.
The process mean is out of control because several
sample means fall outside of the control limits of the
X-bar chart. (For example, observations 5, 27, 40, and 41 are above the upper control limit, and observations 32, 33, 61, 64, 70, and 72 are below the lower control limit.) However, see the answer to Part B.4 below.
4. If the process variation is found to be out of control in
Part B2, would the control limits of the X-bar chart have
been valid? Why or why not?
If the process variation is out of control, the control
limits of the X-bar chart do not have any meaning.
This is because the control limits of the X-bar chart
depend on the mean range value. The much higher
value of the range for sample 40 skews the computation of the mean range value and, thus, the upper
and lower control limits of the X-bar chart. Therefore,
when the range chart is out of control, as is the case
here, the X-bar chart should not be constructed.
Part C
After observing the control charts in Part B, The Famous
Five investigate the reasons for the identifiable causes of
variation. They find that the process went out of control
during times when the plant air conditioning system was
shut down for preventive maintenance. They recommend
that a back-up air conditioner be installed and the temperature in the diffusion room be maintained at 60°F. This
recommended temperature setting is based on The Famous
Five’s general experience, although local atmospheric conditions and raw-material composition can also potentially
affect the recommended temperature. Until the new air
conditioner can be installed, The Famous Five recommend
that the preventive maintenance be carried out on weekend
nights, when the diffusion process is stopped. They ask John
Tagole to collect data for an additional 72 batches under
these controlled conditions. These data are enclosed in the
worksheet “Post_Improvement.”
1. For this data set, construct the X-bar and range
charts. Attach a printout showing your control charts.
See the attached Excel file, MotoTech (PCI) Solution.xls, for the control charts.
2. Use the range chart to determine whether the process
variation is in control.
If we check whether or not the sample range values
are within the upper and lower control limits of the
range control chart, we do not see any evidence of the
process being out of control. However, several other
rules can indicate that the process is likely to be out of
control. These rules all look for a pattern of some sort,
and the basic underlying idea is that each of these patterns has roughly the same, low probability of occurring if the process is in control.
Because there are many such rules, I only ask
my students to check three others (beyond the basic
one that checks for the sample statistic falling outside of the upper and lower control limits). The
remaining rules, which I do not ask the students to check, require that the region between the upper control limit and the center line be split into three zones; similarly, the region between the lower control limit and the center line also needs to be split.
The three rules that I do ask the students to check are as follows:
(a) Nine consecutive points that are all above or all below the center line on either the range or the X-bar chart;¹
(b) Six consecutively increasing or decreasing observations on either the range or the X-bar chart; and
(c) Fourteen consecutive points that alternately increase and decrease.
¹ If the process is centered, the probability of nine consecutive points falling on the same side of the center line is 2 × (1/2)⁹ = 0.0039. This is about the same as the probability of an observation falling outside the upper or lower control chart limits (0.0027).
Using these three rules, the process variation is in control.
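As a rough sketch of how these three run rules could be checked programmatically (the function names are illustrative, not from any package), one could pass in the plotted statistics, i.e., the sample means for the X-bar chart or the sample ranges for the range chart, together with the chart's center line:

from typing import Sequence

def nine_on_one_side(values: Sequence[float], center: float) -> bool:
    """Rule (a): nine consecutive points all above or all below the center line."""
    signs = [v > center for v in values]
    return any(len(set(signs[i:i + 9])) == 1 for i in range(len(signs) - 8))

def six_trending(values: Sequence[float]) -> bool:
    """Rule (b): six consecutively increasing or decreasing observations."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return any(all(d > 0 for d in diffs[i:i + 5]) or
               all(d < 0 for d in diffs[i:i + 5]) for i in range(len(diffs) - 4))

def fourteen_alternating(values: Sequence[float]) -> bool:
    """Rule (c): fourteen consecutive points that alternately increase and decrease."""
    diffs = [b - a for a, b in zip(values, values[1:])]
    return any(all(diffs[i + j] * diffs[i + j + 1] < 0 for j in range(12))
               for i in range(len(diffs) - 12))

If any of the three functions returns True (or any point falls beyond a control limit under the basic rule), the chart would be flagged as out of control.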
3. Use the X-bar chart to determine whether the process
mean is in control.
Using the same rules as above, the process mean is in control.
Sometimes, I change the data so that one of the sample means or one of the range values falls close to a control limit (either just inside or just outside it). I want students to think through and recognize that there is a 0.0027 chance that a sample mean
will fall outside of the control limits, even when the
process is in control. If there are several points that
are, for instance, close to, but do not fall beyond, the
control limits, none of the rules discussed above for
identifying an out-of-control process may be violated.
Yet, the process may be out of control. Therefore, the
idea is that students should use judgment along with
the statistics in determining whether a process is in
control or out of control.
Part D
Based on your analysis in Parts B and C, what would you
recommend?
The analysis in Parts B and C seems to indicate that the temperature in the diffusion room is an important factor in keeping the process under control. Therefore, a first (short-term) recommendation is to keep the diffusion room temperature at 60°F. This can be done by scheduling preventive maintenance on weekends or by upgrading the air conditioning system. MM should implement tighter temperature controls so that the identifiable (special cause) variation associated with temperature does not cause quality problems.
There are some other issues that The Famous Five need to consider. They should check whether the differences seen between Parts B and C were indeed
attributable to stabilizing the air conditioning system rather than to some other factor (such as raw-material supplies or a change in one of the manufacturing stages that precedes the diffusion stage). The case does not mention any of these factors. However, it is important to determine that none of the other factors changed between Parts B and C before a cause-and-effect relationship between temperature and diffusion quality can be ascertained. Some students immediately jump to conclusions about cause-and-effect relationships with the data provided, not recognizing that there may be a confounding factor affecting the results.
Another point that students are expected to make is that, even if the relationship between temperature and diffusion quality holds and the process is in control as in Part C, MM will have to make further improvements. That is, even when the process is in control, its output need not meet the specifications: a stable process is not necessarily capable. For this reason, once the process is brought under control, one should check the capability of the process. One can do so by determining the proportion of output falling outside the specification limits, by computing the Cpk for the process, or by computing the sigma level of the process. We compute these metrics in Part E.
Finally, is 60°F the best temperature? Does 58°F or 62°F produce better output than 60°F? What other factors affect quality, and what are the best settings for these other factors? Some statistics courses cover design of experiments. A discussion of how to find the best temperature and its interaction with other factors is a good lead-in to design of experiments. A companion case (MotoTech Manufacturing Company: Design of Experiments/ANOVA) covers that issue.
Part E
In Part C, we found that the process is in control when we set the temperature in the diffusion room to 60°F. The Famous Five construct a histogram of the "Post_Improvement" observations and conclude that the distribution is normal.² Answer the following questions for the data used in Part C.
² If checking for normality has been covered, the instructor can ask the students to fit a distribution to the process output data. Although Excel does not do so, commercial packages such as SPSS and SAS will construct a P-P (probability-probability) plot or perform statistical tests (such as the χ² test, the Kolmogorov-Smirnov test, or the Anderson-Darling test). A visual check can be done in Excel by constructing a histogram.
1. What is the cumulative probability for a single wafer
to have a thickness of 3000 angstroms?
The process mean is 3061.3 angstroms, and the process standard deviation is 38.3 angstroms. Therefore,
the cumulative probability for a single wafer having a thickness of 3000 angstroms is 0.0550. (Please see the "Parts E and F" worksheet of the Excel solution file.)
Students sometimes make a mistake in computing the standard deviation to be used for computing this probability. They incorrectly use the standard
deviation of the 72 sample mean values (which is
an estimate of the standard error) in computing the
cumulative probability.
This problem helps students learn that an estimate of the standard deviation of the process is the standard deviation of the entire output of the process, that is, all 216 values. This part also serves to review the concept of cumulative probability. The students may have seen cumulative probability earlier, but such a review is useful because the concept is central to simulation, which they will see in subsequent courses.
2. What is the percentage of defectives being produced
under the current setup?
A wafer does not meet the specifications if its thickness is less than 2900 angstroms or more than 3100
angstroms. The probability of this happening is 0.156.
(Please see the “Parts E and F” worksheet of the Excel
solution file.)
Some students will find the proportion of nonconforming wafers being produced by the current process (by counting the number of observations in the sample that fall outside of the 2900–3100 interval and dividing this count by 216). This proportion is an estimate of the probability that we want; however, we can obtain a better estimate of the probability if we first fit the data to a distribution. We are told that the process output is normal; we can compute the mean and the standard deviation of this distribution to estimate the probability we want.
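The two estimates just discussed can be put side by side in a short sketch. Here `thicknesses` stands in for the 216 post-improvement observations; the variable and function names are illustrative.

import numpy as np
from scipy.stats import norm

LSL, USL = 2900.0, 3100.0

def defect_estimates(thicknesses: np.ndarray) -> tuple[float, float]:
    # Empirical estimate: fraction of observations outside the spec limits.
    empirical = np.mean((thicknesses < LSL) | (thicknesses > USL))
    # Normal-fit estimate: tail probabilities of the fitted distribution.
    mu, sigma = thicknesses.mean(), thicknesses.std(ddof=1)
    fitted = norm.cdf(LSL, mu, sigma) + norm.sf(USL, mu, sigma)
    return empirical, fitted

# With mean 3061.3 and standard deviation 38.3, the fitted estimate is about 0.156.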
3. Compute the process capability index, Cp , for this situation. Is the process capable of meeting the customer’s
requirements? Is Cp the appropriate metric for measuring
process capability in this case? If so, why? If not, why not?
What other metric would you suggest using?
If LSL denotes the lower specification limit, USL denotes the upper specification limit, and Process Std. Dev. denotes the process standard deviation, then Cp equals (USL − LSL)/(6 × Process Std. Dev.), which in this particular case is (3100 − 2900)/(6 × 38.3) = 0.87.
Because the process is not centered, Cp may overestimate the process capability, and a better capability index to use when a process is off-center, as in this situation, is Cpk. If LSL denotes the lower specification limit, USL denotes the upper specification limit, and Process Mean and Process Std. Dev. denote the process mean and process standard deviation, then
Cpk = min{(Process Mean − LSL)/(3 × Process Std. Dev.), (USL − Process Mean)/(3 × Process Std. Dev.)}.
Because we do not know the true process mean or the
true process standard deviation, we use our estimates
of these two parameters, and for our data, Cpk is 0.337.
(When the process is not centered, Cpk is always lower
than Cp .) A generally accepted value of Cpk for the corresponding process to be considered capable is 4/3.
Some people (and some industries) might consider a
value of Cpk as low as 1 to be capable; however, a Cpk
of 0.337 does not indicate a capable process by any
standards.
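As a minimal sketch of the two capability calculations above (the numbers are simply the estimates reported in this note):

def cp(usl: float, lsl: float, sigma: float) -> float:
    # Cp compares the specification width with the natural process spread.
    return (usl - lsl) / (6 * sigma)

def cpk(usl: float, lsl: float, mean: float, sigma: float) -> float:
    # Cpk uses the distance from the mean to the nearer specification limit.
    return min((mean - lsl) / (3 * sigma), (usl - mean) / (3 * sigma))

print(cp(3100, 2900, 38.3))          # about 0.87
print(cpk(3100, 2900, 3061.3, 38.3)) # about 0.337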
4. Is this a six sigma process? If not, what is its sigma
level? Assuming that the process mean shifts by at most
1.5 sigma in either direction of the target, compute the
approximate number of defectives out of a million. (Hint:
In Motorola’s experience, process mean shifts of up to 1.5
sigma can go undetected, and that is why they use this
assumption to calculate the proportion of defectives.)
Sigma level equals (USL − LSL)/(2 × Process Std. Dev.) = (3100 − 2900)/(2 × 38.3) = 2.609. Therefore, this is not a six sigma process. To compute the number of defectives, 1.5 sigma should be subtracted from the sigma level. This is because, in the original computation of the proportion of defectives at a given sigma level, Motorola assumed that shifts in the process mean of up to 1.5 sigma are difficult to detect. So the process mean may shift by up to 1.5 sigma without our realizing it. This gives us 2.609 − 1.5 = 1.109. Therefore, the probability of a defective is 0.1337, or about 133,700 defectives out of one million. As expected, this figure is much higher than the six sigma level of 3.4 defectives out of 1 million.
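A short sketch of the sigma-level and defects-per-million calculation, assuming scipy for the normal tail probability and the 1.5-sigma shift convention described above:

from scipy.stats import norm

USL, LSL, sigma = 3100.0, 2900.0, 38.3

sigma_level = (USL - LSL) / (2 * sigma)   # about 2.61
shifted = sigma_level - 1.5               # allow for an undetected 1.5-sigma mean shift
p_defect = norm.sf(shifted)               # one-sided tail probability, about 0.13
# Small differences from the 133,700 figure in the text are due to rounding.
print(f"sigma level = {sigma_level:.3f}, defects per million = {p_defect * 1e6:,.0f}")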
Part F
1. Should The Famous Five recommend that MM lease the new equipment?³ Why or why not? (In the computation of the expected costs, you can ignore the time value of money because some of you may not yet know how to incorporate it. You can also ignore inflation.)
³ To find the probabilities, one can use the normal tables or Excel's NORMDIST function.
From the analysis in the "Parts E and F" worksheet of the Excel solution file, we see that in each of the five years that we have considered in our planning horizon, the rework and reject cost savings associated with the new equipment more than compensate for its leasing expense.⁴ The difference in contribution, after accounting for the cost of the new equipment, starts at about $3.8 million and increases to about $24 million as demand increases. Therefore, regardless of the discount rate, from an economic perspective, it makes sense to lease
the new equipment. (If the students are familiar with present value analysis, they can be given a discount rate and asked to compute the net present value.) There may be other reasons for leasing the new equipment, such as the positive impact that a more consistent product can have on market perception, revenue, and profitability.
⁴ This analysis assumes that each year MM's total production quantity equals the expected demand for that year. Thus, if any wafers are rejected because of poor quality, then the quantity sold is lower than the demand.
If time is available, I discuss whether it is appropriate to use the rework and reject costs as described in Part F of the case. The cost structure as depicted is convex: the further we are from the target value (of 3000 angstroms), the greater the rework cost. In principle, this form is similar to that of the quadratic Taguchi loss function, which reflects the fact that a customer's level of dissatisfaction increases the further the product characteristics are from their target values. In this case, the rework cost for the manufacturer increases the further the product is from the target value.
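A toy sketch of this quadratic loss idea follows; the loss constant k and the function name are illustrative and not taken from the case data.

def quadratic_loss(thickness: float, target: float = 3000.0, k: float = 1.0) -> float:
    # Loss grows with the square of the deviation from the target thickness.
    return k * (thickness - target) ** 2

# Doubling the deviation from the target quadruples the loss:
assert quadratic_loss(3100.0) == 4 * quadratic_loss(3050.0)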
2. What else would you recommend for the future?
Whether an investment is made in the new equipment or not, by better regulating the air-conditioning
temperature, MM has brought the diffusion process
under control. MM should continue to undertake continuous improvement activities at the diffusion stage.
It should carry out controlled experimentation on other factors that might affect quality, such as raw-material specifications and atmospheric conditions (for example, humidity and barometric pressure). The variation in quality may also be attributable to other stages of production (such as plasma etching, ion implantation, and chemical deposition). An evaluation of these stages can lead to further improvements in output quality. The production stages that follow diffusion, including final packaging and transportation, should also be studied to
ensure that quality is not adversely affected at these
stages. Finally, MM should also keep a close watch on
alternative semiconductor technologies that are being
developed, as these might affect both acceptable quality levels and process investment decisions.
Comments on Assigning the Questions
Instructors may note that some of the later parts of Case 1 implicitly "give away" the solution to previous parts. (Part C implies that the data in Part B would indicate that the process is out of control, and Part E mentions that the Part C data satisfy the normality assumption.) I believe that immediate feedback
(partial) about work done correctly, or a hint that the
students have made a mistake in a previous analysis, can actually aid learning and boost student confidence. Of course, other instructors may not agree with
this assessment and may decide to split up the case
questions into multiple assignments. If they decide to
do so, then a suggested split is Parts A and B, followed by Parts C and D, and then Parts E and F. (In
Case 2, none of the later parts give any hints to a
previous part, and so this situation does not arise.)
Second, I often change the data somewhat from year to year to prevent information transfer among the different classes. I also sometimes do this to highlight other conceptual issues. For example, I
might change the data so that one of the sample
means or sample ranges lies close to (but not beyond)
the control chart limits, and then see if students
recognize the managerial impact of observing such a
data point. Also, cost figures can be changed from
year to year (it is easy to check, using the Excel solution spreadsheet, whether or not the decision to invest
in new equipment changes with a change in costs).
Supplementary Material
Files that accompany this paper can be found and
downloaded from http://ite.pubs.informs.org.