ISBN-13: 9781119422754 / English / Hardcover / 2018 / 368 pages
Preface xix
1 What Is Forensic Systems Engineering? 1
1.1 Systems and Systems Engineering 1
1.2 Forensic Systems Engineering 2
References 4
2 Contracts, Specifications, and Standards 7
2.1 General 7
2.2 The Contract 9
2.2.1 Considerations 9
2.2.2 Contract Review 10
2.3 Specifications 12
2.4 Standards 14
Credits 16
References 16
3 Management Systems 17
3.1 Management Standards 18
3.1.1 Operations and Good Business Practices 18
3.1.2 Attributes of Management Standards 18
3.2 Effective Management Systems 19
3.2.1 Malcolm Baldrige 19
3.2.2 Total Quality Management 20
3.2.3 Six Sigma 20
3.2.4 Lean 21
3.2.5 Production Part Approval Process 22
3.3 Conformance and Performance 23
3.4 Addendum 23
Credits 24
References 24
4 Performance Management: ISO 9001 25
4.1 Background of ISO 9000 26
4.1.1 ISO 9001 in the United States 27
4.1.2 Structure of ISO 9000:2005 27
4.1.3 The Process Approach 28
4.2 Form and Substance 32
4.2.1 Reference Performance Standards 33
4.2.2 Forensics and the Paper Trail 34
Credits 35
References 35
5 The Materiality of Operations 37
5.1 Rationale for Financial Metrics 38
5.1.1 Sarbanes-Oxley 38
5.1.1.1 Title III: Corporate Responsibility 38
5.1.1.2 Title IV: Enhanced Financial Disclosures 39
5.1.2 Internal Control 39
5.1.3 The Materiality of Quality 41
5.2 Mapping Operations to Finance 41
5.2.1 The Liability of Quality 43
5.2.2 The Forensic View 44
Credits 44
References 44
6 Process Liability 47
6.1 Theory of Process Liability 48
6.1.1 Operations and Process Liability 50
6.1.2 Process Liability and Misfeasance 51
6.2 Process Liability and the Law 52
Credits 52
References 52
7 Forensic Analysis of Process Liability 55
7.1 Improper Manufacturing Operations 57
7.1.1 Verification and Validation 57
7.1.1.1 Nonstandard Design Procedures 57
7.1.1.2 Unverified or Unvalidated Design 58
7.1.1.3 Tests Waived by Management 58
7.1.1.4 Altered Test Procedures and Results 58
7.1.2 Resource Management 59
7.1.2.1 Unmonitored Outsourcing 59
7.1.2.2 Substandard Purchased Parts 60
7.1.2.3 Ghost Inventory 60
7.1.2.4 Ineffective Flow Down 61
7.1.3 Process Management 61
7.1.3.1 Forced Production 61
7.1.3.2 Abuse and Threats by Management 62
7.2 Management Responsibility 62
7.2.1 Effective Internal Controls 62
7.2.2 Business Standards of Care 63
7.2.3 Liability Risk Management 64
7.2.4 Employee Empowerment 65
7.2.5 Effective Management Review 65
7.2.6 Closed-Loop Processes 66
References 67
8 Legal Trends to Process Liability 71
8.1 An Idea Whose Time Has Come 71
8.2 Some Court Actions Thus Far 72
8.2.1 QMS Certified Organizations 73
8.2.2 QMS Noncertified Organizations 74
References 75
9 Process Stability and Capability 77
9.1 Process Stability 77
9.1.1 Stability and Stationarity 78
9.1.2 Stability Conditions 79
9.1.3 Stable Processes 80
9.1.4 Measuring Process Stability 82
9.2 Process Capability 83
9.2.1 Measuring Capability 83
9.2.2 A Limit of Process Capability 85
9.3 The Rare Event 85
9.3.1 Instability and the Rare Event 85
9.3.2 Identifying the Rare Event 86
9.4 Attribute Testing 87
References 88
10 Forensic Issues in Product Reliability 91
10.1 Background in Product Reliability 91
10.2 Legal Issues in the Design of Reliability 94
10.2.1 Good Design Practices 95
10.2.2 Design Is Intrinsic to Manufacturing and Service 95
10.2.3 Intended Use 95
10.2.4 Paper Trail of Evidence 96
10.2.5 Reliability Is an Implied Design Requirement 97
10.3 Legal Issues in Measuring Reliability 97
10.3.1 Failure Modes 97
10.3.2 Estimation of MTTF 98
10.3.3 The More Failure Data the Better 99
10.3.4 The Paper Trail of Reliability Measurement 99
10.4 Legal Issues in Testing for Reliability 100
10.4.1 Defined and Documented Life Test Procedures 100
10.4.2 Life Test Records and Reports 101
10.4.3 Test Procedures 101
10.5 When Product Reliability Is Not in the Contract 102
10.5.1 Product Liability 102
10.5.2 ISO 9001 and FAR 103
10.6 Warranty and Reliability 104
References 105
11 Forensic View of Internal Control 107
11.1 Internal Controls 108
11.1.1 Purpose of Control 108
11.1.2 Control Defined 109
11.1.3 Control Elements in Operations 109
11.2 Control Stability 110
11.2.1 Model of a Continuous System 111
11.2.2 Transfer Functions 112
11.3 Implementing Controls 115
11.4 Control of Operations 117
11.4.1 Proportional (Gain) Control 118
11.4.2 Controlling the Effect of Change 119
11.4.2.1 Integral Control 120
11.4.2.2 Derivative (Rate) Control 121
11.4.3 Responsibility, Authority, and Accountability 121
References 123
12 Case Study: Madelena Airframes Corporation 125
12.1 Background of the Case 126
12.2 Problem Description 127
12.2.1 MAC Policies and Procedures (Missile Production) 127
12.2.2 Missile Test 127
12.3 Examining the Evidence 128
12.3.1 Evidence: The Players 129
12.3.2 Evidence: E-mails 129
12.4 Depositions 132
12.4.1 Deposition of the General Manager 132
12.4.2 Deposition of the Senior Test Engineer 132
12.4.3 Deposition of the Production Manager 132
12.4.4 Deposition of the Chief Design Engineer 133
12.4.5 Deposition of the Test Programs Manager 133
12.5 Problem Analysis 133
12.5.1 Review of the Evidence 133
12.5.2 Nonconformities 134
12.5.2.1 Clause 7.3.1(b) Design and Development Planning 134
12.5.2.2 Clause 7.3.5 Design and Development Verification 135
12.5.2.3 Clause 7.3.6 Design and Development Validation 135
12.5.2.4 Clause 8.1 General Test Requirements 135
12.5.2.5 Clause 8.2.4 Monitoring and Measurement of Product 135
12.5.2.6 Clause 4.1 General QMS Requirements 135
12.5.2.7 Clause 5.6.1 General Management Review Requirements 135
12.6 Arriving at the Truth 136
12.7 Damages 137
12.7.1 Synthesis of Damages 137
12.7.2 Costs of Correction 137
References 138
13 Examining Serially Dependent Processes 139
13.1 Serial Dependence: Causal Correlation 140
13.2 Properties of Serial Dependence 142
13.2.1 Work Station Definition 142
13.2.2 Assumptions 142
13.2.2.1 Assumption 1 143
13.2.2.2 Assumption 2 143
13.2.2.3 Assumption 3 143
13.2.3 Development of the Conditional Distribution 144
13.2.4 Process Stability 145
13.3 Serial Dependence: Noncausal Correlation 147
13.4 Forensic Systems Analysis 147
Credits 148
References 148
14 Measuring Operations 149
14.1 ISO 9000 as Internal Controls 151
14.2 QMS Characteristics 152
14.3 The QMS Forensic Model 154
14.3.1 Estimating Control Risk 155
14.3.2 Cost of Liability 156
14.4 The Forensic Lab and Operations 157
14.5 Conclusions 158
Credits 159
References 159
15 Stability Analysis of Dysfunctional Processes 161
15.1 Special Terms 162
15.1.1 Dysfunction 162
15.1.2 Common and Special Causes 163
15.1.3 Disturbances and Interventions 163
15.1.4 Cause and Effect 163
15.2 Literature Review 165
15.3 Question Before the Law 168
15.4 Process Stability 169
15.4.1 Internal Control 170
15.4.2 Mathematical Model for Correlation 170
15.5 Conclusions 173
Credits 174
References 174
16 Verification and Validation 179
16.1 Cause and Effect 180
16.1.1 An Historical View 180
16.1.2 Productivity versus Quality 182
16.2 What Is in a Name? 185
16.2.1 Verification and Validation Defined 186
16.2.2 Inspection and Test 187
16.2.3 Monitor and Measure 188
16.2.4 Subtle Transitions 189
16.3 The Forensic View of Measurement 190
16.3.1 Machine Tools and Tooling 190
16.3.2 Measurement 191
16.3.3 Control Charting 192
16.3.4 First Pass Yield 192
16.3.5 First Article Inspection 193
16.3.6 Tool Try 194
References 194
17 Forensic Sampling of Internal Controls 197
17.1 Populations 198
17.1.1 Sample Population 199
17.1.2 Homogeneity 199
17.1.3 Population Size 200
17.1.4 One Hundred Percent Inspection 201
17.2 Sampling Plan 201
17.2.1 Objectives 201
17.2.2 Statistical and Nonstatistical Sampling 202
17.2.3 Fixed Size and Stop-or-Go 203
17.2.4 Sample Selection and Size 204
17.3 Attribute Sampling 204
17.3.1 Internal Control Sampling 204
17.3.2 Deviation Rates 206
17.3.2.1 Acceptable Deviation Rate 206
17.3.2.2 System Deviation Rate 207
17.3.3 Sampling Risks 207
17.3.3.1 Control Risk 207
17.3.3.2 Alpha and Beta Risks 208
17.3.4 Confidence Level 208
17.3.5 Evaluation 209
17.4 Forensic System Caveats 209
References 210
18 Forensic Analysis of Supplier Control 211
18.1 Outsourcing 213
18.2 Supply Chain Management 215
18.3 Forensic Analysis of Supply Systems 216
18.3.1 Basic Principles of Supplier Control 216
18.3.2 The Forensic Challenge 216
18.3.2.1 Ensure that Purchased Units Conform to Contracted Specifications 217
18.3.2.2 Assessment of the Supplier Process 218
18.3.2.3 Tracking 218
18.3.2.4 Customer Relations 219
18.3.2.5 Verification and Storage of Supplies 221
18.3.2.6 Identification and Traceability 222
18.4 Supplier Verification: A Case Study 223
18.4.1 Manufacture 224
18.4.2 V50 Testing 224
18.4.3 V50 Test Results 226
18.5 Malfeasant Supply Systems 226
References 227
19 Discovering System Nonconformity 229
19.1 Identifying Nonconformities 231
19.1.1 Reporting Nonconformities 232
19.1.2 Disputes 233
19.2 The Elements of Assessment 234
19.2.1 Measures of Performance 234
19.2.2 Considerations in Forensic Analysis of Systems 235
19.3 Forming Decisions 236
19.4 Describing Nonconformities 238
19.5 A Forensic View of Documented Information 240
19.5.1 Requirements in Documented Information 241
19.5.2 The Quality Manual 241
19.5.3 Documented Information Control 243
19.5.4 Records 244
Acknowledgment 246
References 246
Appendix A The Engineering Design Process: A Descriptive View 247
A.1 Design and Development 248
A.1.1 The Design Process 248
A.1.2 Customer Requirements 249
A.1.3 Interactive Design 249
A.1.4 Intermediate Testing 249
A.1.5 Final Iteration 251
A.2 Forensic Analysis of the Design Process 252
References 253
Appendix B Introduction to Product Reliability 255
B.1 Reliability Characteristics 256
B.1.1 Reliability Metrics 256
B.1.2 Visual Life Cycle 257
B.2 Weibull Analysis 259
B.2.1 Distributions 259
B.2.2 Shape and Scale 260
B.2.2.1 Shape 260
B.2.2.2 Scale 262
B.2.3 The B-Percentile 262
B.3 Design for Reliability 263
B.4 Measuring Reliability 265
B.4.1 On Reliability Metrics 265
B.4.2 Graphing Failure Data 266
B.5 Testing for Reliability 269
References 271
Appendix C Brief Review of Probability and Statistics 273
C.1 Measures of Location 274
C.1.1 Average: The Mean Value 274
C.1.2 Average: The Median 275
C.1.3 Average: The Mode 275
C.2 Measures of Dispersion 276
C.2.1 Variance 276
C.2.2 Range 276
C.3 Distributions 277
C.3.1 Continuous Distributions 277
C.3.2 Discrete Distributions 279
C.4 Tests of Hypotheses 281
C.4.1 Estimating Parametric Change 281
C.4.2 Confidence Level 284
C.5 Ordered Statistics 284
References 285
Appendix D Sampling of Internal Control Systems 287
D.1 Populations 288
D.1.1 Sample Populations 289
D.1.2 Population Size 290
D.1.3 Homogeneity 290
D.2 Attribute Sampling 291
D.2.1 Acceptable Deviation Rate 292
D.2.2 System Deviation Rate 293
D.2.3 Controls 293
D.3 Sampling Risks 294
D.3.1 Control Risk 294
D.3.2 Consumer and Producer Risks 294
D.3.3 Alpha and Beta Errors 295
D.4 Sampling Analysis 297
D.4.1 Statistical Inference 297
D.4.2 Sample Distributions 298
D.4.3 Sample Size 299
D.4.4 Estimating the SDR 299
D.4.5 Confidence Interval 300
References 302
Appendix E Statistical Sampling Plans 305
E.1 Fixed-Size Attribute Sampling Plan 306
E.1.1 Determine the Objectives 306
E.1.2 Define Attribute and Deviation Conditions 306
E.1.2.1 Acceptable Deviation Rate 306
E.1.2.2 System Deviation Rate 307
E.1.3 Define the Population 307
E.1.4 Determine the Method of Sample Selection 307
E.1.5 Determine the Sample Size 308
E.1.6 Perform the Sampling Plan 312
E.1.7 Evaluate Sample Results 312
E.2 Stop-or-Go Sampling 313
E.2.1 Acceptable Deviation Rate 313
E.2.2 Sample Size 314
E.2.3 Evaluation 316
E.3 One Hundred Percent Inspection 316
E.4 Application: An Attribute Sampling Plan 317
References 318
Appendix F Nonstatistical Sampling Plans 321
F.1 Sampling Format 322
F.1.1 Frame of the Sampling Plan 322
F.1.2 Attribute and Deviation Conditions 323
F.1.3 The Population 323
F.1.4 Nonstatistical Sample Selection 324
F.1.5 Sample Size 325
F.1.6 The Effect of Sample Size on Beta Error 326
F.1.7 Evaluating Sample Results 327
F.2 Nonstatistical Estimations 327
References 328
Index
William A. Stimson, PhD, is an independent consultant in systems engineering and an expert witness for the Department of Justice and private law firms in the evaluation of contractor performance. He has taken an active role in developing legal strategy for evaluating the performance of operations in litigation, has presented on the topic of forensic evaluation, and has published peer-reviewed papers on dysfunctional processes.
A systems-level approach to reducing liability through process improvement
Forensic Systems Engineering: Evaluating Operations by Discovery presents a systematic framework for uncovering and resolving problematic process failures. Carefully building the causal relationship from process to product, the discussion lays out in significant detail the appropriate and tactical approaches necessary to the pursuit of litigation with respect to corporate operations.
Systemic process failures are addressed by flipping process improvement models to study both improvement and failure, resulting in arguments and methodologies relevant to any product or service industry. Guidance on risk analysis of operations combines evaluation of process control, stability, capability, verification, validation, specification, product reliability, serial dependence, and more, providing a robust framework with which to target large-scale nonconforming products and services.
This book is relevant to anyone involved in business, manufacturing, service, and control.
The global economy has created an environment in which huge production volume, complex databases, and multiple dispersed suppliers greatly challenge industrial operations. This informative guide provides a practical blueprint for uncovering problematic process failures.