Abstract
We introduce the conceptual and theoretical foundations of our prescriptive paradigm for robust decisions. Whereas decision-making is an event, executive decision management is a life cycle spanning a complex of five spaces: the Problem Space, the Solution Space, the Operations Space, the Performance Space, and the Commitment Space. Consistent with the prescriptive nature of our paradigm, we concentrate on actionable processes and procedures within each of these five spaces. The goal of our prescriptive paradigm is to enable the systematic design of robust decisions. The key sociotechnical processes are robust design synthesis, Design of Experiments (DOE) using gedanken experiments, Gage R&R, making uncertainty tractable with a spanning set of uncertainty regimes, and the process of representing system behavior phenomenologically.
References
Achterbergh, J., & Vriens, D. (2009). Organizations. Berlin: Springer.
Ackoff, R. L. (1974). Redesigning the future. New York: Wiley.
AIAG. (2002). Measurement systems analysis (MSA) reference manual (3rd ed.). Troy, MI: Automotive Industry Action Group.
Allen, P., Varga, L., & Stratern, M. (2010). Risk Management, 12, 9–30.
Almquist, E., & Wyner, G. (2001). Boost your marketing ROI with experimental design. Harvard Business Review, 79(October), 135–141.
Andersen, D. F., Richardson, G. P., & Vennix, J. A. (1997). Group model building: Adding more science to the craft. System Dynamics Review, 13(2), 187–201.
Antonsson, E. K., & Otto, K. N. (1995). Imprecision in engineering design. Journal of Vibration and Acoustics, 117(B), 25–32.
Arkes, H. R. (2001). Overconfidence in judgmental forecasting. In J. Scott Armstrong (Ed.), Principles of forecasting: A handbook for researchers and practitioners. Norwell: Kluwer.
Armstrong, J. S. (2001). Combining forecasts. In Principles of forecasting (pp. 417–439). Springer: New York.
Ashby, W. R. (1957). An introduction to cybernetics. London: Chapman & Hall.
Ashton, A. H. (1985). Does consensus imply accuracy in accounting studies of decision making? The Accounting Review, LX(2), 173–185.
Banks, A. P., & Millward, L. J. (2000). Running shared mental models as a distributed cognitive process. British Journal of Psychology, 91, 513–531.
Baron, J. (2000). Thinking and deciding (3rd ed.). Cambridge: Cambridge University Press.
Bar-Yam, Y. (2003). Complexity of military conflict: Multiscale complex systems analysis of littoral warfare. Report for Contract F30602-02-C-0158 for the Chief of Naval Operations Strategic Studies Group, John Q. Dickmann and William G. Glenney
Boje, D. M., & Murnighan, J. K. (1982). Group confidence pressures in iterative decisions. Management Science, 28(10), 1187–1196.
Box, G. E. P., Hunter, J. S., & Hunter, W. G. (2005). Statistics for experimenters: Design, innovation, and discovery (2nd ed.). Hoboken, NJ: Wiley.
Box, G. E., Hunter, W. G., & Hunter, J. S. (1978). Statistics for experimenters. Hoboken, NJ: Wiley.
Box, G. E. P., & Wilson, K. B. (1951). On the experimental attainment of optimum conditions. Journal of the Royal Statistical Society, Series B, 13(1), 1–45.
Box, J. F. (1978). R. A. Fisher: The life of a scientist. New York: Wiley.
Bredehoeft, J. (2005). The conceptualization model problem—surprise. Hydrogeology Journal, 13(1), 37–46.
Breyfogle, F. W. (2003). Implementing six sigma. Hoboken: Wiley.
Brodbeck, F. C., Kerschreiter, R., Mojzisch, A., & Schulz-Hardt, S. (2007). Group decision making under conditions of distributed knowledge: The information asymmetries model. Academy of Management Review, 32(2), 459–479.
Brown, J. R., & Fehige, Y. (2014). Thought experiments. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Fall 2014 ed.). http://plato.stanford.edu/archives/fall2014/entries/thought-experiment/
Staël von Holstein, C.-A. S. (2004). A tutorial in decision analysis. In R. A. Howard & J. E. Matheson (Eds.), The principles and applications of decision analysis (Vol. 1). Menlo Park, CA: Strategic Decisions Group.
Carlile, P. R. (2002). A pragmatic view of knowledge and boundaries: Boundary objects in new product development. Organization Science, 13(4), 442–455.
Carlile, P. R. (2004). Transferring, translating, and transforming: An integrative framework for managing knowledge across boundaries. Organization Science, 15(5), 555–568.
Carroll, J. S., Sterman, J., Marcus, A., & Johnson, E. J. (1998). Playing the maintenance game: How mental models drive organizational decisions. In J. J. Halpern & R. N. Stern (Eds.), Debating rationality: Nonrational aspects of organizational decision making. Ithaca: Cornell U Press.
Castro, J. (2007). Alignment in product development: How different disciplines successfully interact within an organizational environment. Also downloaded January 9, 2016, http://18.7.29.232/bitstream/handle/1721.1/83973/SRP_070315_JoaoCastro.pdf?sequence=1
Chi, M.T.H. (2006). Two approaches to the study of experts’ characteristics. In K. A. Ericsson, N. Charness, P. J. Feltovich, & R. R. Hoffman (Eds.), The cambridge handbook of expertise and performance (pp. 21–30). Cambridge: Cambridge Univ. Press.
Clausing, D. (1994). Total quality development. New York, NY: ASME Press.
Clemson, B., Tang, Y., Pyne, J., & Unal, R. (1995). Efficient methods for sensitivity analysis. System Dynamics Review, 11(1), 31–49.
Corner, J. L., & Corner, P. D. (1995). Characteristics of decisions in decision analysis practice. Journal of the Operational Research Society, 46, 304–314.
Creveling, C. M., Slutsky, J., & Antis, D. (2003). Design for Six Sigma in technology and product development. Upper Saddle River, NJ: Prentice Hall PTR.
Cummings, J. N. (2004). Work groups, structural diversity, and knowledge sharing in a global organization. Management Science, 50(3), 352–364.
Cummings, J. N., & Cross, R. (2003). Structural properties of work groups and their consequences for performance. Social Networks, 25, 197–210.
Dawes, R. M. (1979). The robust beauty of improper linear models in decision making. American Psychologist, 34(7), 571–582.
Dawes, R. M., & Mulford, M. (1996). The false consensus effect and overconfidence: Flaws in judgment or flaws in how we study judgment? Organizational Behavior and Human Decision Processes, 65(3), 201–211.
Dewey, J. (1933). How we think. Buffalo, NY: Prometheus Books.
Erden, Z., von Krogh, G., & Nonaka, I. (2008). The quality of group tacit knowledge. The Journal of Strategic Information Systems, 17(1), 4–18.
Ertmer, P. A., Stepich, D. A., York, C. S., Stickman, A., Wu, X., Zurek, S., & Goktas, Y. (2008). How instructional design experts use knowledge and experience to solve ill-structured problems. Performance Improvement Quarterly, 21(1), 17–42.
Feltovich, P. J., Prietula, M. J., & Ericsson, K. A. (2006). Studies of expertise from psychological perspectives. In Ericsson, Charness, Feltovich, & Hoffman (Eds.), The Cambridge handbook of expertise and performance (pp. 41–68). Cambridge: Cambridge University Press.
Fildes, R., Goodwin, P., Lawrence, M., & Nikolopoulos, K. (2009). Effective forecasting and judgmental adjustments: An empirical evaluation and strategies for improvement in supply-chain planning. International Journal of Forecasting, 25(1), 3–23.
Fischhoff, B. (1999). Debiasing. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristic and biases. Cambridge: Cambridge University Press.
Fischhoff, B., & Johnson, S. (1997). The possibility of distributed decision making. In Organizational decision making (pp. 216–237). Cambridge: Cambridge University Press.
Fisher, R. A. (1955). Statistical methods and scientific induction. Journal of the Royal Statistical Society. Series B (Methodological), 17(1), 69–78.
Fisher, R. A. (1966). Design of experiments (8th ed.). New York, NY: Hafner Publishing Company.
Fisher, R. A. (1971). The design of experiments. New York, NY: Hafner Publishing Company.
Foster, S. T., Jr., Wallin, C., & Ogden, J. (2011). Towards a better understanding of supply chain quality management practices. International Journal of Production Research, 49(8), 2285–2300.
Fowlkes, W. Y., & Creveling, C. M. (1995). Engineering methods for robust product design. Reading, MA: Addison-Wesley.
Frey, D. D., Engelhardt, F., & Greitzer, E. M. (2003). A role for “one-factor-at-a-time” experimentation in parameter design. Research in Engineering Design, 14, 65–74.
Frey, D. D., & Jugulum, R. (2003). How one-factor-at-a-time experimentation can lead to greater improvements than orthogonal arrays. In Proceedings of DETC'03, ASME 2003 design engineering technical conferences and computers and information in engineering conference, Chicago, IL, September 2–6, 2003.
Hanson, R. B. (1998). Consensus by identifying extremists. Theory and Decision, 44, 293–301.
Hibon, M., & Evgeniou, T. (2005). To combine or not to combine: Selecting among forecasts and their combinations. International Journal of Forecasting, 21(1), 15–24.
Hoffman, P. J., Slovic, P., & Rorer, L. G. (1986). An analysis of variance model for the assessment of configural cue utilization in clinical judgment. In H. R. Arkes & K. R. Hammond (Eds.), Judgment and decision making: An interdisciplinary reader (pp. 568–581). Cambridge: Cambridge University Press.
Hopp, W. (2014). Experiments in thought. Perspectives on Science, 22(2), 242–263.
Horwitz, W. (2003). The certainty of uncertainty. Journal of AOAC International, 86(1), 109–111.
Howard, R. A. (2004). An introduction to decision analysis. In R. A. Howard & J. E. Matheson (Eds.), The principles and applications of decision analysis (Vol. 1). Menlo Park, CA: Strategic Decisions Group.
Howard, R. A., & Matheson, J. E. (2004). The principles and applications of decision analysis (Vol. 1). Menlo Park, CA: Strategic Decisions Group.
Howard, R. A. (2007). The foundations of decision analysis revisited. In W. Edwards, R. F. Miles Jr., & D. von Winterfeldt (Eds.), Advances in decision analysis: From foundations to applications (pp. 32–56). Cambridge: Cambridge University Press.
Isenberg, D. J. (1988). How senior managers think. In D. E. Bell, H. Raiffa, & A. Tversky (Eds.), Decision making: Descriptive, normative, and prescriptive interactions. Cambridge: Cambridge University Press.
Jahanshahi, M., Najafpour, G., & Rahimnejad, M. (2008). Applying the Taguchi method for optimized fabrication of bovine serum albumin (BSA) nanoparticles as drug delivery vehicles. African Journal of Biotechnology, 7(4), 362–367.
Janis, I. (1992). Causes and consequences of defective policy-making: A new theoretical analysis. In F. Heller (Ed.), Decision-making and leadership. Cambridge: Cambridge University Press.
Jones, J., & Hunter, D. (1995). Consensus methods for medical and health services research. British Medical Journal, 311(7001), 376.
Kawakita, J. (1991). The original KJ method. Tokyo: Kawakita Research Institute.
Keeney, R. L. (1992). Value-focused thinking: A path to creative decision-making. Cambridge, MA: Harvard University Press.
Keeney, R. L. (1994). Using values in operations research. Operations Research, 42(5), 793–813.
Keeney, R. L. (1996). Value-focused thinking: Identifying decision opportunities and creating alternatives. European Journal of Operational Research, 92(3), 537–549.
Kerr, N. L., & Tindale, R. S. (2011). Group-based forecasting?: A social psychological analysis. International Journal of Forecasting, 27(1), 14–40.
Klein, G. (1999). Sources of power: How people make decisions. Cambridge, MA: MIT Press.
Klein, G. A. (2001). The fiction of optimization. In G. Gigerenzer & R. Selten (Eds.), Bounded rationality: The adaptive toolbox. Cambridge, MA: MIT Press.
Kolb, D. A. (1984). Experiential learning. Experience as the source of learning and development (Vol. 1). Englewood Cliffs, NJ: Prentice Hall.
Koriat, A., Lichtenstein, S., & Fischhoff, B. (1980). Reasons for confidence. Journal of Experimental Psychology: Human Learning and Memory, 6, 107–118.
Kotler, P. (1980). Marketing management (4th ed.). Upper Saddle River, NJ: Prentice Hall.
Kotler, P., & Keller, K. L. (2009). Marketing management (13th ed.). Upper Saddle River, NJ: Prentice Hall.
Kray, L. J., & Galinsky, A. D. (2003). The debiasing effect of counterfactual mind-sets: Increasing the search for disconfirmatory information in group decisions. Organizational Behavior and Human Decision Processes, 91, 69–81.
Kumar, P., Barua, P. B., & Gainhar, J. L. (2000). Quality optimization (Multi-characteristics) through Taguchi’s technique and utility concept. Quality and Reliability Engineering International, 16, 475–485.
Lempert, R. J. (2002). A new decision sciences for complex systems. Proceedings National Academy of Sciences, 99(3), 7309–7313.
Lerner, J. S., & Tetlock, P. E. (2003). Bridging individual, interpersonal, and institutional approaches to judgment and decision making: The impact of accountability on cognitive bias. In S. L. Schneider & J. Shanteau (Eds.), Emerging perspectives on judgment and decision research. Cambridge: Cambridge University Press.
Levine, D. M., Ramsey, P. P., & Smidt, R. K. (2001). Applied statistics for engineers and scientists: Using Microsoft Excel and Minitab. Upper Saddle River, NJ: Pearson.
Lipshitz, R., & Bar-Ilan, O. (1996). How problems are solved: Reconsidering the phase theorem. Organizational Behavior and Human Decision Processes, 65(1), 48–60.
Makridakis, S. (1989). Why combining works? International Journal of Forecasting, 5(4), 601–603.
Makridakis, S., & Winkler, R. L. (1983). Averages of forecasts: Some empirical results. Management Science, 29(9), 987–996.
Markman, K. D., Lindberg, M. J., Kray, L. J., & Galinsky, A. D. (2007). Implications of counterfactual structure for creative generation and analytical problem solving. Personality and Social Psychology Bulletin, 33(3), 312–324.
Mest, D. P., & Plummer, D. (2003). Analysts’ rationality and forecast bias: Evidence from Sales Forecasts. Review of Quantitative Finance and Accounting, 21, 103–122.
McLeod, S. A. (2013). Learning styles. Downloaded November 2017. http://cei.ust.hk/files/public/simplypsychology_kolb_learning_styles.pdf
Michelson, A. A., & Morley, E. W. (1887). On the relative motion of the Earth and the luminiferous ether. American Journal of Science, 34, 333–345.
Miller, G. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81–97.
Mohammed, S., & Dumville, B. C. (2001). Team mental models in a team knowledge framework: Expanding theory and measurement across disciplinary boundaries. Journal of Organizational Behavior, 22(2), 89–106.
Mohammed, S., Ferzandi, L., & Hamilton, K. (2010). Metaphor no more: A 15-year review of the team mental model construct. Journal of Management, 4(36), 876–910.
Mojzisch, A., & Schulz-Hardt, S. (2010). Knowing others’ preferences degrades the quality of group decisions. Journal of Personality and Social Psychology, 98(5), 794.
Montgomery, D. C. (2001). Design and analysis of experiments. Hoboken, NJ: Wiley.
Moon, J. A. (2013). Reflection in learning and professional development: Theory and practice. London: Routledge.
Murmann, J. P. (2003). Knowledge and competitive advantage: The coevolution of firms, technology, and national institutions. Cambridge: Cambridge University Press.
Nalbant, M., Gökkaya, H., & Sur, G. (2007). Application of Taguchi method in the optimization of cutting parameters for surface roughness in turning. Materials & Design, 28(4), 1379–1385.
Otto, K. N., & Wood, C. (2001). Product design: Techniques in reverse engineering and new product development. Boston: Prentice Hall.
Phadke, M. S. (1989). Quality engineering using robust design. Englewood Cliffs, NJ: Prentice Hall.
Phillips, L. D. (2007). Decision conferencing. In W. Edwards, R. F. Miles Jr., & D. von Winterfeldt (Eds.), Advances in decision analysis. Cambridge: Cambridge University Press.
Polanyi, M. (2009). The tacit dimension. Chicago, Ill: University of Chicago Press.
Regan-Cirincione, P. (1994). Improving the accuracy of group judgment: A process intervention combining group facilitation, social judgment analysis, and information technology. Organizational Behavior and Human Decision Processes, 58, 246–270.
Rentsch, J. R., Small, E. E., & Hanges, P. J. (2008). Cognitions in organizations and teams: What is the meaning of cognitive similarity? In D. B. Smith & D. Brent (Eds.), The people make the place: Dynamic linkages between individuals and organizations, LEA’s organization and management series (pp. 129–157). Mahwah, NJ: Lawrence Erlbaum.
Rogers, C. (2002). Defining reflection: Another look at John Dewey and reflective thinking. Teachers College Record, 104(4), 842–866.
Ropohl, G. (1999). Philosophy of socio-technical systems. Techné: Research in philosophy and technology, 4(3), 186–194.
Roth, A. E. (1995). Bargaining games. In J. H. Kagel & A. E. Roth (Eds.), Handbook of experimental economics. Princeton, NJ: Princeton University Press.
Rouwette, E. A., Vennix, J. A., & Mullekom, T. V. (2002). Group model building effectiveness: a review of assessment studies. System Dynamics Review, 18(1), 5–45.
Roy, R. K. (2001). Design of experiments using the Taguchi method. New York: John Wiley.
Russo, J. E., & Schoemaker, P. J. (1989). Decision traps: The ten barriers to brilliant decision-making and how to overcome them. New York: Fireside.
Russo, J. E., Schoemaker, P. J. (1992). Managing overconfidence. Sloan Management Review, Winter.
Schön, D. (1983). The reflective practitioner. New York: Basic Books.
Schulz-Hardt, S., Brodbeck, F. C., Mojzisch, A., Kerschreiter, R., & Frey, D. (2006). Group decision making in hidden profile situations: dissent as a facilitator for decision quality. Journal of Personality and Social Psychology, 91(6), 1080.
Shiba, S., & Walden, D. (2001). Four practical revolutions in management (2nd ed.). Portland, OR: Productivity Press.
Simon, H. A. (2000). Can there be a science of complex systems. In Y. Bar-Yam (Ed.), Unifying themes in complex systems: Proceedings from the International Conference on complex systems (Vol. 1). Westview Press.
Skånér, Y., Strender, L. E., & Bring, J. (1998). How do GPs use clinical information in their judgements of heart failure? A clinical judgement analysis study. Scandinavian Journal of Primary Health Care, 16(2), 95–100.
Smith, G. F. (1989). Defining managerial problems: A framework for prescriptive theorizing. Management Science, 35(8), 963–981.
Smith, D. W. (2013). Phenomenology. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Winter 2013 ed.). http://plato.stanford.edu/archives/win2013/entries/phenomenology/
Smith, S. D., Osborne, J. R., & Forde, M. C. (1995). Analysis of earth-moving systems using discrete-event simulation. Journal of Construction Engineering and Management, 121(4), 388–396.
Sorensen, R. A. (1992). Thought experiments. New York: Oxford University Press.
Sorrell, M., Komisar, R., & Mulcahy, A. (2010, March). How we do it: Three executives reflect on strategic decision making. McKinsey Quarterly.
Taguchi, G. (1987). System of experimental design. White Plains, NY: UNIPUB/Kraus International Publications.
Taguchi, G. (1991). Taguchi methods. Research and development (Vol. 1, p. 3). Dearborn, MI: American Supplier Institute.
Taguchi, G., Chowdhury, S., & Taguchi, S. (2000). Robust engineering. New York: McGraw-Hill.
Taguchi, G., & Clausing, D. (1990). Robust quality. Harvard Business Review, 68(January–February).
Thomke, S. H. (2001). Enlightened experimentation: The new imperative for innovation. Harvard Business Review, 79(February), 67–75.
Thomke, S. H. (2003a). Experimentation matters: Unlocking the potential of new technologies for innovation. Boston, MA: Harvard Business School Press.
Thomke, S. H. (2003b). R&D comes to services: Bank of America’s pathbreaking experiments. Harvard Business Review, April, 71–79.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.
Ulrich, K. (2003). KJ method. http://opim.wharton.upenn.edu/~ulrich/documents/ulrich-KJdiagrams.pdf. Downloaded September 7, 2015.
von Foerster, H. (1981). Observing systems. Seaside, CA: Intersystems Publications.
von Foerster, H., & Poerksen, B. (2002). Understanding systems. Heidelberg: Springer.
von Winterfeldt, D., & Edwards, W. (2007). Defining decision analytic structure. In W. Edwards, R. F. Miles Jr., & D. von Winterfeldt (Eds.), Advances in decision analysis. Cambridge: Cambridge University Press.
Vuchkov, I. N., & Boyadjieva, L. N. (2001). Quality improvement with design of experiments. Boston: Kluwer Academic Publishers.
Wallis, K. F. (2011). Combining forecasts–forty years later. Applied Financial Economics, 21(1–2), 33–41.
Wang, J. (2004). Assessing measurement system acceptability for process control and analysis using gage R&R study (Master's thesis). University of Wisconsin–Stout, Menomonie, WI.
Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16(4), 409–421.
Winquist, J. R., & Larson, J. R., Jr. (1998). Information pooling: When it impacts group decision making. Journal of Personality and Social Psychology, 74(2), 371–377.
Wright, G., & Rowe, G. (2011). Group-based judgmental forecasting: An integration of extant knowledge and the development of priorities for a new research agenda. International Journal of Forecasting, 27(1), 1–13.
Wu, C. F. J., & Hamada, M. (2000). Experiments: Planning, analysis, and parameter design optimization. Wiley series in probability and statistics. Hoboken, NJ: Wiley.
Wu, Y., & Wu, A. (2000). Taguchi methods for robust design. New York: American Society of Mechanical Engineers.
Yaniv, I. (2011). Group diversity and decision quality: Amplification and attenuation of the framing effect. International Journal of Forecasting, 27(1), 41–49.
Appendices
Appendix 3.1 Keeney’s Techniques for Identifying Objectives
The table below is taken directly from Keeney's (1996) article on this subject. It is not a recipe for finding the objectives of a decision problem, but an approach for exploring the thinking of the decision maker.
Type of Objective | Questions |
---|---|
Wish list | • What do you want? What do you value? • What should you want? |
Alternatives | • What is the perfect alternative, a terrible alternative, some reasonable alternative? • What is good about each? |
Problems and shortcomings | • What is right or wrong with your organization? • What needs fixing? |
Consequences | • What has occurred that was good or bad? What might occur that you care about? |
Different perspectives | • What are your aspirations? • What limitations are placed upon you? |
Strategic objectives | • What are your ultimate objectives? • What are your values that are absolutely fundamental? |
Generic objectives | • What objectives do you have for your customers, employees, your shareholders, yourself? • What environmental, social, economic, or health and safety objectives are important? |
Structuring objectives | • Follow means-ends relationships: why is that objective important, how can you achieve it? • Use specification: what do you mean by this objective? |
Quantifying objectives | • How would you measure achievement of this objective? • Why is objective A three times as important as objective B? |
Appendix 3.2 Smith’s Approach to Conceptualizing Objectives
The table below extends Smith's (1989) article on the conceptualization of objectives. All eight conceptualizations are different types of "gaps." To show what we mean, we restate his examples as "gap statements." Discovering corporate gaps is where we begin our field experiments and executive interviews. Simultaneously, we try to learn as much as possible about the conditions and historical situations that led to these identified gaps. From this we distill the corporate objectives we want to study. The background of the gap then becomes what we call "the decision situation," which supplies the context of the corporate problem and the objectives senior executives want to achieve. This is one way to frame a decision situation.
Example | Description | Conceptualization | Gap Statement |
---|---|---|---|
“Sales are $150,000 under budget.” | Comparing existing and desired states | Gap Specification | Same |
“It’s tough competing, given our limited experience in this market.” | Identifying factors inhibiting goal achievement | Difficulties and Constraints | “The differences between our experience and what is required are ...” |
“We need to convince management that this is a profitable market.” | Stating the final ends served by a solution | Ultimate Values and Preferences | “We need to show +x% more profitability to our management.” |
“This year’s sales target of $5.2 million must be met.” | Identifying the particular goal state to be achieved | Goal State Specification | “Current sales are $x M, a shortfall of $Y M from target of $5.2 M.” |
“We have to hire more salespeople.” | Specifying how a solution might be achieved | Means and Strategies | “We are short of +xx sales people.” |
“The real problem is our ineffective promotional material.” | Identify the cause(s) of the problematic state | Causal diagnosis | “Our promotional material is ineffective in the following areas because ....” |
“Our product is 6 years old; our competitors out-spend us on advertising; etc.” | State facts and beliefs pertinent to the problem | Knowledge specification | “Our product is 6 years old; competitors out-spend us on advertising by x% per y unit sales ...; etc.” |
“Since the market isn’t growing, we’re in a zero-sum game with our competition.” | Adopting an appropriate point-of-view on the situation | Perspective | “We need to gain share of x% from our competitors ...” |
Appendix 3.3 Eight Analytic Structures
von Winterfeldt and Edwards (2007) specify eight mathematical structures for modeling system behavior in order to predict and analyze the variables that influence the outputs. These approaches are not limited to mathematical structures; they are also effective in qualitative analyses. Our descriptions that follow are presented in this spirit.
Means-Ends Networks
This process can start at any level of a problem or opportunity, say level n. To build the means-ends chain, ask the question "why?": why is this objective important? Itemize the reasons, and you have the n − 1 level of the network. Next, from level n, ask the question "how?": how will this objective be accomplished? Itemize the answers, and you have the n + 1 level of the network. Proceed iteratively, up or down or both, until you have found the appropriate level at which to address the opportunity or problem. Clearly, the process can produce very complex networks.
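The why/how structure above can be sketched as a simple data structure. This is a minimal illustration, not part of the original method; the objectives and helper functions (`means`, `ends`) are hypothetical examples of our own.

```python
# A minimal sketch of a means-ends network with hypothetical objectives.
# Asking "why?" moves up a level (toward ends); "how?" moves down (toward means).
network = {
    "increase market share": {
        "why": ["grow long-term profitability"],          # level n - 1 (ends)
        "how": ["improve product quality", "cut price"],  # level n + 1 (means)
    },
    "improve product quality": {
        "why": ["increase market share"],
        "how": ["adopt robust design methods"],
    },
}

def ends(objective):
    """Answers to: why is this objective important?"""
    return network.get(objective, {}).get("why", [])

def means(objective):
    """Answers to: how will this objective be accomplished?"""
    return network.get(objective, {}).get("how", [])
```

Iterating `ends` or `means` from any starting objective walks the network up or down one level at a time, exactly as the questioning procedure prescribes.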
Objectives Hierarchies
Objectives hierarchies are simple two-column tables. In the left-hand column, list your itemized objectives. In the right-hand column, for each objective, list the measures to achieve it. For example, for the objective "improve customers' service economics," the right-hand column can show "reduce consulting fees" or "provide the first 50 hours of consulting for free." Complete the table and you have an objectives hierarchy.
Consequence Tables
Consequence tables are also two-column tables. In the left-hand column list the fundamental objectives, and in the right-hand column specify the measures. (This is almost identical to an objectives hierarchy.) Complete the table in this manner and you have a consequence table.
Decision Trees
Decision trees begin with a decision node, N0, normally depicted by a square. Emanating from N0 are links identifying the alternative decisions, say d1, d2, and d3, that can be made. Each of these links terminates in a chance node, normally depicted by a circle. From each chance node, links emanate to the potential outcomes, each with an associated probability and payoff. Suppose that from d1 we have two links, to outcomes o11 and o12; from d2 we have outcomes o21, o22, and o23; and from d3 we have outcomes o31 and o32. The expected value of a chance node is the probability-weighted sum of its payoffs; the contribution of outcome o32, for example, is the product of its probability and its payoff. This is a schematic description of a three-layer decision tree. A decision tree becomes very bushy when it has many levels.
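The expected-value calculation for the three-layer tree described above can be sketched in a few lines. The probabilities and payoffs below are hypothetical, chosen only to make the arithmetic concrete.

```python
# A minimal sketch of the three-layer decision tree described in the text.
# Each decision links to a chance node: a list of (probability, payoff) pairs.
# All numbers are hypothetical, for illustration only.
tree = {
    "d1": [(0.6, 100), (0.4, -20)],           # outcomes o11, o12
    "d2": [(0.5, 80), (0.3, 40), (0.2, -10)], # outcomes o21, o22, o23
    "d3": [(0.7, 60), (0.3, 10)],             # outcomes o31, o32
}

def expected_value(chance_node):
    """Expected value at a chance node: sum of probability * payoff."""
    return sum(p * payoff for p, payoff in chance_node)

# At the decision node N0, choose the alternative with the highest expected value.
values = {d: expected_value(node) for d, node in tree.items()}
best = max(values, key=values.get)
```

With these numbers, d1 yields 52, d2 yields 50, and d3 yields 45, so the decision node selects d1. Deeper trees apply the same fold-back recursively from the leaves to N0.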
Influence Diagrams
Influence diagrams are the invention of Howard (2004), who coined the term "decision analysis." An influence diagram is a graphical representation of the decision in question. The diagram uses the following elements: decision nodes as rectangles; outcomes and their values as octagons or rectangles with rounded corners; and functional arrows to show the variable nodes on which values depend. Clearly, a functional arrow must exist between a decision (rectangle) and its outcomes (octagon or rounded-corner rectangle). Using these geometric elements, a network representing the causal relationships of a decision can be drawn.
Event Trees
Event trees are built from the "bottom up." Starting from an initiating event, its consequences are identified step-wise, feeding forward and successively branching out as in the decision-tree approach. Event trees are often used to determine the probabilities of failures and other undesirable events.
Fault Trees
Fault trees take the opposite, so-called "top-down" approach. The idea is to start with a fault, which can be understood as an engineering failure or a seriously deleterious sociotechnical outcome, and identify the reasons leading to it. Reasons can be combined conjunctively (AND) or disjunctively (OR). The process proceeds iteratively downward using the same logic.
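The AND/OR logic of a fault tree lends itself to a small recursive evaluator. The sketch below is our own illustration, not from the source text; the example events, their probabilities, and the independence assumption are all hypothetical.

```python
# A minimal fault-tree sketch. Gates combine causes conjunctively ("AND")
# or disjunctively ("OR"). Basic events are assumed independent, and all
# probabilities are hypothetical.
def evaluate(node, probs):
    """Return the probability of the event represented by `node`."""
    if isinstance(node, str):                # basic event: look up its probability
        return probs[node]
    gate, children = node
    child_ps = [evaluate(child, probs) for child in children]
    if gate == "AND":                        # all causes must occur
        result = 1.0
        for p in child_ps:
            result *= p
        return result
    if gate == "OR":                         # any single cause suffices
        none_occur = 1.0
        for p in child_ps:
            none_occur *= (1.0 - p)
        return 1.0 - none_occur
    raise ValueError(f"unknown gate: {gate}")

# Top event: the system fails if power fails OR both redundant valves fail.
fault_tree = ("OR", ["power failure", ("AND", ["valve A fails", "valve B fails"])])
probs = {"power failure": 0.01, "valve A fails": 0.1, "valve B fails": 0.1}
top_event_probability = evaluate(fault_tree, probs)
```

Here the AND gate gives 0.1 × 0.1 = 0.01 for the valve pair, and the OR gate combines it with the power failure to give roughly 0.0199 for the top event.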
Belief Networks
A belief network is a directed acyclic graph with an associated set of probabilities. The graph consists of nodes and links: the nodes represent variables, and the links represent causal relationships between variables. In general, our belief in a statement or hypothesis S depends on some prior related knowledge K. Our belief in S, given what we know about K, forms a belief function P(S|K). Bayes' theorem gives us a way to determine the value of this expression; thus the associated probabilities and prior knowledge give us a way to reason about uncertainty. Modeling a problem this way involves many linked nodes, forming a Bayesian belief network.
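The Bayesian updating behind P(S|K) can be made concrete with a two-node example. This is a minimal sketch of our own; the hypothesis, evidence, and all probability values are hypothetical.

```python
# A minimal two-node illustration of the Bayesian updating used in a belief
# network: revise belief in hypothesis S after observing knowledge/evidence K.
# All probabilities are hypothetical.
def posterior(p_s, p_k_given_s, p_k_given_not_s):
    """Bayes' theorem: P(S|K) = P(K|S) P(S) / P(K)."""
    # Total probability of the evidence K over both cases (S true, S false).
    p_k = p_k_given_s * p_s + p_k_given_not_s * (1.0 - p_s)
    return p_k_given_s * p_s / p_k

# Prior belief in S is 0.2; the evidence K is far more likely when S holds.
belief = posterior(p_s=0.2, p_k_given_s=0.9, p_k_given_not_s=0.1)
```

Observing evidence that is nine times likelier under S raises the belief from the prior 0.2 to 0.18/0.26, roughly 0.69. A full belief network chains many such updates over its linked nodes.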
Appendix 3.4 Debiasing Logic
This debiasing procedure is from Lerner and Tetlock (2003).
Appendix 3.5 Examples of Engineering Applications Using DOE
Engineering problems | Reference |
---|---|
Chemical vapor deposition process | Phadke (1989) |
Tuning computing systems | Phadke (1989) |
Design of accelerometer | Antonsson and Otto (1995) |
Paper feeder w/o misfeeds and multifeeds | Clausing (1994) |
Waste water treatment plant | Clemson et al. (1995) |
Camera zoom shutter design | Fowlkes and Creveling (1995) |
Capstan roller printer | Fowlkes and Creveling (1995) |
Numerically controlled machine | Wu and Wu (2000) |
V-process casting Al-7%Si alloy | Kumar et al. (2000) |
Development of a filter circuit | Wu and Wu (2000) |
Gold plating process | Wu and Wu (2000) |
Optimization of inter-cooler | Taguchi et al. (2000) |
Replenisher dispenser | Taguchi et al. (2000) |
Warfare receiver system | Taguchi et al. (2000) |
Body panel thickness variation | Roy (2001) |
Tensile strength of air bag | Roy (2001) |
Electrostatic powder coating | Roy (2001) |
Chemical reaction experiment | Wu and Hamada (2000) |
Task efficiency | Wu and Hamada (2000) |
Injection molding shrinkage | Wu and Hamada (2000) |
Carbon electrodes study | Frey et al. (2003), Frey and Jugulum (2003) |
Clutch case study | Frey et al. (2003), Frey and Jugulum (2003) |
Medical serum | Nalbant et al. (2007) |
Multicycle chemical process | Montgomery (2001) |
Yield of chemical process | Montgomery (2001) |
Impeller machine for jet turbines | Montgomery (2001) |
Medical serum | Jahanshahi et al. (2008) |
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this chapter
Tang, V., Otto, K., Seering, W. (2018). Operations: Foundations and Processes. In: Executive Decision Synthesis. Contributions to Management Science. Springer, Cham. https://doi.org/10.1007/978-3-319-63026-7_3
Print ISBN: 978-3-319-63024-3
Online ISBN: 978-3-319-63026-7