
Software architecture evaluation methods based on cost benefit analysis and quantitative decision making


Abstract

Since many parts of the architecture evaluation steps of the Cost Benefit Analysis Method (CBAM) depend on the stakeholders’ empirical knowledge and intuition, it is essential that such an architecture evaluation method faithfully reflect the knowledge of the experts when determining an Architectural Strategy (AS). However, because CBAM requires the stakeholders to reach a consensus or vote in order to collect data for decision making, it is difficult to reflect the stakeholders’ knowledge accurately in the process. To overcome this limitation of CBAM, we propose two new CBAM-based methods for software architecture evaluation, which adopt the Analytic Hierarchy Process (AHP) and the Analytic Network Process (ANP), respectively. Because AHP and ANP rely on pair-wise comparison, they are well suited to a cost-benefit analysis technique whose purpose is not to calculate exact values of benefit and cost but to select the AS with the highest return on investment. To this end, we first define a generic CBAM process and derive variations from it by applying AHP and ANP, yielding what we call the CBAM+AHP and CBAM+ANP methods. These new methods not only reflect the knowledge of experts more accurately but also reduce misjudgments. A case study comparing CBAM and the two new methods is conducted on an industry software project. Because the cost-benefit analysis process we present is generic, new cost-benefit analysis techniques with capabilities and characteristics different from the three methods examined here can be derived by adopting other constituent techniques.


Notes

  1. CBAM does not provide a detailed method for reducing the number of scenarios. Choosing the top one-third or the top 50% is not mandatory, and in some cases all collected scenarios may be used.

  2. According to T. L. Saaty (Saaty 1999), if numerical judgments were taken at random from the scale 1/9, 1/8, 1/7, ..., 1/2, 1, 2, ..., 9, random matrices of each order would have a characteristic average consistency index, called the random index (RI); the Appendix uses the value 1.24 for order 6. The commonly cited RI values are sketched below.
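For reference, the following minimal sketch lists the RI values commonly quoted for Saaty's scale. These figures are an assumption on our part and may differ slightly from the exact table in Saaty (1999); only the value 1.24 for order 6 is confirmed by the Appendix.

```python
# Commonly quoted random index (RI) values for comparison matrices of order 1-10.
# These are assumed standard values; the exact table in Saaty (1999) may differ slightly.
RANDOM_INDEX = {1: 0.00, 2: 0.00, 3: 0.58, 4: 0.90, 5: 1.12,
                6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

print(RANDOM_INDEX[6])  # 1.24, the value used for the 6x6 matrix in the Appendix
```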

References

  • Al-Naeem T, Gorton I, Rabhi MA, Benatallah B (2005) A quality-driven systematic approach for architecting distributed software applications. Proc. Int. Conf. Software Engineering (8th ICSE), St. Louis, USA, pp 244–253

  • Bratthall L, Wohlin C (2002) Is it possible to decorate graphical software design and architecture models with qualitative information?—an experiment. IEEE Trans Softw Eng 28(12):1181–1193


  • Clements P, Kazman R, Klein M (2002) Evaluating software architectures: methods and case studies. Addison-Wesley, Boston, MA


  • ExpertChoice (2007) Expert Choice 11.5™. Available at http://www.expertchoice.com

  • Kazman R, Asundi J, Klein M (2002) Making architecture design decisions: an economic approach (CMU/SEI-2002-TR-035, ESC-TR-2002-035). Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA


  • Kazman R, In H, Chen H (2004) From requirements negotiation to software architecture decisions. Proc. Int. Conf. Software Engineering Research, Management and Applications (SERA), Los Angeles, CA, USA

  • Kim CK, Lee DH, Ko IY, Baik JM (2007) A lightweight value-based software architecture evaluation. Proc ACIS Int Conf Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD) 2:646–649

  • Lee JH, Kang SW, Cho JH, Kim JS (2006) Architecture evaluation utilizing CBAM and AHP. Korea Inf Process Soc Trans 13-D(5):683–690


  • Liu B, Fan Y (2007) A new performance evaluation model and AHP-based analysis method in service-oriented workflow. Proc. Int. Conf. Grid and Cooperative Computing (GCC), 685–692

  • Nord RL, Barbacci MR, Clements P, Kazman R, Klein M, O’Brien L et al (2003) Integrating the architecture tradeoff analysis method (ATAM) with the cost benefit analysis method (CBAM) (CMU/SEI-2003-TN-038). Software Engineering Institute, Carnegie Mellon University, Pittsburgh, PA


  • Raisinghani M, Meade SL, Schkade LL (2007) Strategic e-business decision analysis using the analytic network process. IEEE Trans Eng Manage 54(4):673–686


  • Saaty TL (1980) The analytic hierarchy process. McGraw-Hill, New York


  • Saaty TL (1990) Multicriteria decision making: the analytic hierarchy process, I, AHP series. RWS, Pittsburgh, PA


  • Saaty TL (1999) Decision making for leaders. RWS, Pittsburgh, PA


  • Saaty TL (2005) Theory and applications of the analytic network process. RWS, Pittsburgh, PA


  • Saaty TL, Vargas LG (1982) The logic of priorities. Kluwer-Nijhoff, London


  • Saaty TL, Vargas LG (1994) Decision making in economic, political, social and technological environments with the analytic hierarchy process. RWS, Pittsburgh, PA


  • SuperDecision (2007) Creative Decisions Foundation, SuperDecision. Available at http://www.superdecisions.com

  • Svahnberg M (2004) An industrial study on building consensus around software architectures and quality attributes. Inf Softw Technol 46(12):805–818


  • Svahnberg M, Wohlin C, Lundberg L (2003) A quality-driven decision-support method for identifying software architecture candidates. Int J Softw Eng Knowl Eng 13(5):547–573. doi:10.1142/S0218194003001421


  • Wallin P, Froberg J, Axelsson J (2007) Making decisions in integration of automotive software and electronics. Proc. Int. Workshop on Software Engineering for Automotive Systems (ICSE Workshops SEAS ‘07)

  • Yi JN, Meng WD, Ma WM, Du JJ (2005) Assess model of network security based on analytic network process. Proc Int Conf Mach Learn Cybern 1:27–32 doi:10.1109/ICMLC.2005.1526914


  • Zhu L, Aurum A, Gorton I, Jeffery R (2005) Tradeoff and sensitivity analysis in software architecture evaluation using analytic hierarchy process. Softw Qual J 13:357–375 doi:10.1007/s11219-005-4251-0



Acknowledgements

This work was partially supported by Defense Acquisition Program Administration and Agency for Defense Development under contract (2008-SW-14-IJ-02) and by the Industrial Technology Development Program funded by the Ministry of Commerce, Industry and Energy (MOCIE, Korea).

The authors are thankful to Jae-ha Song of NHN Corp., Gu-yul Noh of Samsung SDS, and Jong-gul Park of Tmaxsoft, for participating in the empirical study conducted for this research.

Author information


Corresponding author

Correspondence to Jihyun Lee.

Additional information

Editor: Kenichi Matsumoto, Ph.D.

Appendix

In decision making it is important to know how consistent the judgments are. In this appendix we show how to compute the inconsistency ratio from the pair-wise comparison results (Table 9).

Table 9 Results of pair-wise comparison conducted by domain experts

From the pair-wise comparison results of the domain expert we obtain the normalized matrix by dividing each entry by its column total, and then compute the row sums and the relative overall priorities (i.e., the weights) (see Table 10).

Table 10 Normalized matrix, row sums, and weights
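As an illustration, the following Python sketch reproduces this step. Because Table 9 is not reproduced here, the comparison matrix is reconstructed from the six bracketed column vectors of the weighted-sum expression shown below (the j-th vector becomes column j), with the 0.33 entries read as 1/3; the variable names are ours.

```python
# Pair-wise comparison matrix reconstructed from the weighted-sum expression below:
# the j-th bracketed vector in that expression is taken as column j of A.
F = 1.0 / 3.0  # entries printed as 0.33 are read as 1/3
A = [
    [1.00, 4.00, 3.00, 1.00, 2.00, 2.00],
    [0.25, 1.00, 1.00, 0.25, 0.50, F   ],
    [F,    1.00, 1.00, 0.50, 0.50, 0.50],
    [1.00, 4.00, 2.00, 1.00, 2.00, 1.00],
    [0.50, 2.00, 2.00, 0.50, 1.00, 1.00],
    [0.50, 3.00, 2.00, 1.00, 1.00, 1.00],
]
n = len(A)

# Normalize each entry by its column total (the normalized matrix of Table 10).
col_totals = [sum(A[i][j] for i in range(n)) for j in range(n)]
normalized = [[A[i][j] / col_totals[j] for j in range(n)] for i in range(n)]

# Each scenario's weight is the average of its normalized row (row sum / n).
weights = [sum(row) / n for row in normalized]
print([round(w, 2) for w in weights])
# -> [0.28, 0.07, 0.09, 0.24, 0.15, 0.18], matching Table 10 up to rounding
```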

We then compute a weighted sum by multiplying each scenario’s weight by its vector of pair-wise comparison values with the remaining scenarios (the corresponding row of Table 9) and adding the resulting vectors:

$$0.281 \times {\left[ {\begin{array}{*{20}c} {1} \\ {{0.25}} \\ {{0.33}} \\ {1} \\ {{0.5}} \\ {{0.5}} \\ \end{array} } \right]} + 0.069 \times {\left[ {\begin{array}{*{20}c} {4} \\ {1} \\ {1} \\ {4} \\ {2} \\ {3} \\ \end{array} } \right]} + 0.087 \times {\left[ {\begin{array}{*{20}c} {3} \\ {1} \\ {1} \\ {2} \\ {2} \\ {2} \\ \end{array} } \right]} + 0.237 \times {\left[ {\begin{array}{*{20}c} {1} \\ {{0.25}} \\ {{0.5}} \\ {1} \\ {{0.5}} \\ {1} \\ \end{array} } \right]} + 0.148 \times {\left[ {\begin{array}{*{20}c} {2} \\ {{0.5}} \\ {{0.5}} \\ {2} \\ {1} \\ {1} \\ \end{array} } \right]} + 0.178 \times {\left[ {\begin{array}{*{20}c} {2} \\ {{0.33}} \\ {{0.5}} \\ {1} \\ {1} \\ {1} \\ \end{array} } \right]}$$

Next, we obtain the vector λ by dividing this result element-wise by the weight vector:

$$\lambda = {\left[ {\begin{array}{*{20}c} {{6.074733}} \\ {{6.070048}} \\ {{6.105364}} \\ {{6.084388}} \\ {{6.060811}} \\ {{6.092697}} \\ \end{array} } \right]}$$

We then calculate the maximum eigenvalue λ max by averaging the elements of λ:

$$\lambda _{\max } = 6.08134$$

The consistency index (CI) can be obtained by calculating (λ max − n)/(n − 1) as follows:

$${\text{CI}} = \frac{{\left( {6.08134 - 6} \right)}}{{\left( {6 - 1} \right)}} = 0.016$$

When the size of the matrix is 6, RI (see Note 2) is 1.24, and the consistency ratio is

$${\text{CR}} = \frac{0.016}{1.24} = 0.013.$$

As a result, the inconsistency ratio of the domain expert’s judgments is 0.013. Since this is well below the commonly used threshold of 0.1 (Saaty 1990), we can conclude that the judgments are consistent.
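The full consistency check can be reproduced with the short Python sketch below. It is only an illustrative reconstruction: the comparison matrix is assembled from the column vectors of the weighted-sum expression above (0.33 entries read as 1/3), the weights are the rounded values from Table 10, and the variable names are ours, so the last decimals may differ slightly from the figures reported above.

```python
# Consistency check for the 6x6 pair-wise comparison matrix (AHP), using the
# column vectors of the weighted-sum expression and the weights of Table 10.
F = 1.0 / 3.0  # entries printed as 0.33 read as 1/3
A = [
    [1.00, 4.00, 3.00, 1.00, 2.00, 2.00],
    [0.25, 1.00, 1.00, 0.25, 0.50, F   ],
    [F,    1.00, 1.00, 0.50, 0.50, 0.50],
    [1.00, 4.00, 2.00, 1.00, 2.00, 1.00],
    [0.50, 2.00, 2.00, 0.50, 1.00, 1.00],
    [0.50, 3.00, 2.00, 1.00, 1.00, 1.00],
]
w = [0.281, 0.069, 0.087, 0.237, 0.148, 0.178]  # weights from Table 10
n = len(w)

# Weighted-sum vector A.w: the sum of the weight-scaled columns shown above.
Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]

# Element-wise division by the weights gives the vector lambda (~6.06 to 6.11).
lam = [Aw[i] / w[i] for i in range(n)]

lam_max = sum(lam) / n            # ~6.081
CI = (lam_max - n) / (n - 1)      # ~0.016
RI = 1.24                         # random index for order 6 (see Note 2)
CR = CI / RI                      # ~0.013, below 0.1, so the judgments are consistent
print(round(lam_max, 3), round(CI, 3), round(CR, 3))
```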

Cite this article

Lee, J., Kang, S. & Kim, CK. Software architecture evaluation methods based on cost benefit analysis and quantitative decision making. Empir Software Eng 14, 453–475 (2009). https://doi.org/10.1007/s10664-008-9094-4
