Abstract
This paper offers a conceptually novel contribution to understanding the distinctive governance challenges arising from the increasing reliance on formalized knowledge in the governance of research activities. It uses the current Australian research governance system as an example – a system that exhibits a comparatively strong degree of formalization in its knowledge mechanisms. Combining theoretical reflections on the political-administrative and epistemic dimensions of formalization with analyses of interview data gathered at Australian universities, it suggests that such a strong reliance on formalized knowledge has rather ambivalent governance ramifications. On the one hand, it allows for a seemingly rational and efficient form of control and coordination of research activities. On the other hand, it also increases the risk that knowledge is used in governance contexts in superficial, unconsidered and ultimately unreasonable ways. It is further suggested that there are a range of indications that precisely such use elicits and reinforces dysfunctional behaviors on the part of relevant individual and organizational actors in the public science system.
Notes
The concept of governance, while generally referring to processes of coordination and steering involving various actors, is multifaceted, and, due to its prominent use, has necessarily become analytically somewhat imprecise. In many contexts, ‘governance’ is used to signal a departure from ‘government’ in the coordination of various activities; where ‘government’ is commonly associated with hierarchical forms of decision-making in which the state features as the central, regulatory actor. This understanding may, however, be potentially limiting if applied to contemporary developments in the governance of public universities and of the public sciences more generally. Here, it has been convincingly suggested (see, e.g., Capano 2011), the change toward an apparently more hands-off and devolved form of the steering of activities does not necessarily mean that the influence of the state as the major coordinating actor has diminished. The analysis presented in this paper gives some additional weight to this particular view.
My use of the concepts ‘science’ and ‘scientific’ throughout this paper is inclusive rather than exclusive, including the humanities and social sciences along with the sciences ‘proper.’
In the case of the Norwegian institution, the authors convincingly argue, this trend toward increasing administrative formalization is driven strongly by growing exogenous accountability pressures on universities to demonstrate that resources are used efficiently and effectively; in the case of the American institution, somewhat ironically, it is driven by entrepreneurial and outreach agendas and the complexities these generate.
Various commentators have noted that recent NPM-style governance reforms appear to entail “a dynamic of deregulation and reregulation” (Christensen 2011: 512) rather than a general ceding of centralized forms of control and authority. See on this point also Schimank, who observes with regard to the situation in Germany: “Recently, ‘management by objectives’ has become institutionalized in the form of mission-based contracts between ministries and universities. In theory, such contracts should not produce recommendations, but only statements of objectives; in practice, however, this flexibility is not granted, and ministries revert to regulation under the guise of NPM” (2005: 367–8).
This metrics-driven approach is in line with the first ERA indicator principle, which states that appropriate indicators for the assessment of research activities must be “quantitative,” that is, “objective measures that meet a defined methodology that will reliably produce the same result, regardless of when or by whom principles are applied” (Australian Research Council 2012). Overall, however, ERA has not yet totally dispensed with more qualitative forms of assessment such as peer-review. For example, in the context of ERA, peer-review is still used in the assessment of research performance in some disciplines, most of all the humanities. Nevertheless, compared, for example, to the British ‘Research Assessment Exercise’ last conducted in 2008, the use made of peer-review in ERA is relatively restricted, and the primary focus in ERA is on the quantity of outputs, not on the quality of selected publications.
The focus of Husserl’s discussion of technization is on the modern development of rational-scientific thinking. The question of whether Husserl’s assessment concerning the danger of technization is authoritative and realistic with regard to modern science, and the role the formal method plays therein, obviously lies beyond the scope of this paper. For a recent collection of instructive essays addressing this and other related questions, see Hyder and Rheinberger (2010).
Weingart makes the valid point that claims as to the superior ‘objectivity’ and ‘exactness’ of evaluations based purely on bibliometric measures ignore the fact that such measures tacitly build upon more ‘subjective’ peer-review evaluations: “In fact, publication and citation measures are representations of the communication process as it unfolds in journal publications. Thus, they also embody the peer-review evaluations that have led to actual publications” (2005: 122). Weingart nevertheless observes that the evaluative use of bibliometric indicators is useful in incorporating a larger number of decisions as to publication or citation than any single peer-review process (2005: 122).
This behavioral adaptation of sacrificing quality because performance measurement focuses on quantity has been repeatedly noted and explored in the public administration literature, where it is commonly referred to as “output distortion” (Bevan and Hood 2006).
References
Andras, Peter. 2011. Research: Metrics, quality, and management implications. Research Evaluation 20: 90–106.
Australian Research Council. 2012. ERA indicator principles. Retrieved from http://www.arc.gov.au/pdf/era12/ERA_2012_Indicator_Principles.pdf.
Beaver, Donald de B. 2012. Quantity is only one of the qualities. Scientometrics 93: 33–39.
Bevan, Gwyn, and Christopher Hood. 2006. What’s measured is what matters: Targets and gaming in the English public health care system. Public Administration 84: 517–538.
Bleiklie, Ivar. 2005. Organizing higher education in a knowledge society. Higher Education 49: 31–59.
Bornmann, Lutz. 2010. Mimicry in science? Scientometrics 86: 173–177.
Bornmann, Lutz, and Hans-Dieter Daniel. 2007. Multiple publication on a single research study: Does it pay? The influence of number of research articles on total citation counts in biomedicine. Journal of the American Society for Information Science 58: 1100–1107.
Braun, Dietmar. 2003. Lasting tensions in research policy-making—A delegation problem. Science and Public Policy 30: 309–321.
Buckley, R. Philip. 2004. Husserl and the communal praxis of science. In Husserl and the sciences: Selected perspectives, ed. Richard Feist, 213–226. Ottawa: University of Ottawa Press.
Burns, Tom. 1961. Micropolitics: Mechanisms of institutional change. Administrative Science Quarterly 6: 257–281.
Butler, Linda. 2003a. Modifying publication practices in response to funding formulas. Research Evaluation 12: 39–46.
Butler, Linda. 2003b. Explaining Australia’s increased share of ISI publications—The effects of a funding formula based on publication counts. Research Policy 32: 143–155.
Capano, Gilberto. 2011. Government continues to do its job. A comparative study of governance shifts in the higher education sector. Public Administration 89: 1622–1642.
Christensen, Tom. 2011. University governance reforms: Potential problems of more autonomy? Higher Education 62: 503–517.
Feller, Irwin. 2009. Performance measurement and the governance of American academic science. Minerva 47(3): 323–344.
Geuna, Aldo, and Ben R. Martin. 2003. University research evaluation and funding: An international comparison. Minerva 41: 277–304.
Gläser, Jochen, and Stefan Lange. 2007. Wissenschaft. In Handbuch Governance: Theoretische Grundlagen und empirische Anwendungsfelder, eds. Arthur Benz, Susanne Lütz, Uwe Schimank, and Georg Simonis, 437–451. Wiesbaden: VS Verlag für Sozialwissenschaften.
Gläser, Jochen, and Gritt Laudel. 2007. Evaluation without evaluators: The impact of funding formulae on Australian university research. In The changing governance of the sciences: The advent of research evaluation systems, eds. Richard Whitley, and Jochen Gläser, 127–151. Dordrecht: Springer.
Husserl, Edmund. 1970. The Crisis of European Sciences and Transcendental Phenomenology: An Introduction to Phenomenological Philosophy (trans: David Carr). Evanston, IL: Northwestern University Press. (Original work published 1954).
Hyder, David, and Hans-Jörg Rheinberger (eds.). 2010. Science and the Life-world: Essays on Husserl’s “Crisis of European Sciences”. Stanford: Stanford University Press.
Kehm, Barbara M., and Ute Lanzendorf (eds.). 2006. Reforming University Governance: Changing Conditions for Research in Four European Countries. Bonn: Lemmens.
Klein, Jacob. 1968. Greek Mathematical Thought and the Origin of Algebra (trans: Eva Brann). Cambridge, MA: MIT Press. (Original work published 1934 and 1936).
Larkins, Frank P. 2011. Australian Higher Education Research Policies and Performance 1987–2010. Melbourne: Melbourne University Press.
Laudel, Gritt. 2006. The art of getting funded: How scientists adapt to their funding conditions. Science and Public Policy 33: 489–504.
Lewis, Jenny, and Sandy Ross. 2011. Research funding systems in Australia, New Zealand and the UK: Policy settings and perceived effects. Policy & Politics 39: 379–398.
Luhmann, Niklas. 1997. Die Gesellschaft der Gesellschaft, vol. 2. Frankfurt am Main: Suhrkamp.
Luhmann, Niklas. 1999. Funktionen und Folgen formaler Herrschaft, 5th ed. Berlin: Duncker & Humblot.
Marginson, Simon, and Mark Considine. 2000. The Enterprise University: Power, Governance and Reinvention in Australia. Cambridge: Cambridge University Press.
Moed, Henk F. 2005. Citation Analysis in Research Evaluation. Dordrecht: Springer.
Neumann, Ruth, and James Guthrie. 2002. The corporatization of research in Australian higher education. Critical Perspectives on Accounting 13: 721–741.
Power, Michael. 2005. The theory of the audit explosion. In The Oxford handbook of public management, eds. Ewan Ferlie, Laurence E. Lynn, and Christopher Pollitt, 326–344. Oxford: Oxford University Press.
Power, Michael. 2007. Research evaluation in the audit society. In Wissenschaft unter Beobachtung: Effekte und Defekte von Evaluationen, eds. Hildegard Matthies, and Dagmar Simon, 15–24. Wiesbaden: VS Verlag für Sozialwissenschaften.
Ramirez, Francisco O., and Tom Christensen. 2012. The formalization of the university: Rules, roots, and routes. Higher Education 65: 695–708.
Schimank, Uwe. 2005. “New Public Management” and the academic profession: Reflections on the German situation. Minerva 43: 361–376.
Shaffer, David W., and James J. Kaput. 1999. Mathematics and virtual culture: An evolutionary perspective on technology and mathematics education. Educational Studies in Mathematics 37: 97–119.
Shin, Jung Cheol, Robert K. Toutkoushian, and Ulrich Teichler (eds.). 2011. University Rankings: Theoretical Basis, Methodology and Impacts on Global Higher Education. Dordrecht: Springer.
Van Raan, Anthony F.J. 2005. Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics 62: 133–143.
Weber, Max. 1978. Economy and Society: An Outline of Interpretative Sociology (trans: Ephraim Fischoff et al., Guenther Roth, and Claus Wittich, eds.). Berkeley: University of California Press. (Original work published 1922).
Weingart, Peter. 2005. Impact of bibliometrics upon the science system: Inadvertent consequences? Scientometrics 62: 117–131.
Weingart, Peter. 2013. The loss of trust and how to regain it: Performance measures and entrepreneurial universities. In Trust in Universities, eds. Lars Engwall, and Peter Scott, 83–95. London: Portland Press.
Weingart, Peter, Roswitha Sehringer, and Matthias Winterhager. 1990. Which reality do we measure? Scientometrics 19: 481–493.
Whitley, Richard. 2007. Changing governance of the public sciences: The consequences of establishing research evaluation systems for knowledge production in different countries and scientific fields. In The changing governance of the sciences: The advent of research evaluation systems, eds. Richard Whitley, and Jochen Gläser, 3–27. Dordrecht: Springer.
Whitley, Richard. 2011. Changing governance and authority relations in the public sciences. Minerva 49(4): 359–385.
Whitley, Richard, and Jochen Gläser (eds.). 2007. The Changing Governance of the Sciences: The advent of Research Evaluation Systems. Dordrecht: Springer.
Whitley, Richard, Jochen Gläser, and Lars Engwall (eds.). 2010. Reconfiguring Knowledge Production: Changing Authority Relationships in the Sciences and their Consequences for Intellectual Innovation. Oxford: Oxford University Press.
Woelert, Peter. 2013. The ‘economy of memory’: Publications, citations, and the paradox of effective research governance. Minerva 51(3): 341–362.
Woelert, Peter, and Victoria Millar. 2013. The ‘paradox of interdisciplinarity’ in Australian research governance. Higher Education 66: 755–767.
Acknowledgments
I wish to acknowledge Australian Research Council funding support for the research project ‘Knowledge Building in Schooling and Higher Education: Policy strategies and effects’ (ARC Discovery Project 2011–14, DP110102466, Chief Investigator Prof. Lyn Yates). I would like to thank Lyn Yates as well as two anonymous reviewers of this journal for their helpful feedback and suggestions.
Cite this article
Woelert, P. Governing Knowledge: The Formalization Dilemma in the Governance of the Public Sciences. Minerva 53, 1–19 (2015). https://doi.org/10.1007/s11024-015-9266-5