Omega

Volume 66, Part A, January 2017, Pages 118-139

Research productivity in management schools of India during 1968-2015: A directional benefit-of-doubt model analysis

https://doi.org/10.1016/j.omega.2016.02.004

Highlights

  • Research in Business Schools of India has not been assessed systematically so far.

  • A composite indicator of research productivity was developed, using the directional benefit-of-doubt model.

  • Faculty members at the Indian Institutes of Technology had better research productivity than those at the Indian Institutes of Management.

  • Faculty members with doctoral degrees from foreign, relative to Indian, schools were more productive.

  • Suggestions for accelerating management research in India are made.

Abstract

Given the growing emphasis on research productivity in management schools of India over the years, the present authors developed a composite indicator (CI) of research productivity using the directional benefit-of-doubt (directional-BOD) model. Specifically, we examined the overall research productivity of the schools and their respective faculty members during the 1968-69–2014-15 and 2004-05–2014-15 periods. There are four key findings. First, in estimating a faculty member's CI, the relative weights of journal tier, total citations, author h-index, number of papers, impact factor, and journal h-index ranged from high to low, in that order. Second, public and private schools were seemingly similar in research productivity; however, faculty members at the Indian Institutes of Technology (IITs) outperformed those at the Indian Institutes of Management (IIMs). Third, faculty members who earned their doctoral degrees from foreign schools were more productive than those who earned similar degrees from Indian schools. Among those trained in India, however, alumni of IITs were more productive than those of IIMs. Finally, the IIMs at Ahmedabad and Bangalore and the Indian School of Business, Hyderabad placed more names on the list of the top 5% researchers during 2004-05–2014-15 than other schools. These findings indicate a shift in priority from merely training managers to generating impactful knowledge by at least two of the three established public schools, and they call further attention to improving the quality of doctoral training in India in general and at the IIMs in particular. Five suggestions for improving research productivity are offered.

Introduction

The productivity research reported in this article was motivated by the following two observations on institutions of higher learning in India:

“Indian institutions produce best and hardworking students who can compete anywhere in the world, but the very same institutions are not able to build a culture that can provide a world-class research environment and produce best of researchers. Why is this so?” (Mishra [1], p. 1787)

“… our educational institutions have to explore and extend new frontiers of knowledge domain. They have to give priority to build a culture where the basic human instinct of ‘questioning’ is given primacy, where there is ample space for [re]creation of knowledge with changing time.” (Mishra [1], p. 1788)

In recent years, India has indeed been aiming at becoming a hub of knowledge. Stressing the importance of science, technology, and innovation in transforming the nation, India's Prime Minister Narendra Modi announced at the 102nd session of the Indian Science Congress that the Government of India (GOI) would provide the scientific communities and universities in India with an atmosphere conducive to pursuing world-class research [2]. Further, the GOI has been developing a strong culture of collaboration between institutions and across disciplines in India to leverage the cross-functional advantages of expertise, development, and innovation [3], [4]. Put simply, the GOI is favorably inclined toward encouraging institutions of higher learning, including business management schools, to conduct world-class research. International schools have also recently been entering into research collaborations with Indian institutions. The All India Council for Technical Education (AICTE), for example, has now come up with guidelines on how a foreign university can collaborate with Indian academia in research [5]. Global higher education brands have already opened research centers in India to tap the research opportunities that India offers [6]. For example, the Harvard Business School has a research center in Mumbai, and the University of Chicago and Deakin University have such centers in New Delhi. To us, these developments highlight the growing national realization of research, including business research, as a priority in India and the global recognition of India as an exciting avenue for undertaking such business research endeavors.

Despite the foregoing increased focus on developing research and a research climate in India, management schools in India, just like other premier educational institutions in the country, have not yet met world standards in research [1]. For instance, the Indian Institutes of Management (IIMs), the Indian Institutes of Technology (IITs), and the Central Universities (CUs) – the premier institutions established by the GOI – did not make it to the list of the top 100 productive schools across three successive rankings [7], [8], [9]. Such poor research performance of premier Indian institutions at the international level is a matter of vexing concern for academics [1] and policy-makers in India [10]. The Ministry of Human Resource Development (MHRD) of the GOI thus sponsored the PanIIM Conferences at Goa in 2013 and at Kozhikode in 2014 to discuss how to improve research productivity in India [11].

Given the recent interest in improving research productivity of management scholars in India, we undertook the current task of developing a composite index of research productivity based on data available in the public domain. Such an objective measure seemed promising from at least five vantage points. First, it would be the first of its kind to objectively measure quality of research of faculty members of management schools in India. Second, it can facilitate credible comparisons within and across institutions in India. Third, it can guide the Indian and international management scholars in choosing a suitable research collaborator as well as the doctoral students in choosing a suitable dissertation supervisor. No less important, the research funding bodies in India (see, e.g., Indian Council of Social Science Research, Indian Council of Agricultural Research, Council of Scientific and Industrial Research, etc.) may benefit in identifying subject matter experts for evaluations of the research grant applications submitted and/or for supporting research projects of those who are established researchers. Fourth, it can serve as a benchmark for setting higher productivity goals in research by faculty members. Finally, it can facilitate formulations and/or revisions of research policies by institutions and by the GOI as Mishra [1] noted.

In 2011, the then Environment Minister of India kicked up a controversy by commenting that faculty members at the premier universities, including the IIMs and IITs, were neither world-class nor worthwhile with respect to creativity and research [10]. Countering this comment, however, the then Human Resource Development Minister attributed the poor research productivity of the IITs and IIMs more to limited resources, low priority given to research, and limited research support than to poor quality of the faculty members themselves [12].

Using the ISI Web of Science database, Kumar [13] found only 132 author counts (108 unique articles) by scholars affiliated with Indian management schools during 1990–2009. To provide a perspective on how low this productivity might be, he contrasted the roughly 5 articles per year for all of India with the business school at the Hong Kong University of Science and Technology (HKUST), China, whose 100-plus faculty members had produced over 30 articles annually, and with the Wharton School, University of Pennsylvania, Philadelphia, USA, whose 200-plus faculty members had produced about twice as many articles annually as HKUST. A follow-up editorial on ‘Publish or Perish’ in the Economic Times [14] also reiterated the need for high-quality research from business schools (B-Schools) in India.

One response to the foregoing suggestions has been seemingly defensive: Indian scholars should study Indian problems, use indigenous methods, and publish in Indian journals. Pressure to publish in world-class journals can unfortunately result in imitation instead of generation of original thoughts and methods. As Khatri et al. [15] argued, publishing in international journals requires writing for their audiences and contexts using their theories and methods, which may not augur well for Indian management research. Another equally defensive response is that international journals are uninterested in publishing Indian data. Singh [16], however, refuted this possibility, arguing that sloppy research (i.e., in the issues selected, the techniques employed, and unclear writing) by Indian faculty might be a key factor in the low record of international publications at B-Schools in India.

Of the suggestions offered to improve the quality of management research in India, two are notable. One is a shift in emphasis from teaching to research. That is, B-Schools should make research mandatory, enhance research capabilities, hire more research-trained faculty, and provide financial incentives to those faculty members who publish in international journals [17]. The other is a culture of collaboration in research, wherein management schools in India should initiate research collaborations with foreign schools of repute and allocate adequate funds for bringing in research faculty from abroad along the lines of Scandinavian B-Schools [17]. Consistent with these suggestions, B-Schools in India have already made several interventions to improve their current research productivity. For example, the premier B-Schools in India have started emphasizing quality research to improve their respective rankings among their global counterparts [18]. Importantly, the tenure and promotion of faculty members nowadays depend on research productivity as well [19], [20].

A well-known indicator of one's research productivity is the number of publications in peer-reviewed journals. In fact, academic institutions are also adjudged by the number of publications in reputed journals, and there has recently been a proliferation of rankings, listings, and productivity indicators of schools and universities based on such journal publications. These rankings have drawn the attention not only of associations such as the Association of Business Schools (ABS) and the Association to Advance Collegiate Schools of Business (AACSB), but also of dominant industry players such as Thomson Reuters' Web of Science, Elsevier's Scopus, and Google Scholar.

Most areas of management analyze research productivity in terms of either the reputation of an author or the quality of the journal in which an article was published. The former is indicated by an author's total number of published papers [21], [22], [23], h-index [21], [23], [24], [25], and the number of citations of that author's publications [21]. The latter is indicated by the journal's h-index [25], tier, and impact factor (IF) [26], [27], [28], [29], [30]. Each such indicator taken in isolation has its own strengths and weaknesses in gauging the overall scholarly contribution of a researcher [31]. Nevertheless, some academic researchers have objected to such mismeasurement of science [32].
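The author h-index mentioned above has a simple operational definition: the largest h such that the author has h papers with at least h citations each. A minimal sketch (ours, not the paper's code):

```python
def h_index(citations):
    """h-index: largest h such that the author has h papers
    with at least h citations each."""
    cites = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:          # this paper still supports a larger h
            h = rank
        else:
            break
    return h

# e.g., citation counts [10, 8, 5, 4, 3] yield an h-index of 4
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The journal h-index used as Indicator I4 in the paper is computed the same way, only over the citation counts of a journal's articles rather than an author's.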

Research productivity has previously been judged along multiple criteria as well. We found two obvious shortcomings with those studies. First, there is a growing trend of Indian scholars publishing articles with multiple international co-authors. Such publications make estimating the "real" contributions of Indian researchers rather difficult. A top-tier journal publication with one or more international authors may well reflect genuine scholarship that required international collaboration by the Indian counterparts. In many cases, however, such co-authorships lend credence to the Indian faculty gossip that the Indian contribution may have been confined to collecting the Indian data for the research of well-known scholars. If so, assigning an equal weight to the contribution of each individual author and simply counting the number of papers published might erroneously overestimate the productivity of those co-authors from India. Second, multiple authorship could alternatively reflect the number of students mentored. For example, the present second author, who published single-authored articles in the 1970s [33], [34], 1980s [35], [36], and 1990s [37], [38], [39], has recently been publishing articles co-authored with 8–10 colleagues and/or students to train and inspire them in research [40], [41]. Thus, assigning an equal weight to each individual author might again underestimate the productivity of the first author and overestimate the contributions of his co-authors. Given these two concerns, we decided to aggregate multiple non-commensurate indicators and to weight one's contribution to a published article by the order of authorship. We agree that this system might not be a perfect solution, particularly when authorship was determined by the alphabetical order of last names instead of by contributions to the article. Nonetheless, it is still better than ignoring the order of authorship in adjudging one's research productivity.
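The paper does not spell out its exact order-of-authorship weights; one common hypothetical scheme is harmonic weighting, in which the i-th author receives a share proportional to 1/i, so earlier authors count more and the shares sum to 1:

```python
def author_weights(n_authors):
    """Hypothetical harmonic authorship weighting (an illustration,
    not necessarily the paper's scheme): the i-th author's share is
    (1/i) / sum_j(1/j), so shares decline with author order and sum to 1."""
    raw = [1.0 / i for i in range(1, n_authors + 1)]
    total = sum(raw)
    return [r / total for r in raw]

# e.g., a two-author paper splits 2/3 to the first author, 1/3 to the second
print(author_weights(2))  # → [0.666..., 0.333...]
```

Under such a scheme, equal co-author counting is recovered only as a special case, and a first author of a mentoring-heavy paper is no longer penalized relative to a single-authored one to the same degree.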

A comprehensive measure of overall research productivity required us to integrate multiple non-commensurate indicators into a single composite index (CI). While developing such a CI, we were as aware as other recent scholars (cf. [42], [43]) that all indicators might not be equally diagnostic of research productivity. To be meaningful, the CI requires setting unknown weights for the indicators used, depending upon their relative importance. To us, the weight of an indicator should reflect the priority given to it by the individual researcher contingent upon his or her career and aspirations (i.e., age, education, experience, positions sought, etc.). If the weights fail to capture the priorities of one's career strategy, the resulting CI of research productivity might unintentionally skew assessments of younger more than of senior faculty members.

We considered data envelopment analysis (DEA) and the econometric approach as two ways of endogenously generating the unknown weights (cf. [44], [45], [46]). Because it identifies an efficient frontier, DEA seemed to have an advantage over the traditional econometric approach in generating the impartial benefit-of-the-doubt (BOD) weighting [47]. That is, if a researcher has high productivity according to one indicator, say the h-index, then the relative weight of his h-index should be correspondingly high. Since the CI estimate from DEA measures the maximum productivity performance of a researcher, high research productivity under BOD weighting implies high priority given to the career strategy.

To overcome the aforementioned two problems, we employed the directional-BOD model to comprehensively gauge the research productivity of every scholar. We used six indicators. The first three pertained to the author: (1) h-index scores (I1), (2) total citations (I2), and (3) number of publications (I3); the last three pertained to the journal: (4) h-index scores (I4), (5) tier scores (I5), and (6) impact factor (IF) scores (I6). We took the h-index and IF scores of the various journals from Scopus, which has a much broader coverage of journals than the alternative Journal Citation Reports of Thomson Reuters.
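To illustrate the benefit-of-the-doubt idea, the standard BOD model (not the directional variant with PCA-estimated directions used in the paper) computes each unit's CI via a small linear program that picks the indicator weights most favorable to that unit; a sketch using SciPy:

```python
import numpy as np
from scipy.optimize import linprog

def bod_ci(Y, k):
    """Standard benefit-of-doubt composite indicator for unit k.
    Y: (n_units, n_indicators) matrix of normalized indicator scores.
    Solves: max_w  w . Y[k]   s.t.  w . Y[i] <= 1 for every unit i,
    w >= 0 -- i.e., unit k is scored under its own best-case weights."""
    n, m = Y.shape
    res = linprog(c=-Y[k],                 # linprog minimizes, so negate
                  A_ub=Y, b_ub=np.ones(n), # no unit may exceed a CI of 1
                  bounds=[(0, None)] * m,  # non-negative weights
                  method="highs")
    return -res.fun                        # CI in (0, 1]; 1 = frontier

# three units scored on two indicators; units 0 and 1 each lead on one
# indicator and reach the frontier (CI = 1), unit 2 is dominated
Y = np.array([[1.0, 0.5], [0.5, 1.0], [0.5, 0.5]])
print([round(bod_ci(Y, k), 4) for k in range(3)])
```

The directional extension in the paper restricts the feasible weights along a direction vector (estimated via PCA) so that the weights also reflect policy priorities rather than being purely self-serving.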

We also realized that sole reliance on citations in journal rankings by Scopus may not always be accurate. For example, an otherwise important work that is casually dismissed as common knowledge may not get cited at all. Authors working in niche areas get cited less [33]. Worse, citation counts may at times be more a fashion within the academic community than a true indicator of the impact of the journal [50], [51], [52], [53]. Citation-based analyses can also be biased by selective citations or self- and mutual citations, which render the association between the quality of a journal and that of an individual article in it rather uninformative [53], [54], [55]. Despite these reservations, citation-based indicators continue to be viewed as valid proxies for the quality of journals in the contemporary literature. Thus, we included citations as one of the six indicators of research productivity in the DEA model.

Scholars around the world in general, and in India in particular, have been skeptical of the coverage by Scopus. In particular, Scopus has been accused of excluding citations from books and non-traditional sources, such as web sites, dissertations, monographs, chapters in edited volumes, open-access online journals, and/or the proceedings of important conferences [56]. To deal with such concerns, we selected publications included in the ranking list of the National University of Singapore (NUS). For the sake of fairness and comprehensiveness, we further considered publications in all journals listed in the Scopus, ABS, and NUS databases. To ensure accuracy, we also relied on the author's h-index and total citations reported in Google Scholar, which covers all sorts of citations from published and unpublished documents. We believe that consideration of Indicators 1–3 mitigates some of the concerns of Indian scholars, and that of Indicators 4–6 gives them due credit for publishing in prime international journals.

Given our directional-BOD model analysis of the relative weights of six non-commensurate indicators in developing the CI of research productivity of a faculty member, we felt confident that our indices might be psychometrically much better and practically more useful than the alternative estimates. Relying on the relative weights of individual indicators in estimating the CI is not only a methodological novelty in productivity assessment [62], [63] but also an objective check on whether the earlier cited Western rankings had "fairly" portrayed research productivity in B-Schools of India. To the best of our knowledge, ours is the first attempt at assessing the state of the art in research productivity in B-Schools of India. We are also the first to come up with a CI that seems more valid and practical than any of the previously used indices of research productivity.

The remainder of this paper unfolds as follows. Section 2 deals first with issues and problems in our data collection, and then with the presentation of relevant data on B-Schools in India used to arrive at the six indicators. Section 3 first presents the general input–output based production model in a DEA setting to measure overall research productivity. Since the use of this model required some data that were currently unavailable, we then describe the BOD models used to estimate the CI, point out their limitations, and finally suggest a generalized version of the directional-BOD model. Sections 4–6 present the results, the discussion, and suggestions for accelerating research productivity in India, respectively.

Section snippets

Data collection

Collecting the accurate data on publications by the faculty members of different B-Schools in India was a mammoth task for us. In general, faculty members did not provide the full information on their respective websites (e.g., “a large number of publications in reputed journals”). Of those who reported the titles of the articles and the names of the journals, most of them did not report the orders of authorships (e.g., “coauthored with other professors”) either. Although websites had option

The input–output based DEA model

In a DEA setting, research productivity can be studied either in the form of an input–output production model or as a composite indicator. The former is a production model in which research productivity is assessed by considering some set of inputs leading to the generation of some set of outputs.
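For concreteness, the classic input-oriented CCR model (a standard DEA formulation under constant returns to scale, offered here as general background rather than the paper's exact specification) scores a decision-making unit by how far its inputs could be proportionally contracted while a convex-cone combination of peers still matches its outputs:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (constant returns to scale).
    X: (n, m) inputs, Y: (n, s) outputs, one row per DMU.
    Envelopment form:  min theta
      s.t.  sum_i lam_i * X[i] <= theta * X[k]
            sum_i lam_i * Y[i] >= Y[k],    lam >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    # decision vector: [theta, lam_1, ..., lam_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    # input rows:  X.T @ lam - theta * X[k] <= 0
    A_in = np.hstack([-X[k][:, None], X.T])
    # output rows: -Y.T @ lam <= -Y[k]  (i.e., Y.T @ lam >= Y[k])
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (1 + n),
                  method="highs")
    return res.fun  # efficiency in (0, 1]; 1 = on the frontier

# two DMUs producing one output; DMU 1 uses twice the input of DMU 0
X = np.array([[2.0], [4.0]])
Y = np.array([[1.0], [1.0]])
print(ccr_efficiency(X, Y, 0), ccr_efficiency(X, Y, 1))
```

The BOD composite-indicator model discussed in Section 3 can be read as the special case of this production model with a single constant input for every unit.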

Results

Of the 1416 faculty members in the 32 B-Schools of India, only 783 (i.e., 55.37%) had at least one publication captured in one of the three databases (i.e., NUS, ABS, or Scopus). Across 32 B-schools, 56.40% of the faculty members had published at least one journal article. While 92.31% management faculty members of the IIT, Madras were research active, only 16.28% of those at the S P Jain Management School, Mumbai were so.

We present the distribution of 5539 papers by those faculty members over

Discussion

The CIs derived through the directional-BOD model indeed provided us with new information about research productivity of B-Schools in India. As we noted, the CI entailed relative weights of the six criteria objectively generated from the data at hand, and the weights were further corrected by including the directional distance function to avoid any arbitrariness in imposing weight restrictions by the policy analysts. Specifically, our use of the PCA to objectively estimate the directional

Suggestions for accelerating research productivity

Our current work on the measurement of research productivity in B-Schools allows us to offer some directions on the pressing research policy issues related to accelerating research productivity in India. Here we focus on five important issues: (1) the quality of doctoral programs, (2) self-renewal of faculty members, (3) research programs by the identified stars, (4) appointment of academic leaders with experience in and a focus on research, and (5) retention of star researchers.

  • 1.

    Faculty members

Limitations and future research

In any study, what one observes is not TRUTH per se but TRUTH accessed through the particular method that was employed. Our evaluation of research productivity in B-Schools in India has the same limitation. Thus, what we report in this article are findings from the directional-BOD model meta-frontier analysis, using the three criteria about the journal and the three criteria about the author. Given new inputs and criteria of research, the estimated productivity of some faculty members might

Acknowledgment

The authors are grateful to M. Mehdiloozad and M. Khoveyni for developing the General Algebraic Modeling System (GAMS) code which we used to estimate the CI scores. This version of the manuscript has benefited substantially from the extremely constructive suggestions and criticisms of three anonymous reviewers, the chief editor, the area editor, Elisa Fusco, and numerous colleagues in India who commented on the first draft and the implications of our findings for promoting management research

References (129)

  • R. Singh

    “Fair” allocations of pay and workload: tests of a subtractive model with nonlinear judgment function

    Organizational Behavior and Human Decision Processes

    (1995)
  • R. Singh

    Subtractive versus ratio model of "fair" allocation: can group level analyses be misleading?

    Organizational Behavior and Human Decision Processes

    (1996)
  • R. Singh

    Group harmony and interpersonal fairness in reward allocation: on the loci of the moderation effect

    Organizational Behavior and Human Decision Processes

    (1997)
  • B.K. Sahoo et al.

    An alternative approach to monetary aggregation in DEA

    European Journal of Operational Research

    (2010)
  • M. Oral et al.

    The appreciative democratic voice of DEA: a case of faculty academic performance evaluation

    Socio-Economic Planning Sciences

    (2014)
  • R.G. Dyson et al.

    Pitfalls and protocols in DEA

    European Journal of Operational Research

    (2001)
  • M.J. Jones et al.

    Journal evaluation methodologies: a balanced response

    Omega-International Journal of Management Science

    (1996)
  • B.S. Frey et al.

    Do rankings reflect research quality?

    Journal of Applied Economics

    (2010)
  • H. Tüselmann et al.

    Towards a consolidation of worldwide journal rankings – a classification using random forests and aggregate rating via data envelopment analysis

    Omega-International Journal of Management Science

    (2015)
  • M. Abbott et al.

    The efficiency of Australian universities: a data envelopment analysis

    Economics of Education Review

    (2003)
  • N.K. Avkiran

    Investigating technical and scale efficiencies of Australian universities through data envelopment analysis

    Socio-Economic Planning Sciences

    (2001)
  • B.L. Lee

    Efficiency of research performance of Australian universities: a reappraisal using a bootstrap truncated regression approach

    Economic Analysis and Policy

    (2011)
  • B.K. Sahoo et al.

    Decomposing technical efficiency and scale elasticity in two-stage network DEA

    European Journal of Operational Research

    (2014)
  • E. Fusco

    Enhancing non-compensatory composite indicators: a directional proposal

    European Journal of Operational Research

    (2015)
  • C.A.K. Lovell et al.

    Radial DEA models without inputs or without outputs

    European Journal of Operational Research

    (1999)
  • A. Hashimoto

    A ranked voting system using a DEA/AR exclusion model: a note

    European Journal of Operational Research

    (1997)
  • M.J. Bellenger et al.

    An economic approach to environmental indices

    Ecological Economics

    (2009)
  • M. Kortelainen

Dynamic environmental performance analysis: a Malmquist index approach

    Ecological Economics

    (2008)
  • A. Fernandez-Castro et al.

    Towards a general non-parametric model of corporate performance

    Omega-International Journal of Management Science

    (1994)
  • E. Thanassoulis et al.

    A comparison of data envelopment analysis and ratio analysis as tools for performance assessment

    Omega-International Journal of Management Science

    (1996)
  • G.E. Halkos et al.

    Efficiency measurement of the Greek commercial banks with use of financial ratios: a data envelopment analysis approach

    Management Accounting Research

    (2004)
  • D.K. Despotis

Measuring human development via data envelopment analysis: the case of Asia and the Pacific

    Omega-International Journal of Management Science

    (2005)
  • W.B. Liu et al.

    A study of DEA models without explicit inputs

    Omega-International Journal of Management Science

    (2011)
  • S.N. Mishra

    Reflections on science in service of a symbiotic society

    Current Science

    (2014)
  • Kumar N. Taking stock of Indian management research. Economic Times...
  • Publish or perish. Economic Times...
  • Singh R. Sloppy research versus disinterest in Indian data as a difficulty factor in international publications. In:...
Kumar N. Indian Business schools still display pre-reforms mentality. Economic Times...
  • Madhavan N. Paper Lambs. Business Today; October 28, 2012 edition,...
This manuscript was processed by Associate Editor Huang.