
If the Research is Not Used, Does it Exist?


by Gustavo Fischman & Adai Tefera - June 17, 2014


Given the continuous expansion of publications by educational scholars, it is evident that “publish or perish” remains not only a popular saying but also a command that must be obeyed. What is ironic is that the increase in the number, and we dare say even the quality, of publications focused on research in education (Ware & Mabe, 2012), both clear signs of the field’s internal improvement, does not seem to placate the critical voices decrying the field’s lack of relevance.


Educational research is no exception; it is subject to the general criticism that academics are insulated behind the walls of ivory towers and disconnected from everyday reality. Such critical declarations can be traced as far back as Plato and Aristotle, and especially to Immanuel Kant’s seminal work The Conflict of the Faculties. Kant contended that the university was originally conceived for public dialogue and debate, and argued that public discourse and open argument were essential to both societies and universities. Yet over two centuries later, Nicholas Kristof’s article “Professors, We Need You!” set off a firestorm of responses, particularly among scholars, due in part to its claim that professors and universities are guilty of creating “a culture that glorifies arcane unintelligibility while disdaining impact and audience” (Kristof, 2014). Reactions came primarily from academics deriding Kristof for his oversimplified understanding of academia. While Kristof’s arguments are not new, nor the problems he identifies particularly unknown, support for scholars’ more deliberate engagement with the public seems to be reaching a critical mass.1


The purpose of this commentary is to argue that a better way of addressing the centuries-old criticisms of this ivory-tower model is to stop complaining and engage with knowledge mobilization (KM) strategies. Scholars engaging in knowledge mobilization seek to understand and increase the impact and usability of research through multi-dimensional, interactive strategies that target a wide range of stakeholders, as a way to meet scholars’ ethical obligation to ensure that the research produced in education is more accessible and ultimately more impactful. We also recognize the evolving tensions that this process will likely evoke in the quest to improve scholarly impact by making research more accessible and usable for the public.


FROM PUBLISH OR PERISH TO RESEARCH USE


One of the biggest challenges to implementing effective KM is the prevalence of an “enlightenment” (Weiss, 1977) model of scientific communication that relies on a hard-to-access, top-down, one-way (research to community/schools/policy makers) approach to disseminating research. The persistence of this unidirectional model has contributed to a broad perception, especially among practitioners, that scholars impose research-based knowledge on them, failing to recognize the knowledge that teachers, principals, and policy makers, for example, do possess. There is, therefore, emerging recognition that this top-down, one-way model is neither effective nor relevant for broader audiences in education (Cooper et al., 2011; Levin, 2011). While most scholars believe the research they produce is valuable, many acknowledge that teachers, parents, journalists, and policymakers – those who should ideally benefit from and engage with the research – often do not, owing to a perceived lack of relevance.


In part, the problems of relevance and use are connected to university promotion and tenure committees’ reliance on indirect metrics of research quality that are too narrow and restrictive, such as the Journal Impact Factor (JIF), which often serves as the supreme arbiter for research accountability and incentive structures (Alperin, Willinsky, & Fischman, 2011; Piwowar, 2013). The limitations of indirect metrics have been broadly recognized (DORA, 2012), and support has been coalescing among scholars for broadening definitions of scholarly impact beyond generic indicators such as the JIF. Consider, for example, the thousands of signatories, including journal editors and scholars across hundreds of disciplines, who in 2012 signed the San Francisco Declaration on Research Assessment (DORA) – a worldwide initiative encouraging individuals and organizations to assess scientific research appropriately by eliminating journal-based metrics in funding, appointment, and promotion decisions, and by placing more emphasis on the research itself and less on the journals in which it is published.


In an effort to improve scholarly access and use, scholars have begun using alternative means of disseminating research as well as of assessing its impact. Among the most used are publishing in open access journals to expand access, and using article-level metrics and altmetrics – which include references in bibliographic databases, abstract and article views, downloads, and mentions in social media and news media (Piwowar, 2013; Priem et al., 2010) – to complement bibliometric and citation counts.


Yet, as these alternative models and metrics evolve, tensions are likely to arise given the current broad use of simple but indirect metrics to determine what constitutes research rigor and quality. A significant number of responses to Kristof’s piece reflect a broad perspective among researchers that social media outlets such as Facebook, Twitter, and YouTube may be interesting as measures of popularity but are inadequate and insufficient for assessing research use and impact. What we want to highlight is that more and more scholars are using these outlets to reach the public and to broaden audiences beyond the regular customers of research journals in education.


Essential questions are therefore being asked about what counts as quality research in education (Southerland, Gadsden, & Herrington, 2014). How does one determine research impact in today’s research environment? How can or should social media, for example, count toward promotion and tenure? Such questions, which reflect tensions between the benefits to individual researchers in academia and the benefits to the public, are challenging but inevitable as the field evolves. We agree with Juan Pablo Alperin (2014), who explains that “Altmetrics are captured from the Web (i.e., social media, blogs, Wikipedia), and thus are (somewhat) more democratic – one reader, one vote. More precisely: one reader, several potential votes. Unlike citations, which can only be counted if the citing document is in a select group of journals, altmetrics are counted regardless of where in the world they are originated, with one important consequence: they open the possibility of tracking impact in new segments, both within and beyond the academy.”


We firmly believe that scholars’ autonomy to explore areas of scientific interest and to engage in the production of knowledge is, and should remain, a cornerstone of academia. But it is equally important that educational researchers consider whether, and to what extent, the knowledge we produce is accessible to and used by practitioners, policy makers, and the public at large. Consider, for example, the potential impact of providing open access to researchers’ scholarship: open access literature is “digital, online, free of charge, and free of most copyright and licensing restrictions” (Suber, 2011). Although universities commonly reward scholars based on the number of publications in scholarly journals, with particular weight given to so-called prestigious journals, open access policies disrupt these traditional notions of measuring impact (Alperin, Willinsky, & Fischman, 2011). Such policies offer universities a clear way to make research more accessible and impactful for the public.


Equally important is ensuring that research knowledge is also accessible in the sense of being understandable. Disseminating research through diverse mediums maximizes its reach to a diverse body of education practitioners and policy makers. Policy briefs, video commentaries, and social media outlets, to name a few, offer important dissemination alternatives that broaden the public’s access to research knowledge.


As technological and social media tools continue to rapidly improve worldwide access to research knowledge, academics are increasingly using these resources to disseminate their own research to the public (Cooper, forthcoming). These tools have the potential to disrupt traditional producer-push models of academic research, allowing two-way communication between users and producers and providing opportunities for users to engage with, comment on, and create research content simultaneously (Cooper, forthcoming).


Our rapidly changing landscape of research dissemination and growing calls from the public and scholars are converging in ways that force institutions of higher education to confront questions of relevance, access, and impact. In this commentary, we offer knowledge mobilization strategies, including the use of altmetrics, social media tools, and open access, to improve scholarly impact, while acknowledging the challenges this quest will likely evoke (see Jacobson et al., 2004; Sá, Li, & Faubert, 2011).


We acknowledge that such challenges – the difficulty of measuring research impact, or of translating complex and specialized scientific language without losing its explanatory power, to name just two – are complex and have no simple answers. We believe, however, that research rigor will improve when research knowledge is produced with, and informed by, multiple dialogues with relevant publics, and ultimately for the public good.


As greater numbers of researchers use open access journals and engage with altmetrics, universities and research centers will be left with a number of open questions, including questions about tenure and promotion, about financial resources to support knowledge mobilization, and about training the next generation of scholars to improve scholarly impact through a variety of both traditional and contemporary methods. While the university will always offer fertile ground for innovation, science, and exploration, the time, it seems, has come to consider the limitations of the publish-or-perish model and to entertain the possibility that research that is not used may not exist. Educational researchers and their institutions need to find better ways to manage the tensions among research quality, scholarly impact, social understandability, and pedagogical, social, and scientific relevance in order to meet our ethical obligation to serve the public and to close the rift between the production of research, its accessibility, and its ultimate use.


Notes


1. A recent special issue of Nature (October, 2013) entitled “Impact” stated that “[e]very government and organization that funds research wants to support science that makes a difference – by opening up new academic vistas, stimulating innovation, influencing public policies or directly improving people’s lives” (p. 1). In concert with these calls, there has been a growing body of both empirical and theoretical research exploring the complex and multidimensional relationships among research, policy, and practice (Amara et al., 2004; Belkhodja et al., 2007; Gutierrez & Penuel, 2014; Hemsley-Brown, 2004; Lavis et al., 2002; Lemieux & Champagne, 2004; Levin, 2004; Mitton et al., 2007; Nutley et al., 2007; OECD, 2007; Phillips, 2014; Rudolph, 2014; Southerland, Gadsden, & Herrington, 2014; Wieman, 2014).


References


Alperin, J. P. (2014, March 12). Altmetrics aren’t always so ‘alt’: Ask the developing world. Social Science Space. Retrieved from http://www.socialsciencespace.com/2014/03/altmetrics-arent-always-so-alt/


Alperin, J. P., Willinsky, J., & Fischman, G. E. (2011). Scholarly communication strategies in Latin America’s research-intensive universities. Revista Educacion Superior Sociedad, 2(16). Retrieved from http://pkp.sfu.ca/files/iesalc_final.pdf


Amara, N., Ouimet, M., & Landry, R. (2004). New evidence on instrumental, conceptual and symbolic utilization of university research in government agencies. Science Communication, 26(1), 75–106.


Belkhodja, O., Amara, N., Landry, R., & Ouimet, M. (2007). Determinants of research utilization in Canadian health services organizations. Science Communication, 28(3), 377–417.


Cooper, A., Rodway Macri, J., & Read, R. (2011). Knowledge mobilization practices of educational researchers in Canada. Paper presented at the American Educational Research Association, New Orleans, LA.


Cooper, A. (2014). Social media and all that jazz: Online knowledge mobilization strategies utilized by Canadian research brokering organizations in education. Education Policy Analysis Archives, 22(71). Retrieved from http://epaa.asu.edu/ojs/article/view/1369


San Francisco Declaration on Research Assessment (DORA). (2012). Retrieved from http://am.ascb.org/dora/

Hemsley-Brown, J. (2004). Facilitating research utilization: A cross-sector review of research evidence. The International Journal of Public Sector Management, 17(6), 534–552.


Jacobson, N., Butterill, D., & Goering, P. (2004). Organizational factors that influence university-based researchers’ engagement in knowledge transfer activities. Science Communication, 25(3), 246–259.


Kristof, N. (2014, February 15). Professors, we need you! The New York Times. Retrieved from http://www.nytimes.com/2014/02/16/opinion/sunday/kristof-professors-we-need-you.html?_r=0


Lavis, J., Ross, S., & Hurley, J. (2002). Examining the role of health services research in public policymaking. The Milbank Quarterly, 80(1), 125–154.


Lavis, J. (2006). Research, public policymaking, and knowledge-translation processes: Canadian efforts to build bridges. Journal of Continuing Education in the Health Professions, 26(1), 37–45.


Lemieux, L., & Champagne, F. (2004). Using knowledge and evidence in health care: Multidisciplinary perspectives. Toronto: University of Toronto Press.


Levin, B. (2004). Making research matter more. Education Policy Analysis Archives, 12(56), 1–20.


Levin, B. (2008). Thinking about knowledge mobilization: A discussion paper prepared at the request of the Canadian Council on Learning and the Social Sciences and Humanities Research Council.


Levin, B. (2011). Mobilising research knowledge in education. London Review of Education, 9(1), 15–26.


Mitton, C., Adair, C. E., McKenzie, E., Patten, S. B., & Perry, B. W. (2007). Knowledge transfer and exchange: Review and synthesis of the literature. The Milbank Quarterly, 85(4), 729–768.


Nature. (2013, October). Impact: The search for the science that matters. Retrieved from http://www.nature.com/news/specials/impact/index.html#editorial


Nutley, S., Walter, I., & Davies, H. (2007). Using evidence: How research can inform public services. Bristol: Policy Press.


OECD. (2007). Evidence in education: Linking research and practice. Paris: OECD.


Piwowar, H. (2013). Introduction: Altmetrics: What, why and where? Bulletin of the American Society for Information Science and Technology, 39(4), 8–9.


Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto. Retrieved from http://altmetrics.org/manifesto/


Sá, C. M., Li, S. X., & Faubert, B. (2011). Faculties of education and institutional strategies for knowledge mobilization: An exploratory study. Higher Education, 61(5), 501–512.


Southerland, S. A., Gadsden, V. L., & Herrington, C. D. (2014). Editors’ introduction: What should count as quality education research? Continuing the discussion. Educational Researcher, 43(7), 6–8. doi:10.3102/0013189X13519962


Suber, P. (2011). Open access overview (definition, introduction). Retrieved from http://legacy.earlham.edu/~peters/fos/overview.htm


Ware, M., & Mabe, M. (2012). The STM report: An overview of scientific and scholarly journal publishing. Oxford: International Association of Scientific, Technical and Medical Publishers. Retrieved from http://www.stm-assoc.org/2012_12_11_STM_Report_2012.pdf


Weiss, C. H. (1977). Research for policy’s sake: The enlightenment function of social research. Policy Analysis, 3(4), 531–545.





Cite This Article as: Teachers College Record, Date Published: June 17, 2014
https://www.tcrecord.org ID Number: 17570