Professional Doctorate in Education (EdD) research proposal:
Evidence-based decision making by course managers in higher education to improve student satisfaction
The Quality Assurance Agency for Higher Education (QAA) is the independent body entrusted with monitoring, and advising on, standards and quality in UK higher education (HE). Amongst other activities, it publishes the UK Quality Code for Higher Education, which sets out quality expectations that UK higher education providers must meet. During QAA scrutiny inspections, higher education institutions (HEIs) must demonstrate how they meet these expectations, supported by various forms of evidence.
Beyond the evidence required for QAA purposes, the recent government green paper on the development of a Teaching Excellence Framework (TEF) (Department for Business, Innovation and Skills, 2015) indicates that, alongside Research Excellence Framework (REF) league tables, UK HEIs will also be ranked according to learning and teaching-related metrics. The green paper proposes that success in the TEF will give HEIs the opportunity to increase their tuition fees. This is an obvious commercial advantage, so having strong evidence to support TEF applications will be very beneficial.
The TEF green paper proposes that National Student Survey (NSS) results, in particular the teaching quality and learning environment sections, will be one of the metrics against which HEIs are judged (Department for Business, Innovation and Skills, 2015). However, because these metrics are only proxies for teaching quality, HEIs will be expected to provide institutional evidence to balance the NSS results. It seems only logical that Evidence-based Practice (EBP) methods will support this endeavour.
Evidence-based practice has its roots in medicine. Rapid developments in the 1990s led to the seminal definition of evidence-based medicine by Sackett et al. (1996):
Evidence based medicine is the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients.
Over time, and with the development of further allied health professions, this definition has evolved to encompass a greater number of health care professions. The 2005 Sicily statement on evidence-based practice defines EBP as decisions about health care based on the best available current and valid evidence (Dawes et al., 2005). However, a common assumption is that “best evidence” is quantitative in nature, with the meta-analysis of randomised controlled trials as the gold standard (Rycroft-Malone et al., 2004).
This narrow view of what constitutes evidence has been criticised as exclusionary (Holmes, Murray, Perron and Rail, 2006) and as disregarding the interaction between research evidence and practitioner-generated evidence (Upshur, 1999). Morrell and Learmonth (2015) quite rightly argue that this attitude towards evidence is not limited to health care: even though Barends et al. (2014) inclusively state that in evidence-based management all evidence, regardless of its source, may be included if found trustworthy and relevant, the same authors then go on to value positivistic research characteristics more highly than phenomenological ones. This concern is shared by Davies (1999), who notes that the relevance of education research depends greatly on what questions are asked, in what context, and with what practical application in mind, and by Morrison (2001), who argues that the randomised controlled trial is not automatically the ideal solution. More recently, Biesta (2010) proposed moving away from evidence-based education altogether because of various deficits associated with the methodology.

However, I am still in favour of using an evidence-based practice approach to support decision making: as Wiseman (2010) argues, if the most appropriate evidence is used in an appropriate way, the quality of education and the student experience will benefit. In light of this, a more suitable definition of evidence is required, given that education and education management research is often multidisciplinary and/or cross-disciplinary. The following definition of evidence, proposed by Rycroft-Malone et al. (2004),
Evidence is the knowledge derived from a variety of sources that has been subjected to testing and has been found to be credible.
appears most inclusive and can easily be applied to HE course management. Indeed, where these authors list the evidence bases for health care as “research; clinical experience; patients, clients and carers; and local context and environment”, the evidence bases for HE course managers can be adapted without much trouble to “research; teaching and management experience; students and (academic) staff; and local context and environment”. This more inclusive approach to what constitutes evidence is supported by Noyes et al. (2011), who argue that evidence from qualitative studies can play an important role in adding value to systematic reviews for policy, practice and consumer decision making. Furthermore, these authors find that many qualitative methods of evidence synthesis are appropriate to the aims and scope of a Cochrane systematic review. Therefore, rather than focussing on the meta-analysis of randomised controlled trials, it is perhaps more appropriate to widen the scope and consider systematic reviews compiled from high-quality original research of mixed methodology the gold standard for evidence-based decision making in education and education management.
Aim and focus of the study
The aim of this project is to provide insight into applying EBP methods to decision making by course scheme managers in higher education, with particular attention to understanding student satisfaction with their course. The project will result in recommendations for academic and senior management staff on how to better engage with and further develop student satisfaction, based on sound, well-understood evidence.
This study will focus on a newly developed four-year integrated master's course at Writtle College, which offers the opportunity of starting from a “clean slate”. The study will approach EBP at both levels described by Davies (1999): the current literature will be reviewed systematically according to Campbell Collaboration (2009) and Cochrane Collaboration (Noyes et al., 2011) guidelines in order to understand the currently available evidence base surrounding student satisfaction, and new evidence will be generated from locally sourced data to fill identified knowledge gaps in a manner relevant to Writtle College.
Personal, employer and professional practice perspective
As a course scheme manager, I am responsible for the day-to-day management of undergraduate courses, including responding to senior management requests for summary reports, annual course reviews, periodic degree reviews, and the analysis of internal satisfaction surveys and NSS results. Although I believe I am doing a reasonable job, I strongly feel that a deeper understanding of the available evidence would help me develop as a practitioner and practitioner-researcher. Additionally, I am looking to challenge myself with the goal of achieving a level 8 qualification.
Writtle College has traditionally struggled with student satisfaction. The academic year 2014–2015 proved particularly difficult, however, and the College fell significantly in the rankings. Although several theories circulate, nobody really knows why this happened. I strongly feel that an evidence-based approach to course management would help the College understand the various student-related metrics better.
Finally, the wider profession would benefit from the findings of this project because of the changing nature of higher education in the UK: rankings and league tables will become ever more important, and for small specialist institutions (like Writtle) that lack the data-processing units of large universities, doing well will become a challenge. A greater understanding of how evidence-based course management can be applied to undergraduate and postgraduate courses would therefore help course managers across small specialist HEIs.
References
Barends, E., Rousseau, D.M. and Briner, R.B., 2014. Evidence-based Management: The basic principles. [online] Amsterdam: Center for Evidence-based Management. Available at: <http://www.cebma.org/wp-content/uploads/Evidence-Based-Practice-The-Basic-Principles-vs-Dec-2015.pdf> [Accessed 18 Feb. 2016].
Biesta, G.J.J., 2010. Why ‘What Works’ Still Won’t Work: From Evidence-Based Education to Value-Based Education. Studies in Philosophy and Education, 29(5), pp.491–503.
Davies, P., 1999. What is Evidence-based Education? British Journal of Educational Studies, 47(2), pp.108–121.
Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., Porzsolt, F., Burls, A. and Osborne, J., 2005. Sicily statement on evidence-based practice. BMC Medical Education, 5, p.1.
Department for Business, Innovation and Skills, 2015. Fulfilling our potential: teaching excellence, social mobility and student choice. London: The Stationery Office.
Holmes, D., Murray, S.J., Perron, A. and Rail, G., 2006. Deconstructing the evidence-based discourse in health sciences: truth, power and fascism. International Journal of Evidence-Based Healthcare, 4(3), pp.180–186.
Morrell, K. and Learmonth, M., 2015. Against Evidence-Based Management, for Management Learning. Academy of Management Learning & Education, 14(4), pp.520–533.
Morrison, K., 2001. Randomised Controlled Trials for Evidence-based Education: Some Problems in Judging ‘What Works’. Evaluation & Research in Education, 15(2), pp.69–83.
Noyes, J., Popay, J., Pearson, A., Hannes, K. and Booth, A., 2011. Qualitative research and Cochrane reviews. In: J.P.T. Higgins and S. Green, eds., Cochrane Handbook for Systematic Reviews of Interventions, version 5.1.0. [online] The Cochrane Collaboration. Available at: <http://handbook.cochrane.org/> [Accessed 21 Feb. 2016].
Rycroft-Malone, J., Seers, K., Titchen, A., Harvey, G., Kitson, A. and McCormack, B., 2004. What counts as evidence in evidence-based practice? Journal of Advanced Nursing, 47(1), pp.81–90.
Sackett, D.L., Rosenberg, W.M.C., Gray, J.A.M., Haynes, R.B. and Richardson, W.S., 1996. Evidence based medicine: what it is and what it isn’t. BMJ, 312(7023), pp.71–72.
The Campbell Collaboration, 2009. The Campbell Collaboration Resource Center. [online] The Campbell Collaboration. Available at: <http://www.campbellcollaboration.org/resources/training.php> [Accessed 22 Feb. 2016].
Upshur, R.E.G., 1999. Priors and Prejudice. Theoretical Medicine and Bioethics, 20(4), pp.319–327.
Wiseman, A.W., 2010. The Uses of Evidence for Educational Policymaking: Global Contexts and International Trends. Review of Research in Education, 34(1), pp.1–24.