Evaluation

A central challenge for HE providers is evaluating the impact of their policies and practices; close behind it is the need to use information, data and research to create lasting change. We use a 'logic chain' approach to identify the intended causal relationships between activities, outputs, outcomes and impact. Having established the logic chain, we use mixed methods to evaluate impact and provide insights into what has been effective, and why. We can work with you to improve your use of data, evaluate impact and strengthen planned and continuing activities. We draw on research and theory in the following areas:

• Theory of change
• Logic chains
• Logical frameworks
• Impact of activity and effective practice
• Change management

Relevant publications

Thomas, L. and Jones, R. (2012) Using data: An evidence-based approach to improving transition, induction and retention. Birmingham: National HE STEM Programme.

Thomas, L. et al (2001) Widening Participation: Evaluation of the collaboration between higher education institutions and further education colleges to increase participation in higher education. Bristol: HEFCE.

Thomas, L. and Slack, K. (2002) 'Developing an Evaluation Framework to assess the contribution of community and work based learning', Research in Post-compulsory Education, 8.2, pp. 19-38.

Thomas, L. and Slack, K. (2000) Aiming High Evaluation 1999-2000. Stoke-on-Trent: Institute for Access Studies.

Thomas, L. et al (1999) Staffordshire Strategic Partnership Evaluation Report. Stafford: Stafford College.

Thomas, L. (2000) '"Bums on Seats" or "Listening to Voices": the nature and role of evaluation research in the "Widening Participation Agenda" of the 1990s and beyond', Studies in Continuing Education, 22.1, pp. 95-113.

Bowes et al (2012-13) Outcomes of the formative evaluation of the National Scholarship Programme. Reports available from http://www.hefce.ac.uk/pubs/rereports/year/2013/nspeval/