#ocTEL Week 10 – Activity 10.1

Reflecting on evaluation instruments / tools.

Although I am quite familiar with evaluation tools such as Bristol Online Surveys, SurveyMonkey and the evaluative tools built into VLEs such as Blackboard, I have chosen to reflect on a bespoke use of Moodle to evaluate study days.

[Images of the study days website]

The Context and Tool

There was a need to demonstrate 'return on investment' for study days run by a department for NHS staff: both to show that the financial effort put into sending staff on them was worthwhile, and to understand whether they were having a positive impact on clinical practice. The aim was to replace face-to-face evaluations at the end of each study day with something more holistic.

The bespoke tool was created in Moodle, which allowed non-university staff to access it and gave flexibility in the structure.

It began with pre-reading before attendance at a study day, followed by a set of questions to gauge confidence and areas for improvement before attending.

After attending, some further resources were provided, and the same questions were asked again six weeks after attendance, together with some additional questions about how attendees felt the study day had affected their work.

A certificate of attendance could then be downloaded once all of this had been completed.
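
As a rough sketch of that gating logic only (the real workflow was configured through Moodle's own completion settings rather than written as code, so every name below is hypothetical):

    from dataclasses import dataclass, field

    # Stages in the order described above; the names are illustrative.
    STAGES = [
        "pre_reading",      # resources read before the study day
        "pre_questions",    # confidence / areas-for-improvement questions
        "study_day",        # face-to-face attendance
        "post_resources",   # further resources after attendance
        "post_questions",   # same questions ~6 weeks later, plus impact questions
    ]

    @dataclass
    class Participant:
        name: str
        completed: set = field(default_factory=set)

    def may_download_certificate(p: Participant) -> bool:
        # The certificate is only released once every stage is complete,
        # which was a key driver of engagement.
        return all(stage in p.completed for stage in STAGES)

    attendee = Participant("example")
    attendee.completed.update(STAGES[:4])
    print(may_download_certificate(attendee))  # False: post_questions outstanding
    attendee.completed.add("post_questions")
    print(may_download_certificate(attendee))  # True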

Effectiveness and Critique of the Evaluation

The evaluation would only be effective if people engaged with the process.

The pre-reading and later evaluation were also meant to bring a reflective component for the attendee as much as an evaluation of the study day.

Overall there was a reasonable level of engagement (around 70%), often driven by the fact that completion was the only way a certificate of attendance could be claimed.

For the full positive effect as an evaluation tool, and for the attendees' learning, it was important that the process took place at the appropriate times before and after the face-to-face study day. In some cases this did not happen, and whilst the final set of questions around impact was useful, the most useful element was the comparison between the answers given before and after.

People engaged with the tool and found it straightforward once some guidance was provided, despite some participants being quite nervous about computers and working online.

The methodology was quite simple, which allowed clear comparisons between responses given before and after attendance at a study day.
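
A minimal sketch of the kind of before/after comparison this made possible, assuming the paired responses had been exported from Moodle; the participants, questions and 1-5 scale below are all invented for illustration:

    # Hypothetical exported ratings: the same confidence questions asked
    # before the study day and again ~6 weeks after, on a 1-5 scale.
    pre = {"p1": {"q1": 2, "q2": 3}, "p2": {"q1": 3, "q2": 2}}
    post = {"p1": {"q1": 4, "q2": 4}, "p2": {"q1": 4, "q2": 3}}

    def mean_shift(question: str) -> float:
        # Average change in self-rated confidence for one question,
        # counting only participants who answered it both times.
        paired = [
            post[p][question] - pre[p][question]
            for p in pre
            if p in post and question in pre[p] and question in post[p]
        ]
        return sum(paired) / len(paired)

    for q in ("q1", "q2"):
        print(q, mean_shift(q))  # q1 1.5, q2 1.0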

The difficulty with measuring impact after the study day is that some of the effects may have been too subtle to show up in the set of questions asked, or may only have become apparent much later.

There was also a need to evaluate the evaluation method itself, to see whether engagement could have been increased and the process integrated more closely with the study days' face-to-face sessions.

ALT-C 2012 Presentation of this approach:

http://altc2012.alt.ac.uk/talks/28056
