Best Practices in Clinical Education

Evaluations


Several tools are available for administering evaluations.

  • Children's University – contact Cindy Dries for assistance
  • SurveyMonkey – contact the Quality Improvement department
  • AutoData – contact Jean Rudzik for assistance

Evaluation Question Guidelines

The definition of evaluation is "an appraisal of the value of something" (http://www.thefreedictionary.com/evaluation).

For education/training, we can determine the value of the education by checking in with learners to see whether they agree that they have met the objectives identified for the education. Evaluation is important for two reasons: it tells you whether learners rate themselves as having met the objectives, and it helps uncover the barriers and frustrations they encountered along the way. The education developer can then examine those frustrations, identify the step in the education process that caused them, and alter it in the future to increase the likelihood that the objectives will be met.

  • Evaluation data must be reviewed regularly for ongoing programming and in planning future programs that may be similar.
  • Evaluation data needs to be planned for in every education offering in some way.
  • When developing a new program, it is most helpful to trial the program or education with end-users before the final launch and get their feedback, either in writing or verbally.
  • Even sitting with the end-user to get timing and immediate feedback is very helpful in some situations.
  • Using SurveyMonkey or other evaluation tools helps organize the feedback into a format that is accessible and understandable.
  • Questions should address how end-users feel, how the education will impact their area of work, and what related topics they would like to learn more about in the future. For example:
 

- Did the education meet the stated objectives?
- Was the education relevant to my area of work?
- The content of the education was (a) excellent (b) very good (c) good (d) fair (e) poor.
- The visual presentation of the material/class was (a) excellent (b) very good (c) good (d) fair (e) poor.
- I was able to complete the education in the allotted time. (yes/no)
- What information did you learn that you plan to use on the job?
- The best feature of the education was...
- The worst feature of the education was...
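To see how responses to rating-scale questions like the ones above could be summarized, here is a minimal Python sketch. The response data and the "favorable" cutoff (excellent or very good) are hypothetical illustrations, not output from any specific tool:

```python
from collections import Counter

# Five-point scale used by the example questions above.
SCALE = ["excellent", "very good", "good", "fair", "poor"]

def summarize(responses):
    """Tally rating-scale responses and report the share of
    'excellent' or 'very good' answers as a percentage."""
    counts = Counter(responses)
    total = len(responses)
    favorable = counts["excellent"] + counts["very good"]
    return {
        "counts": {rating: counts.get(rating, 0) for rating in SCALE},
        "favorable_pct": round(100 * favorable / total, 1),
    }

# Hypothetical responses to "The content of the education was ..."
responses = ["excellent", "very good", "good", "very good", "fair", "excellent"]
summary = summarize(responses)
print(summary["counts"])
print(summary["favorable_pct"])  # 66.7
```

A tabulation like this makes it easy to compare offerings over time or spot a question where ratings cluster at "fair" or "poor," which signals a step in the education process worth revising.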

Parts of the education process that you may change based on the evaluation:

  • Objectives: Learners may let you know that an objective is not in line with policy and procedure.
  • Length/format of the education: The goal is to get to the key points in an interactive way that keeps learners' attention and intrinsically motivates them to implement the education.
  • Test questions/cases: Test questions may not be the best way to judge whether the objectives were met. Poorly written questions may be frustrating or easy to guess, which devalues the education and alienates learners from wanting to participate in future education/training. Please refer to the Test Question Guidelines section.

Uses for evaluation data:

  • Check the data monthly and update the education.
  • Review the data prior to developing similar education to be proactive.
  • Use for research.

Resources

Developing Valid Level 2 Evaluations by Ken Phillips (ASTD Connector / November 2009)

Measuring Learning Results: Creating fair and valid assessments by considering findings from fundamental learning research (QuestionMark)