Guilford, New York, 2009

ISBN 978-1-59385-872-8

With the rapid proliferation of distance education and e-learning courses, the need is growing for a comprehensive, professional approach to evaluating their effectiveness. This indispensable book offers a road map to guide evaluation practice in these innovative learning environments. Providing practical, step-by-step guidelines and tools for conducting evaluation studies (including how to deal with stakeholders, develop surveys and interview protocols, collect other scientific evidence, and analyze and blend mixed-methods data), the work also features a template for writing high-quality reports. The "unfolding model" developed by the authors draws on Messick's influential assessment framework and applies it to program evaluation. Two case studies of actual programs (a distance learning course and an e-learning course) demonstrate the unfolding model in action.


Valerie Ruhe is an Evaluation Studies Specialist at the University of British Columbia. Previously, she was an Assessment and Evaluation Consultant at the Center for Teaching and Learning, University of Minnesota. She has 10 years of professional program evaluation experience in distance education, K-12, and higher education.

Bruno D. Zumbo is Professor of Measurement, Evaluation, and Research Methodology, and of Statistics at the University of British Columbia. He is widely published in research methodology, validity, and validation processes, as well as statistical science and program evaluation methodology. His work has had wide-ranging influence across many fields in the social, educational, and health sciences.

Contents:

1. Why Do We Need a New Approach to Evaluation in Distance Education and E-learning?

Distance Education versus E-Learning
The Rapid Expansion of Distance Education and E-Learning
What Is Evaluation?
Why Do We Need a Professional Approach to Evaluation?
What Does a Professional Approach to Evaluation Look Like?
Responding to the Call for a Professional Evaluation Approach: The Unfolding Model
Conclusions: Our Approach to Evaluation

2. The Theory and Practice of Program Evaluation

Why Are Program Evaluation Models Important?
Classification Frameworks for Program Evaluation Models
Alkin and Christie's Evaluation Tree: The Roots and Branches
Where Do We Diverge from Alkin and Christie?
Conclusions

3. Evaluation Theory and Practice in Distance Education and E-Learning

Evaluation Theory
Models with Scientific Evidence in the Foreground
Models Based on Evidence, Values, and Consequences
Models Based on Messick's (1989) Framework
A Summary of Evaluation Models in Distance Education
Evaluation Studies
Evaluation Practice
Do Unintended Consequences Emerge in Authentic Evaluation Studies?
Conclusions

4. Messick's Framework: What Do Evaluators Need to Know?

The Overlap between Test Validity and Program Evaluation
Messick's Contributions
Messick's Framework
The Overlap among the Four Facets
The Controversy over Unintended Consequences
Implications for Evaluation
Conclusions

5. Getting Started

Planning the Evaluation Study
The Ethics Review Process
The Political Context of Evaluation
Using the Unfolding Model as a "Road Map"
Mixed Methods: Blending Quantitative and Qualitative Data
What Is Essential to Our Approach?
Tailoring the Unfolding Model to Your Needs
Conclusions

6. The Unfolding Model: Scientific Evidence

Definition of "Scientific Evidence"
Scientific Evidence
How to Write Good Survey Questions
Using the Unfolding Model to Write Survey and Interview Questions
Administering Surveys
Analyzing Survey Data
Qualitative Data: Interviews, Focus Groups, and Online Ethnographies
Qualitative Data Analysis
Outcomes
The Evaluation of Environmental Quality
Relevance
Cost-Benefit Analysis
Bringing It All Together: Mixed Methods
Conclusions

7. The Unfolding Model: Values and Consequences

Underlying Values
How to Identify the Values Underlying Your Course
Course Goals and Objectives
Writing Survey/Interview Questions about Underlying Values
Analyzing Data on Underlying Values
Unintended Consequences
How Can You Identify Unintended Consequences?
Writing Survey/Interview Questions about Unintended Consequences
Analyzing Data on Unintended Consequences
How to Enhance the Validity of Your Findings
Recommendations for Course Improvement
Writing the Evaluation Report
Conclusions

8. Findings from Two Authentic Case Studies

Methods and Procedures for Both Studies
Distance Learning: Computing Science 200 (CPSC 200)
E-Learning: Professional Writing 110 (PWRIT 110)
Conclusions

9. Bringing It All Together

Using Messick's Framework to Evaluate Distance and E-Learning Courses
Using the Unfolding Model to Evaluate Your Courses
Conducting Your Evaluation Study
What Have We Learned from Two Case Studies?
E-Learning and Beyond: Is the Unfolding Model the Last Word?
The Future of Distance Education and E-Learning
The Future of the Unfolding Model
Conclusions

Appendix A. Summary of the 1994 Program Evaluation Standards

Appendix B. Glossary

Appendix C. List of Associations

Reviews:

"Ruhe and Zumbo have written the premier text for evaluation of distance education and e-learning. This is the first theoretically grounded, comprehensive guide for conducting rigorous process and outcome studies of one of the fastest growing segments of curriculum development and education. It will be valuable to those involved in evaluating innovative educational practices and programs, today and for years to come. The book is unique in presenting both evaluation theory and practice, making it an excellent course text and practical resource."

-Christina A. Christie, PhD, School of Behavioral and Organizational Sciences, Claremont Graduate University

"I found this book intriguing and worthy of thoughtful discussion by students, practitioners, and theorists. It provides a good introduction to program evaluation in the context of distance education and e-learning, and to the authors' unfolding model. I will definitely use this book in the four graduate courses I teach on evaluation and qualitative inquiry. The book will help our Instructional Psychology and Technology students sort through the issues, form their own theories, and refine their practices. Students need this kind of deep discussion of the role of stakeholder values, how to respond dynamically to continual changes in technology, and how to integrate evaluation into the design process to enhance their distance education and e-learning processes and products."

-David Dwayne Williams, PhD, Department of Instructional Psychology and Technology, Brigham Young University

"A much-needed text. While there are resources available to aid in the design of e-learning materials, there is very little out there focusing on how to document their effectiveness. This book presents an approach to evaluation and also provides the theory that underlies that approach. Graduate students in evaluation and instructional technology will find it a useful resource because of its balance of theory and practice. Anyone involved in designing or evaluating e-learning or technology-supported instruction needs to take a look at this book!"

-Dianna Newman, PhD, Department of Educational and Counseling Psychology, University at Albany, State University of New York

"This book should have a significant impact on the field. Distance education is no longer on the periphery of teaching and learning; rather, it is now widely practiced and applied. This book is well written, clearly explained, and relevant to all those who are interested in best practices in distance education and e-learning."

-Michael Simonson, PhD, Instructional Technology and Distance Education Program, Nova Southeastern University