Efficacy in action: Understanding and improving our customers’ experiences and outcomes

If I had to sum up “efficacy in action” in a few words, I would say it is identifying and enhancing what works to improve learning outcomes in real-life learning environments. During product development we use a tremendous amount of data to plan, design, create, and test our products. But we don’t stop there. We work with customers to understand how they use the products within their unique environments and identify how and to what extent our products impact teaching, learning, and student achievement. This is a challenging and rewarding process.

It would be nice to tell customers that they simply need to do x, y, and z to see improved learner success. In reality, it takes a great deal of time with teachers, professors, administrators, principals, and students to deeply understand the teaching and learning interaction, and the specific interventions, conditions, and actions that produce results. By the time we begin collecting implementation data, we have already had many conversations with educators and administrators about the challenges they face and their goals for the school, district, college class, or university program, so that we understand the context of the implementation and what they hope to achieve. Only then do we start to collect and analyze the data.

We have efficacy teams in K-12 and higher education working with scores of customers to understand product implementation and learner outcomes, and how, and to what extent, the two are related. These teams continually conduct case studies, using both qualitative and quantitative data to gain insight into teaching and learning and to provide replicable, adaptable models of improved outcomes.

For example, in K-12 schools, we studied the implementation of one of our products across a district with 20 elementary schools. Some schools used the product for the recommended 20 hours per year, per student, while others barely used it at all. We were surprised that the results showed no correlation between student time on task and gains in learner achievement. It was through interviews with teachers and administrators that we determined the school exhibiting the most growth had a robust system in place for using program-generated data to inform classroom instruction. As a result of the study, administrators implemented a district-wide data analysis and usage process. This is a great example of how quantitative and qualitative research together painted the complete picture and allowed us to provide a replicable model for other districts.

In higher education, we have been reporting customers’ experiences and results for more than 10 years, and we have developed more than 600 case studies for our MyLab and Mastering products. That volume of case studies enables us to identify trends common among the users experiencing the best results, and we have published those trends to help other institutions understand and follow recommended implementation best practices.

We gain valuable insight from each study because teachers, principals, coaches, deans, district administrators, and professors share with us how our products impact their institutions, their schools, their teaching, and their students’ learning. These insights inform further research, more meaningful conversations with and outcomes for customers, and product improvements. What I enjoy most is sharing meaningful, replicable insights with customers that help more learners make substantive and sustained progress.

Our efficacy team is staffed by professionals, many of them former teachers, who are deeply committed to learner success. We use the data we collect to create evidence-supported guidelines that help educators implement our products with fidelity, aligned with the goals and outcomes they believe are most critical for learner progress and success. Best of all, we work with great educators and educational institutions every day, engaging in deeper conversations about their goals and their students’ needs and charting a course together to reach them.

About the Author
John Tweeddale, SVP

John Tweeddale is senior vice president of efficacy and quality at Pearson North America. He has spent his entire career in educational publishing and technology, starting as a higher education sales representative before becoming sales manager and then director of marketing for Prentice Hall Engineering/Science/Math/Computing. In 2003, he moved to London to serve as director for higher education and professional education publishing, and then became national sales manager for Allyn & Bacon/Longman and later for the higher education humanities/social studies team. After a year as senior vice president/director of marketing strategy for the Arts & Sciences group, he accepted an assignment as chief customer experience officer for higher education.