MyLab Math educator study explores benefits of implementing personalization tools in Quantitative Literacy course at Guilford Technical Community College

Key Findings

  • Results show a very strong correlation between Mastery Points earned in MyLab Math’s personalized Companion Study Plan and final course grades.
  • Students who missed nine or fewer Quiz Me’s performed significantly better on every course performance metric, including scoring 15 percentage points higher on final course grade.
  • In a voluntary, end-of-semester student survey, 87 percent or more of responders had positive reactions to several questions about the course’s structure and the Companion Study Plan.
  • In that same survey, students ranked Sutton’s pacing guide highest, followed by quizzing over material multiple times, followed by MyLab Math’s Study Plan. Lecture and working in groups ranked fourth and fifth, respectively.

School name
Guilford Technical Community College, NC

Course name
Quantitative Literacy

Course format
Face to face and hybrid

Course materials
MyLab Math for Thinking Mathematically by Blitzer

Timeframe
Fall 2016–Spring 2017

Educator
Mike Sutton, Instructor

Results reported by
Traci Simons, Pearson Customer Outcomes Analytics Manager

Setting

As the third largest of 58 community colleges in the North Carolina Community College System, Guilford Technical Community College (GTCC) serves more than 40,000 students annually from its Jamestown, Greensboro, High Point, Aviation, and Donald W. Cameron campuses as well as its Small Business Centers in Greensboro and High Point.

The school’s website reports the following quick facts for the 2015–2016 school year:

  • Enrollment (curriculum): 15,281 unduplicated
  • Gender: 56.4 percent female, 43.6 percent male
  • Ethnicity: 42.3 percent Caucasian, 39.5 percent African American, 8.1 percent Hispanic, 10.1 percent Other/Unknown
  • Average age: 27
  • Enrollment status: 55 percent part time, 45 percent full time
  • Average class size: 20
  • Average credit hours carried: 10

Students at GTCC represent 110 countries throughout the world. The following are the top five countries of origin after the United States of America: Mexico, Vietnam, Pakistan, Sudan, Nigeria.

About the Course

The Quantitative Literacy course at Guilford Tech is designed to engage students in complex and realistic situations involving the mathematical phenomena of quantity, change and relationship, and uncertainty through project- and activity-based assessment. Emphasis is placed on authentic contexts that introduce the concepts of numeracy, proportional reasoning, dimensional analysis, rates of growth, personal finance, consumer statistics, practical probabilities, and mathematics for citizenship. Upon completion, students should be able to use quantitative information as consumers and to make personal, professional, and civic decisions by decoding, interpreting, using, and communicating quantitative information found in modern media and encountered in everyday life. Prerequisite requirements for the course include satisfactory completion of all five developmental math modules and developmental reading.

Challenges and Goals

Since Quant Lit is a course for non-STEM majors, students are often apathetic toward it and simply want to get in and out successfully so they can move on to courses in their major. In addition, Instructor Mike Sutton has found that the distribution of students in this course tends to be bimodal when it comes to preparedness: some are well prepared, and others are very underprepared. Those who are underprepared are not comfortable learning math or feel that they have a lower math aptitude than students in STEM-track math courses such as Precalculus. Sutton, whose background is in instructional design, felt good about his teaching of the subject, but he sensed that students were just going through the motions without thinking about why what they did worked.

Sutton is a big believer in learning through assessment, so when MyLab Math released the Companion Study Plan and the Quiz Me’s, he decided to implement the new features with four goals in mind:

  1. He wanted his students to learn more deeply than broadly.
  2. Since Sutton had found his students’ abilities to be bimodal when it comes to preparedness, he wanted to implement some group work to allow prepared students to help underprepared students.
  3. He wanted students to revisit material across several stages, with at least 24 hours between assessments, to reinforce their understanding.
  4. He wanted students to enjoy the course and feel competent at math.

Implementation

Sutton started to create his MyLab Math course by mapping Pearson’s Student Learning Outcomes (SLOs) to his own. He typically chose to cover about half the SLOs suggested by Pearson based on what he knew students would be learning later and those he felt were most critical.

Every section starts with a concept quiz associated with SLOs. Sutton pulls some questions for the quiz from other textbooks and uses others written by the department and himself. Performance on the concept quiz informs the personalized Companion Study Plan in MyLab; students then work in the Study Plan and complete the assigned Quiz Me’s. Students must earn 100 percent on the short concept quiz and master the associated study plan before attempting a skills quiz.

Sutton created a prerequisite in MyLab so that students have to meet a certain threshold on Quiz Me’s before they can take the section’s skills quiz. At first, Sutton required students to earn 100 percent on the Quiz Me’s; however, many students found this to be too high. In a voluntary, end-of-semester survey in Fall 2016 (50 percent response rate), only 58 percent of responders agreed that the Quiz Me’s were a good indication of their understanding of the concept. Some comments about the 100 percent requirement included:

  • “I got frustrated with the Quiz Me’s when there were 4 or 5 of them and I would have to do them multiple times even if I only missed one question.”
  • “The questions were good, but the fact that you had to score a hundred to advance to the actual quiz that is for a grade is very counter productive.”

After receiving this feedback at the end of the first semester of implementing the Companion Study Plan, Sutton backed the Quiz Me requirement down to 87 percent. In a voluntary, end-of-semester student survey in Spring 2017 (86 percent response rate), 90 percent of responders agreed that the Quiz Me’s were a good indication of their understanding of the concept and 77 percent felt that the 87 percent requirement on the Quiz Me’s was high enough to ensure that they mastered the subject. None of the responders in the Spring 2017 survey mentioned that the new requirement was too high or cumbersome.

Once students complete the Quiz Me’s to 87 percent, they can take the section’s skills quiz and then move on to the next section’s concept quiz. Every two or three sections, Sutton also assigns a small summary quiz that contains 12–14 questions and re-quizzes the students on SLOs from previous sections. Summary quizzes include two questions per SLO. Students can retake summary quizzes for a better grade as long as they get at least 50 percent (1 of 2 questions correct on each SLO) the first time. If they score below 50 percent on an SLO, they must re-demonstrate mastery of the SLO by passing the associated Quiz Me with 87 percent before they can retake the summary quiz. No quizzes are timed, but all have due dates. If students don’t submit a quiz before the due date, they receive a grade of zero for that quiz. Throughout the semester, students also take four tests in MyLab that are no more than 20 questions. The tests are untimed and students get two attempts.
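
A minimal sketch of this gating logic is shown below, using hypothetical score data and made-up function names; it models the thresholds described in the case study and is not MyLab Math’s actual prerequisite engine.

```python
# Hypothetical model of the section workflow described above.
# Thresholds come from the case study; names and data structures are illustrative.

QUIZ_ME_THRESHOLD = 0.87   # minimum Quiz Me score to unlock the skills quiz
SLO_RETAKE_FLOOR = 0.50    # minimum per-SLO summary quiz score to retake directly

def can_take_skills_quiz(concept_quiz: float, quiz_me_scores: list[float]) -> bool:
    """Students need 100% on the concept quiz and 87% on each assigned Quiz Me."""
    return concept_quiz == 1.0 and all(s >= QUIZ_ME_THRESHOLD for s in quiz_me_scores)

def can_retake_summary_quiz(slo_scores: dict[str, float],
                            quiz_me_retakes: dict[str, float]) -> bool:
    """A summary quiz can be retaken if every SLO was at least 50% correct the
    first time, or if any weaker SLO has since been re-mastered at 87% in its Quiz Me."""
    for slo, score in slo_scores.items():
        if score < SLO_RETAKE_FLOOR and quiz_me_retakes.get(slo, 0.0) < QUIZ_ME_THRESHOLD:
            return False
    return True

# Example: Quiz Me's passed at 90% and 88%; one SLO was missed entirely on the
# summary quiz, but its Quiz Me was re-passed at 90%.
print(can_take_skills_quiz(1.0, [0.90, 0.88]))                     # True
print(can_retake_summary_quiz({"SLO-3": 0.0}, {"SLO-3": 0.90}))    # True
```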

To ensure students stay up to date on their MyLab assignments, Sutton provides them with a pacing guide. The pacing guide appears in the syllabus and on the customized MyLab landing page, so students see it every time they log in. Students must keep up with the class in order to participate in group work. The due dates posted in MyLab are the latest dates that assignments are accepted. Sutton checks student progress against the pacing guide ten times a semester; if a student is behind on a day he checks, it counts as a “pacing guide absence.”

In addition to completing paper-and-pencil midterm and final exams in class, students also complete group work during class time. To encourage collaboration and peer learning, Sutton often pairs students who have mastered the SLOs with those who haven’t. In Spring 2017, as part of a campus-wide experiment, Sutton taught one four-hour section of Quantitative Literacy and one three-hour section. Typically, Sutton spends 40–50 minutes lecturing; then, in the four-hour section only, the class takes a 10–15 minute break and moves into a computer lab. Once in the lab, Sutton encourages students to try the section’s Quiz Me’s right after lecture so that the material is fresh in their minds. He states, “I want them to see what they know right now, rather than spending hours practicing for it. Taking the Quiz Me right away helps them focus on what they don’t know and saves them time.”

Assessments

  • 40% MyLab Math quizzes
  • 25% Final exam
  • 25% Midterm exam
  • 10% Tests

Sutton adds an attendance adjustment to students’ final grades at the end of the semester. Students start with a +3 percentage point adjustment to their final grade and 1 percentage point is subtracted for each class absence, up to a maximum of -3. The same is true for the pacing guide. Students start with +3 points and 1 point is subtracted for each pacing guide absence, to a maximum of -3.
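
As an illustration of how this grading scheme comes together, the sketch below combines the component weights with the two +/-3 point adjustments. The weights and adjustment rules come from the case study; the function and variable names, and the sample scores, are hypothetical.

```python
# Illustrative calculation of the grading scheme described above.
WEIGHTS = {"mylab_quizzes": 0.40, "final_exam": 0.25, "midterm_exam": 0.25, "tests": 0.10}

def adjustment(absences: int) -> float:
    """Start at +3 points, lose 1 point per absence, bottom out at -3."""
    return max(3 - absences, -3)

def final_grade(scores: dict[str, float], class_absences: int, pacing_absences: int) -> float:
    """Weighted course average plus the attendance and pacing guide adjustments."""
    weighted = sum(WEIGHTS[part] * scores[part] for part in WEIGHTS)
    return weighted + adjustment(class_absences) + adjustment(pacing_absences)

# Example: averages of 85/78/80/88, one class absence, no pacing guide absences.
grade = final_grade({"mylab_quizzes": 85, "final_exam": 78, "midterm_exam": 80, "tests": 88},
                    class_absences=1, pacing_absences=0)
print(round(grade, 1))  # 87.3  (82.3 weighted average + 2 + 3)
```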

Results and Data

Overall, Sutton believes that the combination of working together in the lab during class time and the change to the prerequisite requirements on quizzes has made his four-hour Quantitative Literacy section more successful. He also sees an improvement in his three-hour section compared to previous years, when he did not structure the course to continually assess students.

The first analysis reviewed the impact of Quiz Me’s on course performance. Students from Sutton’s two Fall 2016 and two Spring 2017 sections were grouped based on the mean number of missed Quiz Me’s, which, across Sutton’s four sections, was nine out of 64. Students who did not take the final exam were excluded from this analysis. Figure 1 shows students’ course letter grades based on the mean number of missed Quiz Me’s. Figure 2 shows the average scores for each course component for the two groups. Students who missed nine or fewer Quiz Me’s scored higher in every category, and t-tests performed on four of the categories showed that each difference is statistically significant (notes 1–4). A minimal sketch of this kind of comparison appears after the list below.

  • Sutton’s courses had an overall success (ABC) rate of 66 percent; this calculation included all students, even those who did not take the final or who withdrew from the course.
  • Every student who missed nine or fewer of the 64 Quiz Me’s earned an A, B, or C in the course (figure 1).
  • Students who missed nine or fewer of the 64 available Quiz Me’s scored higher on overall course grade, final and midterm exam, and test averages compared to their counterparts who missed more than nine Quiz Me’s (figure 2):
    • 16 percentage points higher on overall course grade, t(79)=6.32, p<.05 (note 1)
    • 9 percentage points higher on final exam, t(79)=2.58, p<.05 (note 2)
    • 9 percentage points higher on midterm exam, t(79)=2.54, p<.05 (note 3)
    • 15 percentage points higher on average test scores, t(79)=5.18, p<.05 (note 4)
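
The sketch below shows the kind of two-sample comparison reported above, using SciPy and synthetic placeholder scores drawn loosely around the group sizes in the footnotes; it is not the study’s data or analysis code.

```python
# A minimal sketch of a pooled two-sample t-test of the kind reported above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder overall-grade samples for the two groups (missed <= 9 vs. > 9 Quiz Me's).
missed_few = rng.normal(loc=86, scale=7, size=54)
missed_many = rng.normal(loc=70, scale=15, size=27)

# Student's (pooled-variance) t-test; df = 54 + 27 - 2 = 79, as in the footnotes.
t_stat, p_value = stats.ttest_ind(missed_few, missed_many, equal_var=True)
print(f"t(79) = {t_stat:.2f}, p = {p_value:.4f}")
```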

Figure 1. Course Letter Grade Based on Mean Number of Missed Quiz Me’s, Fall 2016–Spring 2017 (n=81)

Figure 2. Course Performance Based on Mean Number of Missed Quiz Me’s, Fall 2016–Spring 2017 (n=81)

A correlation analysis was also performed on Sutton’s four sections. Correlations do not imply causation; they measure the strength of the relationship between two variables, where r is the correlation coefficient. The closer a positive r value is to 1.0, the stronger the correlation. The corresponding p-value measures the statistical significance of the correlation; a p-value <.05 indicates that the positive correlation is statistically significant. (A minimal sketch of this kind of calculation follows the list below.)

Quizzes, tests, and overall course grade had strong correlations to mastery points earned in the Study Plan:

  • A strong positive correlation exists between mastery points and average concept quiz scores where r=.64, p<.05
  • Very strong positive correlations exist between mastery points earned and average skills quiz scores where r=.88, p<.05, average summary quiz scores where r=.80, p<.05, average test scores where r=.85, p<.05, and average overall course grade where r=.82, p<.05.
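
The sketch below illustrates how such a Pearson correlation might be computed with SciPy; the mastery-point and grade data are fabricated for the example and are not the study’s records.

```python
# A minimal sketch of a Pearson correlation of the kind reported above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mastery_points = rng.uniform(0, 100, size=81)
# Fabricate course grades loosely related to mastery points, purely for illustration.
course_grades = 0.5 * mastery_points + 40 + rng.normal(0, 8, size=81)

r, p = stats.pearsonr(mastery_points, course_grades)
print(f"r = {r:.2f}, p = {p:.4f}")
```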

Figures 3 and 4 are scatter plots depicting the correlation between mastery points earned and average test score (figure 3) and overall course score (figure 4). Since mastery points earned in the study plan are not part of the final course grade, these correlations may indicate that work in the study plan can be a leading indicator for course performance.

Figure 3. Correlation Between Mastery Points Earned and Average Test Score, Fall 2016–Spring 2017 (n=81)

Figure 4. Correlation Between Mastery Points Earned and Overall Course Score, Fall 2016–Spring 2017 (n=81)

Finally, final exam scores from Sutton’s personalized MyLab Math sections (Fall 2016–Spring 2017) were compared to those from his Spring 2013 section, when he did not use those features. Students in the Spring 2013 section averaged 73 percent on the final exam, while students in the Fall 2016–Spring 2017 sections averaged 74 percent. Students who did not take the final were not included in this analysis. While the difference is only one percentage point, Sutton considers it meaningful because the rigor of his final exam has increased substantially since 2013 and he no longer curves the final exam grade, as he did in 2013. Sutton states, “I would expect if we analyzed the pre-curve final exam grades, we would see a definite difference in performance.”

Sutton is very pleased with the results of the analysis. Anecdotally, many of Sutton’s students mentioned that they appreciate the rigor that his structure requires. One student emailed this to Sutton:  “I do try to enjoy math. I really have not been able to understand it so well until your class. I am very surprised at how well I’ve been able to understand this class because in the past I have struggled with math.”

Sutton loves that students who previously didn’t feel they could do math now feel they have proved they can learn at a high level, even when a lot was expected of them.

For Sutton, the feeling of accomplishment that students have gained is reward enough. “With students at this level, instructors may often pick very few application problems where the reading is a little simpler and there aren’t tons of different variables to consider and assign those because you don’t have to worry about students going into a STEM field. But this mastery-based structure really doesn’t allow for that. If you choose an SLO the students need to learn, then they have to learn it well enough to get through all the Quiz Me’s, etc. to move forward.”

The Student Experience

Student feedback is important to Sutton, as evidenced by his changing the prerequisite score requirements based on early student feedback. In a Spring 2017 voluntary student survey (72 percent response rate), responses regarding the course structure were much more positive than the previous survey in Fall 2016 when Sutton was still requiring 100 percent mastery.

  • 90 percent of students agreed that doing work in the Study Plan helped them better prepare for exams (80 percent Fall 2016).
  • 87 percent of students wished they could use the Study Plan in other classes like they did in Sutton’s class (80 percent Fall 2016).
  • 87 percent of students liked the way the course was structured (80 percent Fall 2016).
  • 87 percent of students said that obtaining Mastery Points in the Study Plan gave them confidence in their ability to do math (60 percent Fall 2016).
  • 93 percent of students said they would recommend their instructor continue having students use the Study Plan (80 percent Fall 2016).

Given Sutton’s background in instructional design and his belief in learning through assessment, he is most proud that his students acknowledged the part repeated assessment played in their success:

  • 90 percent of students recognized that the course was structured in such a way that they worked with and were quizzed over concepts more than once (80 percent Fall 2016); and
  • 97 percent of students said they believed that seeing the concepts more than once helped them learn the material better (80 percent Fall 2016).

Sutton states, “They got it! Right there, I know that, while they may not have liked how many quizzes they had to take or the parameters I set, especially in Fall 2016, they still acknowledge that in the end, it helped them and was good for them. That’s a win for me.”

In addition, Spring 2017 students were asked to rank parts of the course according to how much each helped them. Items ranked 1 (most helpful) received 5 points, items ranked 2 received 4 points, and so on. Sutton’s pacing guide ranked highest, followed by quizzing over material multiple times and then MyLab Math’s Study Plan. Lecture and working in groups ranked fourth and fifth, respectively.
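
A quick illustration of this scoring arithmetic, using made-up vote counts rather than the survey’s actual tallies:

```python
# Convert rank votes (1 = most helpful ... 5 = least helpful) into points:
# rank 1 -> 5 points, rank 2 -> 4 points, ..., rank 5 -> 1 point.
def rank_points(rank_votes: dict[int, int]) -> int:
    return sum((6 - rank) * votes for rank, votes in rank_votes.items())

# Hypothetical tallies for one course component.
print(rank_points({1: 10, 2: 8, 3: 5, 4: 3, 5: 2}))  # 10*5 + 8*4 + 5*3 + 3*2 + 2*1 = 105
```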

Students were also asked to rate the different aspects of the course according to how helpful they were:

  • 93 percent said working in the MyLab Math Study Plan was helpful or very helpful to them learning the material.
  • 93 percent said the lecture was helpful or very helpful to them learning the material.
  • 90 percent said taking quizzes over material to assess their understanding was helpful or very helpful to them.
  • 77 percent said they found the pacing guide helpful or very helpful in making sure they were prepared for class.
  • 73 percent said working in groups was helpful or very helpful to them to practice the material.

Conclusion

So many students have come up to me and said, ‘Mr. Sutton, I’ve never really felt like I could do math before, but now I feel like I can do it.’

 

When Sutton set out to implement the adaptive features in MyLab Math, he did so with four goals in mind, all of which he accomplished based on course data and student feedback:

  1. Focus on deeper learning rather than broader coverage: Sutton achieved this by covering only the Pearson Learning Objectives most critical to mastering course SLOs rather than covering all the material in related textbook sections. Using MyLab allowed him to select which SLOs he wanted students to be assessed on and to what extent.
  2. Implement group work to allow students to teach each other: By focusing peer learning on topics students found most challenging, Sutton felt group work was more productive and powerful, and 73 percent of survey respondents agreed that group work was helpful or very helpful with practicing course material.
  3. Increase understanding by quizzing over material multiple times: 97 percent of responders to the voluntary student survey said that seeing the concepts more than once helped them learn the material better. In addition, final exam averages were one percentage point higher in his redesigned sections, which Sutton considers a substantial improvement because his final exam is now harder and he no longer curves it.
  4. Create a course that students enjoy and allows them to feel good about their math skills: Based on voluntary student feedback, the majority of students like the structure of the course and feel confident in math. Sutton reports, “So many students have come up to me and said, ‘Mr. Sutton, I’ve never really felt like I could do math before, but now I feel like I can do it.’ They really felt challenged by the mastery structure, but they realize that they really learned.”

Moving forward, Sutton plans to incorporate Learning Catalytics into his course to increase activity during lecture and monitor group work a little more. He would also like to try his MATH 143 structure in a class where there is a prerequisite, such as Precalculus Trig or Calculus 1.

 

Note 1. Overall course grade t-test results: the group that missed nine or fewer Quiz Me’s (M=86%, SD=7%, N=54) scored significantly higher than the group that missed more than nine (M=70%, SD=15%, N=27), t(79)=6.32, p<.05

Note 2. Final exam t-test results: the group that missed nine or fewer Quiz Me’s (M=77%, SD=14%, N=54) scored significantly higher than the group that missed more than nine (M=68%, SD=18%, N=27), t(79)=2.58, p<.05

Note 3. Midterm exam t-test results: the group that missed nine or fewer Quiz Me’s (M=82%, SD=13%, N=54) scored significantly higher than the group that missed more than nine (M=73%, SD=16%, N=27), t(79)=2.54, p<.05

Note 4. Average test score t-test results: the group that missed nine or fewer Quiz Me’s (M=89%, SD=6%, N=54) scored significantly higher than the group that missed more than nine (M=74%, SD=19%, N=27), t(79)=5.18, p<.05