MyMISLab educator study analyzes homework, quiz, and exam scores at University of South Florida

Key Findings

  • Data indicate strong, positive correlations in all course formats between MyMISLab homework and quiz scores, and between MyMISLab quiz and exam scores.
  • Students earning higher MyMISLab homework scores also earned substantially higher average quiz and exam scores.
  • Pooling and randomizing questions in MyMISLab tests helped the university maintain academic integrity in testing for all students, regardless of course format.

School name
University of South Florida, Tampa, FL

Course name
Information Systems in Organizations

Course format
Online, hybrid, and face-to-face

Course materials
MyMISLab; Using MIS by Kroenke

Timeframe
Spring 2016

Submitted by
Barbara Warner, Instructor and Master of Science MIS Admissions Director

Setting

  • Locale: large, urban, public, four-year university with main campus in Tampa
  • Enrollment: 42,000 students across three campuses (30,000 undergraduates)
  • Student-faculty ratio: 24:1
  • Freshman retention rate: 87 percent
  • Graduation rate (six-year): 64 percent
  • Gender: 55 percent female
  • Diversity: 38 percent minority

Challenges and Goals

USF had used MyMISLab for the Information Systems (IS) course several years earlier, but implementation of the program did not go as smoothly as planned. With so many students in the program, the need to minimize technology issues led to discontinuing its use in Fall 2012. However, when asked to standardize her campus’s course with others in the USF system that were offering the IS course online, Warner sought a digital component with academic integrity in testing at its core. The switch to online testing required a digital program with testing options like pooling and randomizing questions, enabling USF to securely test a very large number of students several times each semester. In Fall 2015, USF was offered the opportunity to pilot MyMISLab and to evaluate how online testing and homework could work in a safe, non-proctored environment. A fairly successful implementation led Warner to continue using the program in Spring 2016.

About the Course

Instructor Barbara Warner has been teaching for approximately 34 years, including the last 15 years at the University of South Florida, where she has been the only full-time instructor for the Information Systems in Organizations course for 13 years. Information Systems in Organizations is a one-semester, three-credit course enrolling approximately 1,500 junior and senior College of Business students per year; it is also open as an elective to other students interested in the use of technology within organizations. The course focuses on content in two areas: a survey of the language, concepts, structures, and processes involved in the management of information systems, and the use of business-based software for analytics supporting managerial decisions. Course learning outcomes include:

  • describing hardware, software, and mobile and network system components;
  • using and understanding the importance of database and analytical software;
  • identifying how information systems can be used in process management and systems development; and
  • discussing technology threats and safeguards, and the use of technology in the global workforce.

Implementation

MyMISLab is required; the program is used primarily by students working at home on a personal computer. Students use MyMISLab to understand content, apply principles to the real world, complete homework assignments, and take quizzes and exams. Warner’s goals for assigning work in MyMISLab are to teach new concepts, provide homework and practice opportunities, help students assess their own understanding of the course material and track their progress, and identify at-risk students. As the course instructor, Warner’s role is to assign content, homework, and assessments in MyMISLab and to provide support and remote monitoring to students using the program at home. Although students often work remotely, the course is not self-paced; defined due dates must be met.

Warner anticipates that students will spend 3–4 hours per week working in MyMISLab, not including time spent reading the eText. Warner’s students confirmed this on a voluntary, end-of-semester Spring 2016 survey (66 percent response rate): 48 percent of students said they spent 2–4 hours per week working in MyMISLab, and an additional 24 percent said they spent four or more hours in the program.

There are three sections offered most semesters, one in each course format: face-to-face, hybrid, and fully online. The face-to-face section has standard weekly meeting times where current events and real-world applications of the chapter content are introduced and discussed. Students bring their laptops to onsite meetings, and significant class time is available for students to work on the HTML, Excel, and Access projects that comprise 40 percent of the final course grade. The hybrid section meets only for project assistance. However, students from all sections can take advantage of any on-campus contact opportunities like office hours or lectures; Warner encourages students to be responsible for their own learning experience. Course structure remains the same regardless of course format. Students complete 3–4 weekly MyMISLab assignments per chapter, consisting of the following:

  • Chapter warm-up questions: questions that help students become familiar with chapter material and prepare them for lecture
  • Simulation exercises: branching, decision-making simulations that put Warner’s students in the role of a manager and ask them to make a series of decisions based on realistic business challenges
  • Video exercises: videos with follow-up questions that help students see chapter concepts in action; Warner only assigns these if they fit with her course learning objectives.
  • Dynamic Study Modules (DSM): questions that continuously assess student performance and activity, using data and analytics to provide personalized feedback that targets the individual student’s strengths and weaknesses in real-time. Warner assigns DSM to give students additional practice in the areas where they struggle the most.
  • Study Plan: students are directed toward the study plan, which monitors student performance on homework, quizzes, and tests and continuously makes recommendations based on that performance. It provides customized remediation activities (based on personal proficiencies, number of attempts, or question difficulty) to get students back on track.
  • Practice tests: Warner creates practice tests from the quiz questions and allows students to test their knowledge before each exam.

Students were asked about the variety of MyMISLab assignments on the end-of-semester survey:

  • 80 percent of students strongly agree or agree that the MyMISLab simulation exercises helped them see how to apply the chapter content to real-world situations.
  • 76 percent of students strongly agree or agree that the MyMISLab video exercises helped them see the chapter objectives in action.
  • 80 percent of students strongly agree or agree that the test-review-retest pattern of the DSM in MyMISLab helped them learn and remember chapter content.

For additional information on the types of exercises in MyMISLab and how to assign them, see Lesson 3.1 of the MyMISLab Implementation Guide.

After completing the MyMISLab homework assignments, students complete a weekly MyMISLab quiz. Warner intends for the pre-built quiz to be a capstone assessment of students’ knowledge and understanding of the chapter content. Quizzes consist of 25 pooled multiple-choice and true/false questions and are timed at 20 minutes. Warner allows two attempts in the event of technical issues, and the highest score is recorded in the gradebook.

Students also complete four exams in MyMISLab, covering approximately four chapters each, except the final exam, which is cumulative. Each exam consists of 40 pooled multiple-choice and true/false questions chosen from the Pearson test bank as well as questions created by Warner. Students have one attempt and 40 minutes to complete the exam once it has been opened. Students can choose between a morning and an evening exam session, and Warner is online during testing times to handle any technical issues that might arise. Warner uses the MyMISLab Email by Criteria function midway through the exam period to contact students who have not yet taken the exam, both as an alert regarding the exam time frame and to avoid multiple emails and requests on the exam due date. Exams cannot be completed late.
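
The pooling and randomization described above can be illustrated with a minimal sketch in Python (not MyMISLab’s actual implementation): questions are drawn at random from topic pools so that each student receives a different but comparable set of questions, presented in a randomized order. The topic pools and question IDs below are hypothetical.

    import random

    # Hypothetical topic pools; in practice each pool would hold many more questions.
    question_pools = {
        "hardware":  ["hw-01", "hw-02", "hw-03", "hw-04", "hw-05"],
        "databases": ["db-01", "db-02", "db-03", "db-04", "db-05"],
        "security":  ["sec-01", "sec-02", "sec-03", "sec-04", "sec-05"],
    }

    def build_exam(pools, per_pool=2, seed=None):
        """Draw per_pool questions from each pool, then shuffle the question order."""
        rng = random.Random(seed)
        exam = []
        for topic, questions in pools.items():
            exam.extend(rng.sample(questions, per_pool))
        rng.shuffle(exam)  # each student also sees the questions in a different order
        return exam

    # Each call produces a different draw; the seed here is only for reproducibility.
    print(build_exam(question_pools, per_pool=2, seed=42))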

Assessments

  • 40% Exams (four)
  • 40% Projects: HTML, Excel, Access
  • 10% MyMISLab quizzes (13)
  • 5% MyMISLab homework exercises
  • 5% Syllabus quiz/end-of-semester reflection
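
As a quick illustration of how these weights combine, the short Python sketch below computes a final course grade from the breakdown above; the component scores are hypothetical and used only for illustration.

    # Weights from the assessment breakdown above; scores are hypothetical.
    weights = {
        "exams": 0.40,
        "projects": 0.40,
        "mymislab_quizzes": 0.10,
        "mymislab_homework": 0.05,
        "syllabus_quiz_reflection": 0.05,
    }

    scores = {
        "exams": 82.0,
        "projects": 90.0,
        "mymislab_quizzes": 95.0,
        "mymislab_homework": 100.0,
        "syllabus_quiz_reflection": 100.0,
    }

    final_grade = sum(weights[c] * scores[c] for c in weights)
    print(f"Final course grade: {final_grade:.1f}%")  # 88.3% with these inputs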

Results and Data

Figures 1 and 2 are correlation graphs; correlation does not imply causation but instead measures the strength of a relationship between two variables, where r is the correlation coefficient. The closer the r value is to 1.0, the stronger the correlation. The corresponding p-value measures the statistical significance of the correlation: a p-value below .05 indicates that the observed positive correlation is very unlikely to have occurred by chance alone. (A short sketch of how such a correlation can be computed follows the list below.)

  • A strong positive correlation exists between average MyMISLab homework grades and average quiz grades, where r=.57 and p<.05 (online format).
  • A strong positive correlation exists between average MyMISLab quiz grades and average exam grades, where r=.64 and p<.05 (face-to-face format).
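
The sketch below shows how an r value and p-value of this kind can be computed, assuming a Pearson correlation (the usual reading of r); the paired score lists are hypothetical stand-ins for per-student homework and quiz averages, not the study’s data.

    from scipy.stats import pearsonr

    # Hypothetical per-student averages, for illustration only.
    avg_homework = [95, 88, 72, 99, 60, 85, 91, 78]
    avg_quiz     = [90, 84, 70, 96, 58, 80, 88, 75]

    r, p = pearsonr(avg_homework, avg_quiz)
    print(f"r = {r:.2f}, p = {p:.4f}")  # r near 1.0 indicates a strong positive relationship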

For students, the formative MyMISLab homework assignments are intended to help them gauge how prepared they are for the summative exams; performance on these assignments appears to be a possible leading indicator of course success (additional research is needed to develop and test this concept further). As a best practice, MyMISLab homework grades are intended to help Warner identify early on the students who are struggling and might be at risk of poor overall course performance.

Grade distribution data in terms of course success (students earning an A, B, or C) show that students in Warner’s face-to-face section who earned higher MyMISLab homework scores also earned higher average quiz and exam scores (figure 3).

  • Students who earned A, B, or C average quiz grades had average MyMISLab homework grades 25 percentage points higher than students who earned D or F quiz averages.
  • Students who earned A, B, or C average exam grades had average MyMISLab homework grades 35 percentage points higher than students who earned D or F exam averages.
  • Students earning an average quiz grade of A recorded an average MyMISLab homework score of 99 percent, and students earning an average exam grade of A recorded an average MyMISLab homework score of 91 percent.

Students in Warner’s online section were placed into two groups based on MyMISLab homework completion: those who completed all assignments and those who did not. Students who completed all assignments earned higher average quiz and exam scores than students who did not complete all MyMISLab assignments (figure 4); a sketch of this type of grouping comparison follows the list below.

  • Average number of MyMISLab homework assignments skipped: <1
  • Students who completed all MyMISLab homework assignments had average quiz grades 18 percentage points higher than students who did not complete all assignments.
  • Students who completed all MyMISLab homework assignments had average exam grades 5 percentage points higher than students who did not complete all assignments.
  • 76 percent of students completed all MyMISLab homework assignments.
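
A minimal sketch of this kind of completion-based comparison (using pandas on hypothetical records, not the study’s actual data) looks like the following:

    import pandas as pd

    # Hypothetical student records, for illustration only.
    students = pd.DataFrame({
        "completed_all_homework": [True, True, False, True, False, True],
        "avg_quiz": [92, 88, 70, 95, 74, 85],
        "avg_exam": [84, 80, 76, 90, 78, 82],
    })

    # Average quiz and exam grades for completers vs. non-completers.
    comparison = students.groupby("completed_all_homework")[["avg_quiz", "avg_exam"]].mean()
    print(comparison.round(1))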

Data analysis was completed for all three of Warner’s course formats (face-to-face, hybrid, and online), but not all of it could be included here due to space restrictions; results for all formats were positive and similar to those reported here. To read the complete data analysis, please contact candace.cooney@pearson.com.

Figure 1. Correlation between Average MyMISLab Homework Score and Average Quiz Score, Online Format, Spring 2016 (n=365)

Figure 2. Correlation between Average MyMISLab Quiz Score and Average Exam Score, Face-to-Face Format, Spring 2016 (n=163)

Figure 3. Relationship between Average MyMISLab Homework Score and Average Quiz and Exam Scores, Face-to-Face Format, Spring 2016 (n=163)

Figure 4. Relationship between MyMISLab Homework Completion and Average Quiz and Exam Scores, Online Format, Spring 2016 (n=365)

The Student Experience

Responses from the Spring 2016 end-of-semester, voluntary survey of Warner’s students (66 percent response rate) indicate that the majority of responding students recognize the value of MyMISLab.

  • 88 percent of students agree or strongly agree that their understanding of the course material increased as a result of using MyMISLab.
  • 80 percent of students agree or strongly agree that the use of MyMISLab positively impacted their exam scores.
  • 79 percent of students agree or strongly agree that they would recommend MyMISLab to another student taking this course.

Student survey responses to the question, “What did you like most about the Dynamic Study Modules?” include:

  • “The Dynamic Study Module assignments gave me an idea of what I needed to concentrate on.”
  • “I liked that I could click on two answers I think are correct and it would let me know if one of them was the right answer. It gave me more confidence when picking my answers and not to second guess myself.”
  • “I really liked that when I got a question wrong, it gave me a learning session to see the correct answer as well as why I was wrong. Then, it proceeded to give me a similar question later on so I could really nail down that topic.”
  • “I liked that they went back and reviewed information you didn’t know and explained it in further detail.”
  • “There was a lot of repetition that made me remember more than I thought I did!”

Student survey responses to the question, “What did you like most about MyMISLab?” include:

  • “I like how MyMISLab incorporates different methods to help me remember the material, such as with videos and real-life simulations.”
  • “I liked the homework and quizzes which allowed me to prepare for the exams. I also liked the amount of study materials available.”
  • “What I liked most about MyMISLab was how it split up the homework, from learning content to actually using the knowledge in a simulation.”
  • “I liked that there was more than one type of homework assignment, so I could practice in different ways.”
  • “All the information is located in a centralized place. It allows me to see the scores I have earned on each assignment in real time and it provides multiple activities per chapter to reinforce the material.”

Conclusion

Following a MyLab best practice, Warner’s MyMISLab homework assignments consisted of multiple problem types: multiple-choice chapter warm-ups, video exercises, and simulations, as well as the adaptive Dynamic Study Modules. By offering options for visual, verbal, aural, and logical learning styles, Warner enabled students to study using the techniques best suited to them. In fact, many students commented on Warner’s end-of-semester survey that they believe the variety of homework assignments helped them master the course content. Because maintaining academic integrity in testing was critical when choosing a digital component for this high-enrollment course, Warner also utilized MyMISLab’s pooling and randomization of test questions to create course assessments. This allowed for exams that challenged students and gave all of them a similar assessment experience, while creating a secure testing environment in which USF could feel confident.
