Customer Case Study: School of Education


Jeff Greene and Matt Bernacki are learning scientists in the UNC-Chapel Hill School of Education. They leverage the data that students generate as they use digital learning resources to help those students learn.

Jeff Greene

The Client

Greene and Bernacki lead a project funded by the National Science Foundation titled “Improving Undergraduate Student Success in Introductory STEM Courses Via Campus Data Systems and Targeted Support for Self-Regulated Learning.”

They collaborate with instructional experts in Biology (Kelly Hogan, Mara Evans and Alaina Garland) to provide a rich set of digital resources that create an “active learning” experience for the hundreds of students who enroll in Biology 101 each semester.

Their Education team (including graduate student Robert Plumley and postdoctoral researcher Mladen Rakovic) observes how students who volunteer their data use these resources and how they perform in the course. They then collaborate with a UNC-Chapel Hill Quantitative Psychology team (faculty members Katherine Gates, Abigail Panter and graduate student Chris Urban) to build algorithms to predict student achievement.

Once an algorithm can identify students likely to benefit from support, the team can reach out — before the first course exam — to provide learning support to students whose data suggests they are likely to perform poorly on it.

Matt Bernacki

The Challenge

Quality instructional design requires that instructors incorporate many tools to make learning active, engaging and dynamic. This often means that students engage with multiple technology platforms as they learn biology concepts in their course.

To deliver this kind of experience to students, the team of Biology 101 instructors hosts key course documents, including syllabi, lecture notes and guided reading questions, on a Sakai learning management system course site.

In class, students use a formative assessment tool called Learning Catalytics to check their knowledge and focus their discussions with classmates. Outside of class, instructors link students to a digital textbook where they read content and complete required online homework assignments to check their knowledge. Students can also access enrichment resources, including videos and simulations provided by the publisher. Instructors also provide a discussion platform on Piazza where students can discuss course topics with peers and instructors and view others’ threads.

To further support students’ learning, instructors use CourseCare (developed at UNC-Chapel Hill by Kris Jordan of the Department of Computer Science) to enable students to seek help from peer mentors and schedule attendance at review and supplemental instruction sessions.

Each time a student uses a digital resource, the student creates a “learning event” that can inform the algorithms that determine how learning support is provided. When students use many different tools, these learning events are logged in many different places.

Uniting the different logs of student activity from half a dozen tools presents a monumental challenge. Data need to be acquired from each site, formatted in ways that enable them to be connected, and then maintained in a single environment where they can be organized and refined into traces of student learning events that predict achievement.
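The unification step described above can be sketched in miniature. The code below is purely illustrative: the tool names, field names, and event formats are invented for the example (the article does not describe the actual schemas), but it shows the core idea of mapping tool-specific records onto one shared schema keyed by a common student identifier, then merging them into a single chronological log of learning events.

```python
from datetime import datetime

# Hypothetical raw event logs, one list per tool. Field names and
# timestamp formats differ across sources, as they would in practice.
sakai_events = [
    {"user": "student42", "ts": "2019-01-14T09:30:00", "page": "syllabus"},
]
piazza_events = [
    {"uid": "student42", "time": 1547462400, "action": "view_thread"},
]

def normalize(event, source):
    """Map a tool-specific record onto one shared schema."""
    if source == "sakai":
        return {
            "student_id": event["user"],
            "timestamp": datetime.fromisoformat(event["ts"]),
            "tool": "sakai",
            "event": f"open:{event['page']}",
        }
    if source == "piazza":
        return {
            "student_id": event["uid"],
            "timestamp": datetime.fromtimestamp(event["time"]),
            "tool": "piazza",
            "event": event["action"],
        }
    raise ValueError(f"unknown source: {source}")

# Merge all sources into one chronologically ordered learning-event log.
unified = sorted(
    [normalize(e, "sakai") for e in sakai_events]
    + [normalize(e, "piazza") for e in piazza_events],
    key=lambda e: e["timestamp"],
)
```

In the real project, a platform like Splunk handles this normalization and aggregation at scale, but the shape of the problem is the same: per-tool formats in, one queryable event stream out.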

The Solution

The ITS Middleware Services group, managed by Patrick Casey, provides the expertise necessary to solve the Education Team’s data modeling challenge.

Dave Safian

Led by Dave Safian, Middleware Services employees developed a data model using Splunk — a search and aggregation engine designed to handle diverse, often unstructured, logs of data — to organize incoming data into a robust learning analytics platform. The platform supports educational data mining analyses that produce algorithms to identify students who need support before they begin to fail exams.

“Splunk is a powerful analytics platform that excels at combining structured and unstructured data, which helps us better understand how disparate systems can be combined into a single view,” said Dave Safian of Middleware Services. “Splunk has been an essential part of reporting for IT operations and IT security on our campus for many years. The student success project uses similar types of operational data but looks at it in a different way. It shows how we can aggregate data from several systems and leverage it to improve educational outcomes for students.”

Safian added, “I think we are just scratching the surface of what’s possible in understanding how our students interact with technology on campus. We are eager to continue to partner with campus units such as the School of Education.”

The Results

Using data collected from roughly 300 Biology 101 students in spring 2019 and the Splunk solution developed by Middleware Services, the Education and Quantitative Psychology teams developed an algorithm that successfully identifies 74% of the students who will earn a C or worse on the course final exam, using only the first three weeks of students’ learning event data.

The prediction model is even more accurate for first generation college students (77%) and those from ethnic groups who are historically underrepresented in STEM professions (80%).
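The figures above describe the model’s recall: of the students who actually went on to earn a C or worse, the share the model flagged ahead of time. As an illustration only — the data, the activity features, and the threshold rule below are all invented, and nothing here reflects the team’s actual model — a minimal sketch of how such a recall figure is computed:

```python
import random

random.seed(0)

# Synthetic cohort: early-semester activity counts per student, plus a
# noisy "earned C or worse" outcome loosely tied to that activity.
students = []
for _ in range(300):
    logins = random.randint(0, 30)
    homework = random.randint(0, 10)
    noisy_score = logins + 2 * homework + random.gauss(0, 5)
    students.append({
        "logins": logins,
        "homework": homework,
        "c_or_worse": noisy_score < 25,  # synthetic ground truth
    })

def predict_at_risk(s, threshold=25):
    """Flag students whose early activity falls below a threshold."""
    return s["logins"] + 2 * s["homework"] < threshold

# Recall: flagged-and-truly-at-risk students / all truly-at-risk students.
true_positives = sum(
    1 for s in students if s["c_or_worse"] and predict_at_risk(s)
)
actual_at_risk = sum(1 for s in students if s["c_or_worse"])
recall = true_positives / actual_at_risk
```

A recall of 0.74 on this metric would mean the rule catches about three in four of the students who will ultimately struggle, which is the sense in which the article’s percentages should be read.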

The Future

This fall, the Education team applied its algorithm to the data of students in Biology 101 who opted in to receive feedback on their learning.

Based on the prediction model, students received a check-in 10 days before the exam, and those who appeared likely to perform poorly on future exams were given advice from students who succeeded in the course in prior semesters. The advice included study strategies those students had used to excel in the course, strategies that align with principles of learning from educational research.

In coming months, the Education team will examine whether students who received this advice adopted behaviors associated with better learning outcomes, and whether they were able to outperform peers on their Biology 101 exams.
