
Taylor’s University strengthens student engagement and performance through formative use of Smart Worksheets
At Taylor’s University, the School of Biosciences evaluated how LearnSci Smart Worksheets could support student engagement and conceptual understanding across two undergraduate modules. The focus was on moving beyond simple participation to better understand how students approached complex bioscience problems.
Two Smart Worksheets were introduced as compulsory formative activities following teaching. Using platform analytics, the team analysed completion patterns, raw and adjusted scores, autosolve usage, and score efficiency ratios to identify learning gaps and patterns in student reasoning.
The analytics revealed clear patterns of engagement and provided evidence of learning progression within activities. These insights informed targeted scaffolding strategies and enabled more responsive, evidence-informed teaching.
The Challenge:
Improving engagement and conceptual reasoning in formative bioscience learning
Bioscience education requires students to integrate conceptual reasoning, data interpretation, and procedural fluency. Even after exposure to theory, many students struggle to apply this knowledge in practice, particularly in areas such as biochemical testing and titration analysis.
Formative activities are commonly used to support this transition, but they often provide limited visibility into how students engage with challenging material. While educators may know whether students have completed an activity, it is harder to identify where misconceptions arise, which concepts require the most effort, and how effectively students convert attempts into correct understanding.
Dr Yin Sim Tor and colleagues in the School of Biosciences at Taylor’s University wanted to better understand how students were engaging with complex concepts and to identify learning gaps earlier in the process. Improving the quality of formative engagement while gaining actionable insight into student reasoning became a key priority.
This reflected a broader challenge within the modules: students could engage with theoretical content, but there was limited insight into how well they applied that knowledge in structured problem-solving tasks. Without this visibility, it was difficult to provide targeted scaffolding, reinforce difficult concepts, support different levels of learner progression, or identify where tasks exceeded students’ existing knowledge.

The Solution:
Embedding Smart Worksheets to provide formative practice and actionable learning analytics
LearnSci Smart Worksheets were introduced as compulsory formative activities embedded within two bioscience modules. The worksheets were aligned with module learning outcomes and positioned after relevant theoretical teaching, allowing students to apply conceptual knowledge through structured problem-solving tasks.
Students completed the Smart Worksheets independently and received immediate automated feedback on their responses. As the activities were self-marking, they provided consistent formative practice without increasing marking workload for academic staff.

Integrating Smart Worksheets within module delivery
Two Smart Worksheets were deployed within the curriculum: Introduction to Food Tests in a Year 1 Semester 1 bioscience module and Determine an Unknown Concentration of Acid by Titration in a Year 2 Semester 4 module. The activities were delivered through the learning platform after students had been introduced to the relevant theory.
The worksheets were designed to guide students through progressively more complex problems, supporting the development of conceptual reasoning and procedural understanding. Immediate feedback enabled students to identify and correct errors during the activity, reinforcing learning as it occurred.
Alongside the Smart Worksheets, LearnSci Analytics provided detailed metrics, including completion rates, raw and adjusted scores, autosolve usage, and score efficiency ratios. These metrics provided insight into how students engaged with different sections of the worksheets and highlighted areas where additional instructional support might be needed.
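LearnSci does not publish how these metrics are computed, but the idea of a score efficiency ratio can be sketched in a few lines. The following is a minimal illustration, assuming the ratio is read as "marks ultimately earned per attempt made"; the `SectionRecord` schema, field names, and formulas are hypothetical, not LearnSci's actual data model or API:

```python
from dataclasses import dataclass

@dataclass
class SectionRecord:
    """Per-section analytics for one student (illustrative schema,
    not LearnSci's actual data model)."""
    raw_score: float        # marks on the first attempt
    adjusted_score: float   # marks after feedback-driven retries
    attempts: int           # total submissions for the section
    autosolve_used: bool    # whether the student revealed the answer

def score_efficiency(rec: SectionRecord) -> float:
    """One plausible reading of a 'score efficiency ratio':
    marks ultimately earned per attempt made. Higher values
    suggest attempts are converted into correct answers more readily."""
    return rec.adjusted_score / rec.attempts if rec.attempts else 0.0

def autosolve_rate(records: list[SectionRecord]) -> float:
    """Fraction of section records where autosolve was used."""
    if not records:
        return 0.0
    return sum(r.autosolve_used for r in records) / len(records)

# Toy cohort for one worksheet section
cohort = [
    SectionRecord(raw_score=3, adjusted_score=5, attempts=4, autosolve_used=False),
    SectionRecord(raw_score=4, adjusted_score=5, attempts=2, autosolve_used=False),
    SectionRecord(raw_score=1, adjusted_score=4, attempts=5, autosolve_used=True),
]
mean_efficiency = sum(score_efficiency(r) for r in cohort) / len(cohort)
```

Tracking a section-level mean of this kind of ratio alongside autosolve rates is one simple way an analytics dashboard could separate sections where students improve through feedback from sections where they lean on support.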

The Results:
Up to 21%
learning gains demonstrated through Smart Worksheets
0–1%
autosolve usage across most worksheet sections
Consistent improvement in learning efficiency and clear identification of conceptual blockers
Smart Worksheet Analytics revealed clear patterns of engagement and learning progression across both modules. Feedback-driven improvement was visible within activities, while section-level analysis highlighted specific areas where students required additional support.
Learning efficiency improved through feedback
Analysis of the Introduction to Food Tests worksheet showed consistent improvement between raw and adjusted scores across sections. The score efficiency ratio increased in later sections, indicating that students were learning from feedback and becoming more efficient in converting attempts into correct answers.
Using analytics to inform targeted teaching and support
In the Determine an Unknown Concentration of Acid by Titration worksheet, one section showed lower performance alongside increased reliance on autosolve support. This revealed a specific reasoning challenge linked to titration curve interpretation and pKa-related calculations, enabling targeted instructional reinforcement.
The analytics also revealed broader patterns in how students engaged with the worksheets. In the Year 1 activity, students generally relied on their own reasoning even in more challenging sections, suggesting growing confidence in applying conceptual knowledge. In contrast, the Year 2 worksheet highlighted a specific area where students required additional support, indicating a gap between theoretical understanding and applied problem-solving.
By embedding LearnSci Smart Worksheets as analytics-enhanced formative checkpoints, Taylor’s University moved beyond measuring completion to analysing learning efficiency. The integration of raw and adjusted scores, autosolve data, and score efficiency ratios provided a multidimensional view of student reasoning.
A gradual decline in attempts in later sections suggested cognitive fatigue during extended problem solving. These insights informed adjustments to teaching and activity design, including additional scaffolding for complex topics, restructuring of later tasks, and the introduction of strategies to support sustained engagement.
The initiative demonstrated that formative analytics can surface hidden misconceptions, highlight cognitive fatigue, and distinguish between independence and support dependency. More importantly, it showed that when these insights inform targeted instructional refinement, they strengthen conceptual understanding and improve performance within and across modules.

For bioscience educators seeking to align instructional design with cognitive readiness, this case offers a clear model of how structured digital resources and learning analytics can support differentiated scaffolding, earlier intervention, and measurable gains in student understanding without increasing marking burden.