Straightforward Ways You Can Use Data to Improve Learning Outcomes

How to practically use adaptive learning to keep people engaged

Data is an incredibly valuable resource, providing insights on student performance, assessment effectiveness, and improvement over time. 

And, as EdTech industry expert Mariana Aguilar acknowledges:

‘Data can be one of the most powerful resources that educators have at their fingertips to accelerate learning, and so we need to treat that data with care and respect as we leverage it to drive positive learning outcomes’.

While almost all education publishers acknowledge this, many face challenges in building the right storage and preparatory systems to enable effective data use. 

In this article, we’ll explain how you can lay the foundation for streamlined data processing and analysis — and outline 5 straightforward ways you can then employ this data to improve learning outcomes.

Suggested reading: Read our case study, ‘Developing a Custom Assessment Engine and Platform for Optimised Learning Outcomes’, to discover how Explore Learning utilised data to streamline their online assessments. 

Laying the foundation for effective data use

To make the most of your data, you need to be able to store, access and analyse it. This could be difficult if you:

  • Currently use a Software-as-a-Service solution that doesn’t give you full control of or access to your data
  • Have data that is siloed across numerous legacy applications or technology
  • Have limited in-house technical expertise

To lay the foundation for effective data use, you will need to unify your data in a single, secure location. This is typically known as a ‘data lake’ approach, and is the first stepping stone towards implementing innovative AI, NLP and ML technologies.

Talk Think Do works alongside educational institutions to assess the performance and potential limitations of their data storage and processing platforms. Based on our experience, it’s wise to undertake a thorough discovery and technical review process before making any significant changes to your system.

Pro tip: Read our article, ‘Power Apps vs. SaaS vs. Custom Build: Which Solution is Best For Your Business?’, to discover your best-fit data storage and management solution.

What does the ‘data lake’ approach involve?

Data lakes are used to store large quantities of unstructured data until it is required for analysis or AI application. Raw data can be ‘filtered’ into the lake to ensure that only relevant data ends up in the system — meaning that analysts don’t have to worry about low-quality or erroneous data.

Data lakes can also be refined to produce ‘data lakehouses’, which combine the indexing and analytics benefits of data warehouses with the vast storage capacity of data lakes.

Establishing a data lakehouse typically involves four key steps:

  1. Implementing automated processes: Ensuring that all data is written to a data lake in semi-real time. This provides a well-structured feed that is easily consumable by analytics tools. With Learnosity systems, for example, a data lake might include highly detailed content data and information from the solutions database. Learn more about Learnosity on our partner page.
  2. Consolidating storage: Sometimes, the need for raw data is not identified until months or years into the future. Streamlining data into a low-cost storage solution lets you keep hold of valuable data that could inform future practices. 
  3. Consuming the data: You can choose to either consume data directly using tools like PowerBI, or use Apache Spark or Azure Data Factory to analyse, transform and load data into aggregated datasets. 
  4. Monitoring changes: By collecting fine-grained data around all changes to the system, you can identify key development opportunities. 
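As an illustrative sketch of step 1, the example below lands raw assessment events in a date-partitioned folder layout. Local JSON-lines files stand in for cloud storage here, and the event shape is an assumption — a production system would write to Azure storage via an automated pipeline rather than local disk:

```python
import json
import os
from datetime import datetime, timezone

def land_event(event: dict, lake_root: str) -> str:
    """Append a raw assessment event to a date-partitioned data-lake path.

    Partitioning by event date keeps the lake cheap to store and easy for
    tools like Apache Spark or Azure Data Factory to scan selectively.
    """
    ts = datetime.now(timezone.utc)
    partition = os.path.join(
        lake_root, f"year={ts.year}", f"month={ts.month:02d}", f"day={ts.day:02d}"
    )
    os.makedirs(partition, exist_ok=True)
    path = os.path.join(partition, "events.jsonl")
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")
    return path

# Example: land one answer event in a temporary lake root
import tempfile
lake_root = tempfile.mkdtemp()
path = land_event({"student": "s1", "question": "q42", "correct": True}, lake_root)
```

Keeping events in an append-only, partitioned feed like this is what makes them "easily consumable" later: analytics tools can read only the partitions they need.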

Ensuring your data is filtered and stored in the right way is crucial. Without a secure foundation from which to grow, you won’t be able to effectively utilise your data or make the most of current AI developments.

5 ways to utilise your data

Once your data is being stored and filtered appropriately, you can start using it to improve learning outcomes. 

Keep in mind that ‘data’ can include anything from the number of assessment questions a student answers correctly to the time spent on each webpage — and will therefore have a huge variety of use cases and applications.

That being said, here are 5 of the top use cases for the education publishing industry:

  1. Review the fairness of assessments

Data such as answer rates, learner age, and dwell time can be fed into analysis tools to identify if a certain demographic or group of learners is continually getting a question wrong. This could indicate that:

  • There is an error with the question
  • The format of the assessment is not accessible to all learners
  • A specific question is not accessible due to confusing language or images

Once you have pooled the relevant information in your data lake to identify issues in the content, you can start to make changes accordingly.
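As a minimal sketch of this kind of fairness check (the tuple-based input format is an assumption), the function below compares each group's correct-answer rate on a question against the overall rate and flags large gaps:

```python
from collections import defaultdict

def flag_unfair_questions(responses, gap=0.25):
    """Flag questions where one learner group's correct rate trails the
    overall rate by more than `gap` — a hint that the wording, format,
    or imagery may not be accessible to that group.

    `responses` is an iterable of (question_id, group, correct) tuples.
    Returns a list of (question_id, group) pairs worth reviewing.
    """
    totals = defaultdict(lambda: [0, 0])      # question -> [correct, seen]
    by_group = defaultdict(lambda: [0, 0])    # (question, group) -> [correct, seen]
    for q, g, ok in responses:
        totals[q][0] += int(ok); totals[q][1] += 1
        by_group[(q, g)][0] += int(ok); by_group[(q, g)][1] += 1

    flags = []
    for (q, g), (correct, seen) in by_group.items():
        overall = totals[q][0] / totals[q][1]
        if seen and overall - (correct / seen) > gap:
            flags.append((q, g))
    return flags
```

A real pipeline would read these tuples out of the data lake and feed the flagged questions to content reviewers, but the core comparison is no more complicated than this.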

This is where AI can make a phenomenal difference. And thanks to recent advances in large language models, it’s easy for you to improve the accessibility of your content even with limited assessment data. For example, you could implement:

  • Audio prompts: Supporting students with learning difficulties or visual impairments.
  • UX / UI adjustments: To provide clean, simple visuals or engaging, bright designs, depending on the individual user’s learning needs.
  • Transcription and translation: Translate or transcribe assessments in real time as they are delivered to students.
  2. Optimise the review process

Human analysts can only do so much when it comes to reviewing assessments. Due to the scale and complexity of data produced on a daily basis, most analysts end up making decisions based on a select portion of data including final grades and pass rates.

This can produce misleading results that are too reductive in their focus. Using AI tools to analyse questions and answers can help you gauge the quality of your assessments.

If a disproportionate number of students are getting a question wrong, AI can suggest:

  • Rewording the question to use simpler language or provide additional context.
  • Adjusting question distractors. Distractors, the incorrect options in multiple-choice assessments, may need adjusting to be more precise or plausible. 
  • Changing the difficulty to ensure students are presented with questions appropriate to their learning level. Use LLMs to run controlled pre-trials that can efficiently identify the difficulty of assessments and their suitability to different abilities, ages, and cultures.
  • Delivering supporting content such as hints or further information on the topic.
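One lightweight way to wire this up is to assemble a review prompt from the assessment data and hand it to an LLM. The sketch below shows only the prompt assembly; the model call itself (for example via an OpenAI or Azure OpenAI client) is omitted, and the prompt wording is illustrative rather than prescriptive:

```python
def build_review_prompt(question: str, distractors: list, wrong_rate: float) -> str:
    """Assemble an LLM prompt requesting the four improvements above:
    simpler wording, better distractors, difficulty tuning, and a hint.
    """
    return (
        f"{wrong_rate:.0%} of students answered this question incorrectly.\n"
        f"Question: {question}\n"
        f"Distractors: {', '.join(distractors)}\n"
        "Suggest: (1) a simpler rewording with added context, "
        "(2) more precise and plausible distractors, "
        "(3) an easier variant of the question, and "
        "(4) a short hint to show struggling students."
    )

prompt = build_review_prompt("What is 7 x 8?", ["54", "63", "48"], 0.42)
```

Because the prompt carries the answer-rate data alongside the content, the model's suggestions are grounded in how students actually performed rather than in the question text alone.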

Using AI in the review process also allows you to demonstrate and justify the success of your learning system based on how quickly students are improving compared with typical paper-based assessments.

  3. Identify students who are struggling

Assessment data lets you see who is struggling and why. If a dip in performance isn't a consistent issue across your platform, you'll know that an individual student may need extra support and can guide them to improve.

This is especially important considering the increased popularity of remote learning. While studying online is hugely beneficial in making education accessible to a wider range of students, it can:

  • Reduce engagement levels
  • Make students feel isolated
  • And make tracking performance harder

Being able to identify students who are struggling early on helps to ensure they are sufficiently supported and key learning objectives can be maintained.
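A simple early-warning check along these lines compares each student's correct-answer rate against the cohort average. This is a sketch under assumed data shapes (student id mapped to correct/attempted counts), not a complete intervention system:

```python
def flag_struggling(scores, margin=0.2):
    """Flag students whose correct-answer rate sits more than `margin`
    below the cohort average — candidates for early extra support.

    `scores` maps student id -> (correct, attempted).
    """
    rates = {s: c / n for s, (c, n) in scores.items() if n}
    cohort_avg = sum(rates.values()) / len(rates)
    return sorted(s for s, r in rates.items() if cohort_avg - r > margin)
```

Comparing against the cohort, rather than an absolute pass mark, helps separate an individual who needs support from a question or topic that everyone finds hard.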

  4. Create adaptive assessments

Aggregate trends related to learning outcomes and use this data to iteratively improve assessments and create more tailored content. Focus particularly on:

  • Learning pathway engagement (identifying the most and least successful pathways)
  • Topic interdependency and correlative answers
  • Topics getting repeat incorrect answers
  • Student performance over time
  • Questions per minute
  • Correct answer rates

This will feed into your adaptive learning systems, providing assessments that are:

  • Engaging: You can deliver easier or harder questions depending on performance to ensure students don’t get dismayed or demotivated.
  • Educational: If a student has repeatedly struggled with a single topic or question, your system can automatically deliver supporting content to aid their learning. A good adaptive learning platform will provide a balance of new topics and topic reinforcement questions to maintain a consistent difficulty level. 
  • Innovative: You can use an adaptive, data-driven learning engine to explore new methods of question delivery and A/B test their effectiveness.
  • Self-regulating: Systems should be able to assess the list of questions answered and establish whether the user is likely to be able to answer the next question correctly, or whether they should return to more basic topics for reinforcement.
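The core of the self-regulating behaviour above can be sketched as a simple difficulty controller. The thresholds and window size here are illustrative assumptions; a production engine would tune them per subject and learner:

```python
def next_difficulty(recent_results, current, step=1, window=5):
    """Pick the next question difficulty from recent answers (True/False).

    Raise difficulty after a strong run, lower it after a weak one, and
    hold steady otherwise — keeping learners challenged but not dismayed.
    """
    sample = recent_results[-window:]
    if not sample:
        return current
    rate = sum(sample) / len(sample)
    if rate >= 0.8:
        return current + step
    if rate <= 0.4:
        return max(1, current - step)
    return current
```

The same signal can trigger reinforcement content: when the controller steps difficulty down, the platform can serve supporting material on the topic before the next question.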
  5. Use foundational AI models to enable rapid content creation

Adaptive learning and data-driven assessments inevitably require larger amounts of content to enable students to follow a range of different learning pathways. 

Use foundational LLMs like GPT-4 to develop content based on the information already on your systems. Typically, content generation through foundational models will involve using an SDK like Microsoft Semantic Kernel to perform:

  • Content tokenisation 
  • Data grounding
  • Retrieval augmented generation
  • Prompt engineering 
  • Prompt orchestration
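To make the retrieval-augmented generation step concrete, here is a deliberately minimal sketch: word-overlap ranking stands in for the vector search a production pipeline (or an SDK like Semantic Kernel) would provide, and the prompt wording is an assumption:

```python
def retrieve(query: str, passages: list, k: int = 2) -> list:
    """Rank stored content by word overlap with the query — a simple
    stand-in for vector search over your existing content."""
    q_words = set(query.lower().split())
    return sorted(
        passages,
        key=lambda p: len(q_words & set(p.lower().split())),
        reverse=True,
    )[:k]

def grounded_prompt(query: str, passages: list) -> str:
    """Ground the generation step in retrieved passages so the model
    writes new questions from material already on your systems."""
    context = "\n".join(retrieve(query, passages))
    return (
        f"Using only this source material:\n{context}\n\n"
        f"Write a new practice question about: {query}"
    )
```

Grounding the model in retrieved passages is what keeps generated questions consistent with your published curriculum, rather than whatever the base model happens to know.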

As opposed to the more traditional method of using ML to train specific AI models, using foundational models lets you achieve a high-quality output quickly.

How Explore Learning utilised their data to power a custom adaptive learning engine

Explore Learning is a private tutoring provider supporting children aged 4-16. They wanted to improve learning outcomes for their students, and approached Talk Think Do with three key goals:

  • Gain control over their assessment data, which was previously held by a third-party SaaS company
  • Implement a custom adaptive learning system
  • Monitor and regulate assessments according to student performance

Talk Think Do rapidly built and iterated a custom adaptive learning engine and platform, designed to give Explore Learning improved control over all of their assessments. This was optimised to ensure it was accessible, creative and engaging for students.

Explore Learning’s new adaptive assessment engine helps students to make 19 months of progress in a single year.

Suggested reading: Read the full case study, ‘Developing a Custom Assessment Engine and Platform for Optimised Learning Outcomes’, to discover more about how digital innovation benefitted Explore Learning.

Improving learning outcomes through innovation

Student learning is driven by innovation. And, in 2023, this innovation is driven by data. Use the information already stored on your systems to:

  • Build more effective learning pathways
  • Generate more content for adaptive learning
  • And provide more accurate performance reviews

Talk Think Do has extensive experience developing and implementing complex education publishing software. We follow a cloud-native, DevOps-driven approach that allows us to deliver secure, reliable and customised results.

Our services include cloud application development, DevOps implementation, OpenAI implementation, and managed support. 

To learn more about how we can help your business, book a free consultation with us today.

