Wednesday, 6 August 2025

Unlocking the ROI of Learning

Or how to free your data from the LMS

Relevant to: HR, L&D, Operations, Finance

You’re under pressure to show the impact of learning. You need to prove that training links to performance, that learning programmes improve retention, and that investment in learning is moving the profitability needle.

But when you go looking for answers, all roads lead to your learning management system (LMS) — and stop there.

You know the data is in there somewhere. Completion rates, assessment scores, time spent learning, department-level engagement — all the raw material for powerful insights. But what you get instead are clunky exports, static reports, and dashboards that don’t speak the language of the business.

If this sounds familiar, you're not alone.

You’ve got the data!

You know you’ve got the data; that’s why you have an LMS. The bad news is that it’s not always easy to get at, particularly if your LMS is delivered as software as a service (SaaS).

If you have found your valuable learning data locked in your LMS, I feel your pain. This wasn’t what you were expecting.

Using your data

There are different ways to use the data stored in your LMS. For many, the standard reports are more than enough. But if you want to analyse learning data alongside other data, or do more in-depth analysis, you have to get the data out of your LMS and into a reporting data store. Here’s how that might work:

  1. API extraction – most modern LMS platforms provide RESTful APIs. With proper authentication (usually via OAuth 2.0 or an API key) you can programmatically extract learning records.
  2. Data pipeline & transformation – data is ingested into a reporting database such as Azure SQL Database, where it's cleaned, normalized, and enriched with metadata.
  3. Semantic modelling – Using tools like Power BI, you can then build a semantic layer that defines business terms — e.g., "active learner", "average time to completion", "learning impact score".
  4. Dynamic dashboards – These models power interactive visuals, filterable by time, team, location, or training programme — and update in real time or on a schedule.
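As a sketch of steps 1 and 2, here’s roughly what extraction and light clean-up might look like in Python. The endpoint, token, and field names are hypothetical; your LMS’s API will differ.

```python
import pandas as pd

# Step 1 -- API extraction (endpoint, auth, and fields are hypothetical):
#   resp = requests.get(
#       "https://your-lms.example.com/api/v1/learning-records",
#       headers={"Authorization": f"Bearer {token}"},
#   )
#   records = resp.json()["records"]
# For illustration, assume the API returned these records:
records = [
    {"learner": "A001", "course": "GDPR Basics", "status": "completed", "score": "87"},
    {"learner": "A002", "course": "GDPR Basics", "status": "in_progress", "score": None},
]

# Step 2 -- load and clean before writing to the reporting database
df = pd.DataFrame(records)
df["score"] = pd.to_numeric(df["score"], errors="coerce")  # enforce a numeric type
completed = df[df["status"] == "completed"]                # one possible reporting view
```

From here, `df` would be written to the reporting database, ready for the semantic model in step 3.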

If you are wondering why you need to store, check, and clean data in yet another data store, I talk about that here: Can you trust your data? The bottom line is that your data needs to be clean, up to date, and readily available for analytics.

Power BI is L&D’s new best friend

With this approach, you’re no longer limited by your LMS’s front end. You get full control of your learning data — and the power to connect it to performance or finance data for deeper insights.

So, although you may sometimes feel that your data is locked in your LMS, there are ways to get at it and analyse it in friendly tools like Power BI.

Here at Anatec AI we’ve been working with data, interfaces, APIs, and learning systems for many years, so we are well placed to help if you need it. We can build Power BI dashboards and scorecards, write DAX queries, and design semantic models. There’s nothing we’d like more.

If you have a question or want to chat about any challenges you’re facing, get in touch.

Keywords: learning analytics, data analytics, learning management system, LMS, Power BI, L&D dashboard, L&D scorecard

Tuesday, 5 August 2025

Can you trust your data?

If you’re relying on data to make decisions, here’s a question for you:

Can you actually trust your data?

Bad data leads to bad decisions. And if you’ve ever tried to build a Power BI dashboard using unprepared data… you know what I'm talking about. That’s because good analytics starts before the dashboard—with trustworthy data.

So, let’s talk about what makes data trustworthy—and how to get there.

The 3 C’s of trustworthy data

To be useful, your data needs to be:

Correct

Current

Constant

These three qualities are the backbone of any solid reporting system. Miss one, and your insights could be misleading at best—or flat-out wrong.

1. Correct data: accuracy isn’t optional

Data errors creep in more easily than you think. A common name mix-up might credit someone with attending a course they never showed up for. Or a stock item might never be recorded because “we were going to use it straight away.”

Sound familiar?

Then you’ve got things like:

Null values

Duplicates

Outliers

Inconsistent fields between systems

All of these distort the truth your analytics are supposed to reveal. Cleaning and validating your data isn't optional—it's foundational.
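These checks are easy to automate. A minimal sketch in Python with pandas; the column names and the 24-hour plausibility rule are invented for illustration:

```python
import pandas as pd

df = pd.DataFrame({
    "learner": ["A001", "A001", "A002", "A003"],
    "hours":   [2.0,    2.0,    None,   250.0],  # None = missing; 250 looks suspicious
})

nulls = int(df["hours"].isna().sum())    # missing values
dupes = int(df.duplicated().sum())       # exact duplicate rows
outliers = df[df["hours"] > 24]          # crude plausibility rule: > 24h on one course
```

Even a handful of rules like these, run before each refresh, catches most of the errors listed above.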

Question for you:

What’s the most unexpected data error you've ever uncovered?

2. Current data: how fresh is “fresh enough”?

Everyone has a different definition of “up-to-date.”

For a factory floor, real-time data might be essential.

For HR reports, yesterday’s numbers might do just fine.

But what matters most is transparency—do you know how current your data is, and can you trust that timestamp?
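One way to make that transparency concrete is to store a refresh timestamp with the dataset and test it against an agreed threshold. A small Python sketch; the 24-hour default is just an example of what "fresh enough" might mean for you:

```python
from datetime import datetime, timedelta, timezone

def is_fresh(last_refresh, now, max_age=timedelta(hours=24)):
    """Data counts as 'fresh' if it was refreshed within max_age of now."""
    return (now - last_refresh) <= max_age

now = datetime(2025, 8, 5, 9, 0, tzinfo=timezone.utc)
refreshed = datetime(2025, 8, 5, 6, 0, tzinfo=timezone.utc)  # stored with the dataset

fresh_24h = is_fresh(refreshed, now)                       # fine for an HR report
fresh_2h = is_fresh(refreshed, now, timedelta(hours=2))    # too stale for a factory floor
```

The point is not the threshold itself but that it is explicit, agreed, and checked.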

3. Constant data: reliable, available, and secure

Once your data is cleaned and verified, you need to keep it:

Securely stored

Regularly backed up

Available wherever it’s needed

You don’t want your cleaned dataset disappearing on a lost laptop or overwritten by mistake. Constancy means your data is dependable and accessible, day in and day out.

Choosing the right data platform: why Azure?

The Azure data platform gives you flexible, scalable ways to store your analytics-ready data, depending on your requirements:

Azure SQL Database

Great for datasets up to a few terabytes

Geo-redundant and cost-effective

Easily scalable up or down

Supports Medallion architecture using schemas or databases

(If you’re curious about that approach, I wrote more about it here.)

Microsoft Fabric

Ideal for high-performance analytics at scale

Better suited for large volumes of data

Higher cost, but better performance

Also supports the Medallion architecture.

Once your data is in the cloud, everything gets easier—from sharing semantic models to boosting Power BI performance.

Don’t skip the foundations

Data visualisation tools like Power BI are only as strong as the data underneath them. Trustworthy data isn't just clean—it's correct, current, and constant.

So, here's a challenge:

What’s your biggest headache when it comes to data quality or reporting?

Drop it in the comments—I’d love to hear what you're wrestling with (and maybe swap ideas on how to fix it). And as always, if you want to talk about data quality, you can get in touch here.


Anatec AI has worked with data quality issues for many years. We focus on helping companies make better use of their data to improve their performance and resilience.

Keywords: reporting, analytics, data quality, Power BI, Microsoft Fabric, dashboard design


Tuesday, 29 July 2025

Learning data is valuable data

Learning data may be the most underused strategic asset in your company. The world’s most powerful companies are increasingly evidence-led, yet learning and development (L&D) is often under-represented when it comes to data analytics. It has never been more important to nurture and retain good people. AI tools such as ChatGPT can assist skilled and knowledgeable people, but they don’t replace them.

Blended learning - messy data

The move from classroom training to online or blended learning has reduced costs and means less time away from work. But it has made learning a lot harder to track. Micro-learning results can be difficult to evaluate when you want a strategic view of your learning. E-learning, YouTube, and MOOC courses are all more complex than pure classroom training when it comes to managing the data.

The rise of the learning management system

Learning management systems (LMS) make it possible to centralise learning data. An LMS not only enables you to distribute your own learning content but also to track other learning in different formats. The only challenge is to turn this valuable asset into actionable insights.

LMS data and Power BI 

Whilst every LMS has a variety of reports, there’s no replacement for the ability to interrogate data in analytics tools such as Microsoft Power BI. The ability to visualise the data in different ways, add multiple “slicers”, and use AI to find patterns and trends is tremendously powerful. Plus, you can match your learning data with data from other sources, such as performance data and HR retention data. This flexibility can unlock real insight into what your learning programmes are achieving. Power BI can help you make sense of a more complicated world of learning, massively improving your chances of finding patterns amongst the noise.
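For instance, once LMS and HR extracts sit in a common store, joining completions to retention data takes only a few lines. A hedged Python sketch with invented column names and figures:

```python
import pandas as pd

# Hypothetical extracts: one from the LMS, one from HR
learning = pd.DataFrame({
    "employee_id":       [1, 2, 3, 4],
    "courses_completed": [5, 1, 3, 0],
})
hr = pd.DataFrame({
    "employee_id":      [1, 2, 3, 4],
    "left_within_year": [False, True, False, True],
})

combined = learning.merge(hr, on="employee_id")
# Average completions for leavers vs stayers
avg_by_retention = combined.groupby("left_within_year")["courses_completed"].mean()
```

In Power BI the same join would be a relationship in the semantic model, with the comparison built as a visual.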

How Anatec AI can help

If you want to do more with your LMS data and are stuck somewhere along the way – we can help. There are multiple ways of getting data from your LMS and into Power BI, depending on your needs. We can design data interfaces and help you choose the right way to store reporting data. We build semantic models that focus on your business needs and can help you make better use of Power BI AI capabilities.

Whether you’ve got a simple question, or an analytics project you need help with, get in touch to see if we can help. We’d love to hear from you. 


Anatec AI has worked in the learning and development space for many years. We designed and built two custom learning systems before LMSs were even a thing. Now we focus on helping companies make better use of their data to improve their performance and resilience.

Keywords: learning analytics, LMS data, Power BI for L&D, evaluating learning impact

Tuesday, 10 June 2025

Real-time intelligence: why speed matters more than ever

How real-time analytics is reshaping decision-making, competitiveness, and customer experience

Modern tools like Microsoft Power BI have revolutionised the way we analyse and visualise data. So much so that we’ve almost forgotten that we are looking at historical data that can be days, weeks, or even months old. After years of having no alternative, we’ve become desensitised to the delay between data collection and making sense of what it means.

But in today’s fast-paced and data-driven world, decisions are only as good as the data they are based on. That means that reducing data latency is more important than ever. When it comes to data, speed really does matter.

Having real-time or near real-time data has become a source of competitive advantage, a way of improving products and services, and a way of staying relevant. Data latency is no longer just an IT issue, it is now firmly on the boardroom agenda.

One company that has transformed its product offering with real-time analytics is The New York Times. By migrating to the cloud and getting real-time data about how its readers behave, it has improved its content and tailored it to user preferences. The result has been greater reader loyalty and greater relevance in an industry that has had to make huge adjustments from print to digital. Whilst The New York Times is a multi-billion-dollar business, the principles it is using are universal.

In the past there were good reasons for analysing data in batches: processing streaming data was complex and expensive, in-memory storage didn’t exist, and cloud computing was less mature. But these technical barriers have now gone. And new technologies like Microsoft Fabric make it much easier for businesses to benefit from real-time data analytics. Microsoft Azure enables businesses to migrate to the cloud incrementally, delivering business value at every step.

If you’re exploring how to reduce data latency in your business — or how to build real-time dashboards from streaming data — we’d love to help. We are experienced developers for the Azure data platform and can turn ideas into robust data solutions for a range of needs. Why not get in touch for a no-obligation exploratory chat? It can only speed things up…

Monday, 5 May 2025

How to use Microsoft Power BI semantic models to analyse data from multiple sources

Unlock deeper data insights with Microsoft Power BI semantic models

Microsoft Power BI is a game-changer when it comes to analysing departmental data. One of its great strengths is that it enables you to analyse data from more than one source. That means you can bring in data from, say, an Excel spreadsheet, an Access database, and a line-of-business application, and analyse it as if it were just one data file.

Working with more than one data source enables you to unlock more value from your data. It also enables you to use powerful AI visuals such as Power BI’s Key Influencers, which needs a wide range of different attributes to give good results.

But having several data sources can make the data more complicated to work with. Common problems include not being able to get Power BI to “see” parts of your data, or selection visuals not working as you expect. 

Power BI semantic models unlock more value from your data

A semantic model, or more simply a data model, puts data into a format that makes business sense. It is a business-friendly layer between the raw data and the report visuals. 

For example, if you are interested in sales trends, you are likely to want to know sales per day rather than deal with thousands of individual transactions. Creating a DAX measure called “DaySales” within the model makes the data easier to work with and improves accuracy. You might also want to view “DaySales” by attributes from other files, so the semantic model needs to include those relationships.
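In Power BI, “DaySales” would be a DAX measure evaluated inside the model. The equivalent roll-up, sketched in Python with pandas and invented figures, looks like this:

```python
import pandas as pd

# Thousands of individual transactions, reduced to three for illustration
sales = pd.DataFrame({
    "date":   pd.to_datetime(["2025-05-01", "2025-05-01", "2025-05-02"]),
    "amount": [120.0, 80.0, 50.0],
})

# The "DaySales" idea: one aggregated figure per day
day_sales = sales.groupby("date")["amount"].sum()
```

Defining the aggregation once in the model, rather than in each report, is what keeps the numbers consistent.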

Relationships may be one to many, or many to many, and the key to a good semantic model is how best to handle these relationships to enable meaningful analysis of the data. Data in a semantic model is often in a star schema format, with one or more fact tables, and several dimension tables that provide context to the data you want to analyse. Modelling data using a star schema provides a wide range of ways of analysing key metrics. 

So how do you go about creating a semantic model? 

Semantic models step-by-step

Creating a semantic model is a step-by-step process, which needs to be done in the right order:

1. Understand the business needs 
2. Clean and transform data
3. Create relationships, measures, calculated columns, and hierarchies 
4. Add security restrictions
5. Test and optimize.

The first, and arguably the most important, step is to understand the business needs. Who will be working with the data, and why? What are they trying to achieve? Who will view the reports or use the analysis? What restrictions should be included, for example should some people see only part of the data set? Analysing the business needs provides a wish-list of requirements that can then be used to create the semantic model. You may be tempted to skip this step, feeling that you know your own business. Although that may be true, analysing your requirements for data analysis is always worth the effort.

Secondly, the data must be clean. This means it must have the right data types, that duplicates are removed, and that decisions are made about any data that is missing or obviously wrong. As clever as Power BI is, it is only as good as the data it is given.

Thirdly, the semantic model is created by joining data tables to create a star schema, and adding DAX measures, aggregates, and additional columns. Columns can be renamed or hidden to make the data easier to work with. Hierarchies can be created to make common tasks easier, such as managing dates. 
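A toy star schema makes the third step concrete. In Power BI the joins are defined as model relationships rather than physical merges, but this Python sketch (with invented tables) shows the same shape:

```python
import pandas as pd

# One fact table...
fact_sales = pd.DataFrame({
    "product_id": [1, 1, 2],
    "store_id":   [10, 11, 10],
    "amount":     [100.0, 150.0, 60.0],
})
# ...and two dimension tables that give it context
dim_product = pd.DataFrame({"product_id": [1, 2], "category": ["Books", "Toys"]})
dim_store   = pd.DataFrame({"store_id": [10, 11], "region": ["North", "South"]})

# Each dimension relates one-to-many to the fact table
wide = fact_sales.merge(dim_product, on="product_id").merge(dim_store, on="store_id")
by_region = wide.groupby("region")["amount"].sum()
```

Every dimension you add gives another way of slicing the same facts, which is exactly why the star schema suits analysis so well.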

After the semantic model has been created, security such as row-level security can be added if needed. 

Finally, and crucially, the model needs to be tested to ensure it provides the expected results.

As with many things related to data, the order in which you do things is vital. Trying to work with raw data that hasn’t been cleaned will not produce good business results. Equally, providing high quality data that doesn’t make business sense will not produce good results either.

How to create the right semantic model for your needs

Power BI offers excellent visualization and analysis capabilities, making it the go-to tool for departments that want to make better use of their data. With the right semantic model, it’s easy to ask questions, dig deeper, and use AI to analyse your business data. Our step-by-step process allows you to figure out what you need from your data and assess what additional steps you need to optimize your reports. And if you want to understand more about why star schemas are so powerful when it comes to analysing data, have a look at Chris Adamson’s book “Star Schemas: The Complete Reference”. 

Here at Anatec AI we have many years of experience in modelling data, including business analysis and data preparation for both people and AI. So, if you think your data might not be in the right format to deliver your business goals, get in touch for a chat.