how to utilise semantic-link-labs to gather detailed information about all your semantic models.
introduction.
At Fabcon ’24 in Stockholm, I attended a session by Sandeep and Michael presenting what had just been released to semantic-link-labs – it was impressive, to say the least.
So impressive that, when the session was over, I immediately pulled out my phone and added to my blog idea list: “Check out semantic-link-labs and see what you can do with it”.
And here we are! Below is a script that uses semantic-link-labs to save data about all accessible semantic models into a delta table in a Lakehouse. The script exposes information such as the creation date, whether it is a default semantic model, whether RLS is applied, how many measures it contains, how many reports are built on top of it, and much more. At the very end of this article, you can find an example screenshot showing all columns produced by the script. Feel free to add to it or adapt it!
For some inspiration, I also built a Power BI report on top of it. Let me know in the comments below if you want a copy of the report and I’ll send it to you via email. There will also be an equivalent post where we collect all sorts of information about all Power BI reports in our Fabric tenant – utilising semantic-link-labs for this, too. You can see the report part in the visual below as well.
prerequisites.
1. A Fabric capacity and workspace
2. A Fabric Lakehouse
plan of action.
1. What’s the goal?
The goal is to utilise semantic-link-labs to get information about all our semantic models in Fabric (that is, all semantic models we have access to). The idea is to store this information in a delta table and to build a Power BI report on top of it, effectively enhancing our governance capabilities: with this, we can easily check whether our analytics engineers and Power BI developers stick to the guidelines provided, e.g. are there models with calculated columns or tables? ;-)
2. The script
Below are the snippets of the script. Just create a notebook and copy and paste the snippets. Before executing, make sure to attach the notebook to the Lakehouse where the delta table should reside. Feel free to set up a schedule for the notebook.
We start off by installing semantic-link-labs.
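In a Fabric notebook, this is a single cell using the `%pip` magic (the package name on PyPI is `semantic-link-labs`); outside a notebook, the equivalent plain pip command works too. A minimal sketch:

```shell
# Inside a Fabric notebook cell you would write: %pip install semantic-link-labs
pip install semantic-link-labs
```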
Next, we import all packages needed.
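A sketch of the import cell. Note that `sempy` (semantic-link) and `sempy_labs` (semantic-link-labs) are only available inside a Microsoft Fabric notebook runtime, so I guard the import here for anyone experimenting locally:

```python
# pandas is used for all the dataframe wrangling below.
import pandas as pd

try:
    import sempy.fabric as fabric   # semantic-link core
    import sempy_labs as labs       # semantic-link-labs
except ImportError:
    # Not running inside a Fabric notebook; the Fabric-specific calls
    # below will not work in this case.
    fabric = labs = None
```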
Now, we get all our semantic models and do some basic cleansing.
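A sketch of this step, assuming the `sempy.fabric` functions `list_workspaces()` and `list_datasets(workspace=...)` (both part of semantic-link); the column names (`Name`, `Dataset Id`) are illustrative and may differ between versions, so check against your own output. The Fabric call is kept in its own function so the cleansing logic stays plain pandas:

```python
import pandas as pd


def get_semantic_models() -> pd.DataFrame:
    """Fetch all accessible semantic models across all workspaces.

    Runs only inside a Fabric notebook; sempy is imported lazily so
    the cleansing step below can be used (and tested) anywhere.
    """
    import sempy.fabric as fabric

    frames = []
    for _, ws in fabric.list_workspaces().iterrows():
        datasets = fabric.list_datasets(workspace=ws["Name"])
        datasets["Workspace Name"] = ws["Name"]
        frames.append(datasets)
    return pd.concat(frames, ignore_index=True)


def cleanse(models: pd.DataFrame) -> pd.DataFrame:
    """Basic cleansing: de-duplicate on the dataset id and normalise
    column names to underscores for easier handling downstream."""
    out = models.drop_duplicates(subset=["Dataset Id"]).copy()
    out.columns = [c.strip().replace(" ", "_") for c in out.columns]
    return out
```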
With this rather basic “semantic model” dataframe, we can loop over each row and retrieve even more details about our models, e.g. the number of calculated columns or tables, whether it is a default semantic model or whether RLS is applied.
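The loop can be sketched like this. The details-fetcher is passed in as a callable so the loop itself is testable; `fabric_details` shows what such a fetcher could look like, assuming the API functions `fabric.list_measures`, `fabric.get_roles` and `labs.is_default_semantic_model` – verify the exact names and signatures against the current sempy / sempy-labs documentation. The underscore column names match the cleansed dataframe from the previous step:

```python
from typing import Callable

import pandas as pd


def enrich_models(models: pd.DataFrame,
                  get_details: Callable[[str, str], dict]) -> pd.DataFrame:
    """Loop over the basic model dataframe and collect per-model details.

    `get_details(dataset_name, workspace_name)` returns a dict of extra
    columns, e.g. the number of measures or whether RLS is applied.
    """
    rows = []
    for _, row in models.iterrows():
        details = get_details(row["Dataset_Name"], row["Workspace_Name"])
        details["Dataset_Id"] = row["Dataset_Id"]  # join key for later
        rows.append(details)
    return pd.DataFrame(rows)


def fabric_details(dataset: str, workspace: str) -> dict:
    """Details via semantic-link(-labs); Fabric runtime only."""
    import sempy.fabric as fabric
    import sempy_labs as labs

    measures = fabric.list_measures(dataset=dataset, workspace=workspace)
    roles = fabric.get_roles(dataset=dataset, workspace=workspace)
    return {
        "Measure_Count": len(measures),
        "Has_RLS": len(roles) > 0,
        "Is_Default_Semantic_Model": labs.is_default_semantic_model(
            dataset, workspace=workspace),
    }
```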
Lastly, we join the basic and the enriched dataframe.
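The join itself is a plain pandas merge on the dataset id (column name assumed from the earlier steps); a left join keeps every model even if the enrichment step failed for some of them:

```python
import pandas as pd


def join_model_info(basic: pd.DataFrame,
                    enriched: pd.DataFrame) -> pd.DataFrame:
    """Left-join the per-model details onto the basic dataframe."""
    return basic.merge(enriched, on="Dataset_Id", how="left")
```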
And finally, we are ready to save our semantic model meta info as a delta table to our attached Lakehouse.
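A sketch of the save step, assuming the notebook is attached to the target Lakehouse (the table name `semantic_model_info` is my choice, not fixed). Spark is imported lazily because it only exists in the Fabric runtime:

```python
def save_to_lakehouse(df, table_name: str = "semantic_model_info") -> None:
    """Overwrite the meta info as a delta table in the attached Lakehouse."""
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    spark_df = spark.createDataFrame(df)
    (spark_df.write
        .format("delta")
        .mode("overwrite")
        .option("overwriteSchema", "true")  # allow new columns on re-runs
        .saveAsTable(table_name))
```

`overwriteSchema` makes scheduled re-runs tolerant of added columns if you extend the script later.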
Here is how the table looks after loading it into Power BI:
end.
And this is it! Once again, let me know if you want a copy of the report. Otherwise, I highly encourage you to check out what else semantic-link-labs can do.