Automating Unused Report Archival in Microsoft Fabric
- Ishan Deshpande


Organizations love reports. They multiply like rabbits in every Fabric workspace. Over time, many reports fall into the “just-in-case someone needs it” zone. Their datasets still refresh daily, consuming compute and increasing chances of failed jobs, even though nobody has opened them in months.
There had to be a better way.
This blog explores a practical, automated solution in Microsoft Fabric that identifies unused reports, archives them, tracks everything in a log table, and eventually clears them out permanently. You will find a reference notebook with all code at the end of this blog.
Solution Goal
Automatically detect, archive, and clean up unused reports in Microsoft Fabric using:
- Fabric Audit Logs
- Semantic Link Labs library
- Fabric REST APIs
- Fabric Notebooks
Prerequisites
To implement this solution, ensure the following are in place:
- Fabric Capacity
- Fabric Audit Logs captured for at least 60 days (refer to my previous blog for setup guidance)
- An account with the Fabric Admin role and contributor access to all workspaces (for bulk workspace assignment, check the following post - LinkedIn)
- Service Principal with the Report.ReadWrite.All or Item.ReadWrite.All permission (credentials stored in Azure Key Vault)
- Archive Workspace - a regular Fabric workspace to temporarily store unused reports
- A table named report_archive_history with the following columns: ReportId, WorkspaceId, ArchiveDate, ReportName, ArchivedReportName, DeletedOn and RestoredOn
High-Level Solution Design
1. Identify reports that have not been viewed in the last 90–100 days (100 days is used here to account for quarterly-used reports).
2. Copy these reports to an Archive Workspace (archived reports remain connected to the original dataset).
3. Insert activity records into a Report Archive History table.
4. Delete the reports from the source workspace (only the reports, not the datasets).
5. Retain archived reports for 60 days. If any user requests a report within this period, restore it to the original workspace; otherwise, delete it permanently after 60 days.
6. Identify datasets that no longer serve any reports and remove them as well.
7. Schedule this logic in a Fabric Notebook to run weekly.
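The steps above can be sketched as a single weekly driver. This is a minimal outline only; every function name here is an illustrative placeholder to be filled in by the notebook, not a Fabric API:

```python
# Minimal sketch of the weekly archival driver. All step functions are
# illustrative placeholders supplied by the notebook, not Fabric APIs.
def run_weekly_archival(find_unused_reports, copy_to_archive,
                        log_archive_history, delete_from_source):
    """Archive every unused report and return the new archive names."""
    archived = []
    for report in find_unused_reports():
        archived_name = copy_to_archive(report)      # step 2: copy to archive
        log_archive_history(report, archived_name)   # step 3: log the activity
        delete_from_source(report)                   # step 4: remove the original
        archived.append(archived_name)
    return archived
```

Keeping the steps as injected functions makes each one independently testable before wiring the driver into a weekly schedule.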
Complete Solution Walkthrough
This automation runs inside a Fabric Notebook and orchestrates the entire lifecycle: detecting unused reports, archiving them, logging activities, and cleaning up source workspaces. Below is a step-by-step narration of what happens under the hood.
Import Required Libraries

We start by loading:
- Semantic Link Labs library for quick report copy operations
- Python utilities for REST API calls
- Libraries for access token generation
Declare Global Variables

We define reusable configuration values such as:
- Workspace ID of the Archive Workspace
- Fabric service principal details
- Azure Key Vault reference details
Fetch Secrets Securely from Key Vault

To avoid any credential exposure, the notebook retrieves all secrets from Azure Key Vault.
These secrets later authenticate calls to the Microsoft Fabric REST APIs.
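In a Fabric notebook the usual way to read a secret is notebookutils.credentials.getSecret. The helper below takes that getter as a parameter so the logic stays testable outside Fabric; the secret names are assumptions matching a typical setup:

```python
def fetch_spn_credentials(get_secret, vault_url):
    """Read service principal credentials from Key Vault.

    In a Fabric notebook, pass notebookutils.credentials.getSecret as
    `get_secret`. The secret names below are placeholders - use whatever
    names you created in your own vault.
    """
    return {
        "tenant_id": get_secret(vault_url, "spn-tenant-id"),
        "client_id": get_secret(vault_url, "spn-client-id"),
        "client_secret": get_secret(vault_url, "spn-client-secret"),
    }
```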
Identify Unused Reports

Based on the Audit Logs stored in a Fabric Lakehouse, we can build a list of unused reports.
Business logic - reports that have not been viewed in the last 90 days (the window can vary based on your business logic, but 90–100 days covers reports that are viewed on a quarterly basis).
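The selection itself is just a date filter over the view events. A minimal sketch in plain Python - in the notebook this would typically be a Spark SQL query over the Lakehouse audit table, and the event shape here is illustrative:

```python
from datetime import datetime, timedelta

def find_unused_reports(view_events, unused_days=90, now=None):
    """Return report IDs whose most recent view is older than the threshold.

    `view_events` is an iterable of (report_id, viewed_at_datetime) pairs;
    the field names and shape are illustrative, not an audit-log schema.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=unused_days)
    latest = {}
    for report_id, viewed_at in view_events:
        # Keep only the most recent view per report.
        if report_id not in latest or viewed_at > latest[report_id]:
            latest[report_id] = viewed_at
    return [rid for rid, viewed_at in latest.items() if viewed_at < cutoff]
```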
Copy Unused Reports into Archive Workspace

- Loop through each unused report.
- Copy it into the Archive Workspace using the Semantic Link Labs library.
- Assign a timestamp suffix to avoid name collisions (reports with identical names can exist in multiple workspaces).
- Insert a record into the Archive History Log table capturing:
  - Report name (original + archived)
  - Source workspace ID
  - Archive workspace ID
  - Archive timestamp
Note: Semantic Link Labs authenticates via the signed-in user account, so whoever runs this code needs to be a Fabric Admin. I tried Service Principal authentication for this action but encountered some issues. Still exploring solutions!
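The timestamp suffix can be generated as below. The clone call shown in the comment assumes Semantic Link Labs' report-cloning helper; the exact function name and signature may differ by library version, so treat it as a sketch:

```python
from datetime import datetime

def archived_report_name(report_name, now=None):
    """Append a UTC timestamp suffix so identically named reports from
    different workspaces do not collide in the archive workspace."""
    now = now or datetime.utcnow()
    return f"{report_name}_{now:%Y%m%d%H%M%S}"

# In the notebook, the copy itself would use Semantic Link Labs, e.g.
# (function name/signature may vary by sempy_labs version - verify first):
#   import sempy_labs.report as rep
#   rep.clone_report(report=name,
#                    cloned_report=archived_report_name(name),
#                    workspace=source_workspace_id,
#                    target_workspace=ARCHIVE_WORKSPACE_ID)
```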
Delete Reports from Source Workspace

Once a report has been copied, we delete it from the source workspace. Here we use a plain REST API call, authenticated with the service principal, because as of now Semantic Link Labs does not provide a function to delete a report. The service principal needs the Report.ReadWrite.All or Item.ReadWrite.All permission.
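A minimal sketch of that call, using the Fabric REST delete-report endpoint (check the current Fabric REST API reference for the exact route in your tenant; `requests` is assumed available in the notebook runtime):

```python
import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def report_delete_url(workspace_id, report_id):
    """Build the Fabric REST endpoint for deleting a single report."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/reports/{report_id}"

def delete_report(token, workspace_id, report_id):
    """Delete a report (not its dataset) using a service principal token."""
    resp = requests.delete(
        report_delete_url(workspace_id, report_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()  # surface failures instead of deleting silently
```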
The next two points will be covered in detail in part 2 of this solution.
Delete Reports from Archive Workspace
After the defined retention period (e.g., 60 days) is over, we delete the reports from the Archive Workspace and update the log table (report_archive_history) with the deletion date.
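Selecting which archived reports are due is a filter over the report_archive_history records. A minimal sketch, reusing the table's ArchiveDate, DeletedOn and RestoredOn columns (record shape is illustrative):

```python
from datetime import datetime, timedelta

def records_due_for_deletion(records, retention_days=60, now=None):
    """Return archive-history records whose retention window has expired.

    Each record is a mapping with the table's columns; rows already deleted
    or restored to their original workspace are skipped.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retention_days)
    return [r for r in records
            if r["ArchiveDate"] <= cutoff
            and r.get("DeletedOn") is None
            and r.get("RestoredOn") is None]
```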
Delete Datasets Associated with these Reports
Some datasets served only the reports that have now been removed. We locate such datasets using metadata and remove these stale compute consumers completely.
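The orphan check itself is a set difference between all datasets and those still referenced by at least one report. A minimal sketch with illustrative field names:

```python
def orphaned_datasets(dataset_ids, reports):
    """Return dataset IDs no longer referenced by any report.

    `dataset_ids` is an iterable of dataset IDs; each report is a mapping
    with a 'dataset_id' key (field names are illustrative - in Fabric this
    metadata would come from the workspace item listings).
    """
    in_use = {r["dataset_id"] for r in reports}
    return [d for d in dataset_ids if d not in in_use]
```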
Conclusion
Unused reports are silent culprits. They sit unnoticed, refreshing daily, quietly consuming compute and nudging costs upward. As Microsoft Fabric environments scale, maintaining workspace hygiene is no longer a luxury. It becomes a core piece of governance.
This automated archival framework helps organizations:
- Free up compute capacity and reduce refresh failures
- Maintain a clean and manageable analytics ecosystem
- Ensure accountability with a full audit trail
There is a lot more we can do here, such as sending email alerts to report owners and authenticating everything with a service principal so there is no user dependency.
If you have ideas that can make this process smarter, faster, or more scalable, I would love to hear your perspective. Your insights could shape the next part of this solution and benefit the entire Fabric community.
Let’s build better governance together!


