
Automating Unused Report Archival in Microsoft Fabric

  • Writer: Ishan Deshpande
  • 5 days ago
  • 4 min read

Organizations love reports. They multiply like rabbits in every Fabric workspace. Over time, many reports fall into the “just-in-case someone needs it” zone. Their datasets still refresh daily, consuming compute and increasing the chances of failed jobs, even though nobody has opened them in months.

 

There had to be a better way.


This blog explores a practical, automated solution in Microsoft Fabric that identifies unused reports, archives them, tracks everything in a log table, and eventually clears them out permanently. You will find a reference notebook with all code at the end of this blog.



Solution Goal


Automatically detect, archive, and clean up unused reports in Microsoft Fabric using:

  • Fabric Audit Logs

  • Semantic Link Labs Library

  • Fabric REST APIs

  • Fabric Notebooks



Prerequisites


To implement this solution, ensure the following are in place:

  1. Fabric Capacity

  2. Fabric Audit Logs captured for at least 60 days (refer to my previous blog for setup guidance)

  3. An account with the Fabric Admin role and Contributor access to all workspaces (for bulk workspace assignment, check my following post - Linkedin)

  4. Service Principal with the following permissions: Report.ReadWrite.All or Item.ReadWrite.All (credentials stored in Azure Key Vault)

  5. Archive Workspace - A regular Fabric workspace to temporarily store unused reports.

  6. A table named report_archive_history with the following columns - ReportId, WorkspaceId, ArchiveDate, ReportName, ArchivedReportName, DeletedOn and RestoredOn.
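For reference, the log table from point 6 could be created in a Lakehouse with a statement along these lines (the post only names the columns; the column types and Delta format are my assumption):

```sql
-- Hypothetical DDL for the archive history log (Delta table in a Fabric Lakehouse)
CREATE TABLE IF NOT EXISTS report_archive_history (
    ReportId            STRING,
    WorkspaceId         STRING,
    ReportName          STRING,
    ArchivedReportName  STRING,
    ArchiveDate         TIMESTAMP,
    DeletedOn           TIMESTAMP,
    RestoredOn          TIMESTAMP
) USING DELTA;
```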



High-Level Solution Design


  1. Identify reports that have not been viewed in the last 90–100 days (100 days is used here to account for reports viewed quarterly).

  2. Copy these reports to an Archive Workspace (Archived Reports remain connected to the original dataset).

  3. Insert activity records into a Report Archive History table.

  4. Delete the reports from the source workspace (only the reports, not the datasets).

  5. Retain archived reports for 60 days. If a user requests a report within this period, restore it to the original workspace; otherwise, delete it permanently after 60 days.

  6. Identify datasets that no longer serve any reports and remove them as well.

  7. Schedule this logic using Fabric Notebook to run weekly.



Complete Solution Walkthrough


This automation runs inside a Fabric Notebook and orchestrates the entire lifecycle: detecting unused reports, archiving them, logging activities, and cleaning up source workspaces. Below is a step-by-step narration of what happens under the hood.


  1. Import Required Libraries


We start by loading:

  • Semantic Link Labs library for quick report copy operations

  • Python utilities for REST API calls

  • Libraries for access token generation
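The import cell might look roughly like this (the screenshot is not reproduced here; package names are assumptions based on the steps above, and Semantic Link Labs is only available inside a Fabric notebook runtime):

```python
import json
import urllib.request  # stdlib HTTP client for the Fabric REST API calls
from datetime import datetime, timedelta, timezone

# Semantic Link Labs ships in the Fabric runtime (or via %pip install semantic-link-labs);
# guard the import so this sketch also loads outside Fabric.
try:
    import sempy_labs as labs  # report copy helpers
except ImportError:
    labs = None
```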


  2. Declare Global Variables


We define reusable configuration values such as:

  • Workspace ID of the Archive Workspace

  • Fabric service principal details

  • Azure Key Vault reference details
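A minimal sketch of such a configuration cell (all values and key names below are placeholders, not the actual ones from the notebook):

```python
# Reusable configuration for the run. Replace every value with your own
# workspace, service principal, and Key Vault details.
CONFIG = {
    "archive_workspace_id": "00000000-0000-0000-0000-000000000000",
    "key_vault_uri": "https://<your-vault>.vault.azure.net/",
    # Names of the secrets stored in Key Vault for the service principal:
    "tenant_id_secret": "fabric-sp-tenant-id",
    "client_id_secret": "fabric-sp-client-id",
    "client_secret_secret": "fabric-sp-client-secret",
    "unused_threshold_days": 100,  # no views in this window => archive
    "retention_days": 60,          # how long archived reports are kept
}
```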


  3. Fetch Secrets Securely from Key Vault


To avoid any credential exposure, the notebook retrieves all secrets from Azure Key Vault.

These will later authenticate calls to Microsoft Fabric REST APIs.
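Inside a Fabric notebook, secrets can be read with `notebookutils.credentials.getSecret`. The wrapper below is a sketch; the environment-variable fallback is purely my assumption so the function can also be exercised outside Fabric:

```python
import os

def get_secret(vault_uri, secret_name):
    """Fetch a secret from Azure Key Vault.

    In a Fabric notebook this delegates to notebookutils; the fallback
    (reading an environment variable) exists only so the sketch runs locally.
    """
    try:
        from notebookutils import credentials  # available in the Fabric runtime only
        return credentials.getSecret(vault_uri, secret_name)
    except ImportError:
        # Local fallback: "fabric-sp-client-id" -> env var FABRIC_SP_CLIENT_ID
        return os.environ.get(secret_name.upper().replace("-", "_"), "")
```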


  4. Identify Unused Reports


Based on the audit logs stored in a Fabric Lakehouse, we can build a list of unused reports.


Business logic: reports that have not been viewed in the last 90 days (the window can vary based on your business needs, but 90–100 days covers reports that are viewed on a quarterly basis).
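In the notebook this is a query over the audit-log table; the filtering logic itself boils down to something like the helper below (input shape and names are my assumption, not the notebook's actual code):

```python
from datetime import datetime, timedelta, timezone

def find_unused_reports(last_view_by_report, threshold_days=100, now=None):
    """Return report IDs whose most recent view (from the audit log) is
    older than `threshold_days`. Reports with no recorded view at all
    (None) are also treated as unused."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=threshold_days)
    return [report_id
            for report_id, last_view in last_view_by_report.items()
            if last_view is None or last_view < cutoff]
```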


  5. Copy Unused Reports into Archive Workspace


  • Loop through each unused report.

  • Copy it into Archive Workspace using Semantic Link Labs library.

  • Assign a timestamp suffix to avoid name collisions (because reports with identical names can exist in multiple workspaces).

  • Insert a record into the Archive History Log table, capturing:

    • Report name (original + archived)

    • Source workspace ID

    • Archive workspace ID

    • Archive timestamp


Note: Semantic Link Labs authenticates via the signed-in user account, so the user running this code needs to be a Fabric Admin. I tried Service Principal authentication for this action but encountered some issues. Still exploring solutions!
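The timestamp-suffix idea can be sketched as below; the suffix format is my assumption, and the commented call shows roughly where Semantic Link Labs' report-cloning helper would slot in:

```python
from datetime import datetime, timezone

def archived_report_name(report_name, when=None):
    """Build the archive-side name: original name plus a timestamp suffix,
    so identically named reports from different source workspaces do not
    collide in the Archive Workspace."""
    when = when or datetime.now(timezone.utc)
    return f"{report_name}_archived_{when.strftime('%Y%m%d%H%M%S')}"

# The copy itself would then use Semantic Link Labs, e.g. (not executed here,
# and the exact parameters are an assumption):
#   labs.report.clone_report(report, archived_report_name(report),
#                            workspace=source_ws, target_workspace=archive_ws)
```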


  6. Delete Reports from Source Workspace


Once a report has been copied, we need to delete it from the source workspace. Here we use a traditional REST API call to delete the report, authenticating with the service principal, because as of now Semantic Link Labs doesn't have a function to delete a report. The service principal needs the following permissions: Report.ReadWrite.All or Item.ReadWrite.All.
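The relevant endpoint is the Power BI "Reports - Delete Report In Group" API (`DELETE /v1.0/myorg/groups/{workspaceId}/reports/{reportId}`). A stdlib-only sketch that builds the request without sending it:

```python
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def build_delete_request(workspace_id, report_id, access_token):
    """Build (but do not send) the DELETE call for
    'Reports - Delete Report In Group'. The token must come from a
    service principal holding Report.ReadWrite.All or Item.ReadWrite.All."""
    url = f"{POWER_BI_API}/groups/{workspace_id}/reports/{report_id}"
    request = urllib.request.Request(url, method="DELETE")
    request.add_header("Authorization", f"Bearer {access_token}")
    return request

# To actually execute: urllib.request.urlopen(build_delete_request(...))
```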


The next two points will be covered in detail in part 2 of this solution.


  7. Delete Reports from Archive Workspace

After the defined retention period (e.g., 60 days) has elapsed, we delete the reports from the Archive Workspace and update the log table (report_archive_history) with the report's deletion date.
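The retention check amounts to scanning the log table for archived reports that are past the window and were neither restored nor already deleted. A sketch, assuming each row is a dict keyed by the table's column names:

```python
from datetime import datetime, timedelta, timezone

def expired_archives(history_rows, retention_days=60, now=None):
    """From report_archive_history rows, select archived reports whose
    retention window has lapsed and that were neither restored to their
    source workspace nor already purged."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [row for row in history_rows
            if row["ArchiveDate"] < cutoff
            and row.get("RestoredOn") is None
            and row.get("DeletedOn") is None]
```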


  8. Delete Datasets Associated with these Reports

Some datasets served only the reports that have now been removed. We locate such datasets using metadata and remove these stale compute consumers completely.
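Conceptually this is a set difference between all datasets in the workspace and the datasets still referenced by at least one surviving report (in practice the two ID lists would come from the Fabric/Power BI item metadata APIs, which is an assumption of this sketch):

```python
def orphaned_datasets(all_dataset_ids, dataset_ids_in_use):
    """Datasets that no longer back any report and are therefore safe
    to consider for removal. Sorted for deterministic output."""
    return sorted(set(all_dataset_ids) - set(dataset_ids_in_use))
```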



Conclusion


Unused reports are silent culprits. They sit unnoticed, refreshing daily, quietly consuming compute and nudging costs upward. As Microsoft Fabric environments scale, maintaining workspace hygiene is no longer a luxury. It becomes a core piece of governance.


This automated archival framework helps organizations:

  • Free up compute capacity and reduce refresh failures

  • Maintain a clean and manageable analytics ecosystem

  • Ensure accountability with a full audit trail

 

There is a lot more we could add to this, such as sending email alerts to report owners and authenticating everything with a service principal so there is no user dependency.

If you have ideas that can make this process smarter, faster, or more scalable, I would love to hear your perspective. Your insights could shape the next part of this solution and benefit the entire Fabric community.


Let’s build better governance together!

