Reducing Dataverse / Dynamics Storage Capacity

Power Platform storage seems to get consumed very quickly. I have had to come up with capacity management strategies for numerous clients over the years. I really do wish Microsoft would increase the storage capacity in the Power Platform.


In this blog post, I will discuss the Power Platform capacity management strategies I have used the most. These strategies sit on top of, or overlap with, the standard recommendations, which can be found at https://docs.microsoft.com/en-us/power-platform/admin/free-storage-space. In summary, the recommendations are:


  1. Reduce log capacity

  2. Delete audit history log files

  3. Delete plugin trace logs

  4. Reduce file capacity

  5. Delete email attachments

  6. Delete note attachments

  7. Reduce database capacity

  8. Delete emails

  9. Delete suspended workflows

  10. Delete bulk imports

  11. Delete old process sessions

  12. Delete old completed and/or failed system jobs
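Many of these recommendations boil down to the same operation: bulk-deleting rows older than a cutoff date. As a minimal sketch (the helper function is mine; `plugintracelog` and `createdon` are standard Dataverse names), the FetchXML query that feeds a Dataverse bulk-delete job could be built like this:

```python
from datetime import datetime, timedelta, timezone

def build_cleanup_fetchxml(entity: str, date_attribute: str, older_than_days: int) -> str:
    """Build a FetchXML query selecting rows older than a cutoff date.

    The resulting query can be handed to Dataverse's BulkDelete action to
    create a (recurring) bulk-deletion job.
    """
    cutoff = (datetime.now(timezone.utc) - timedelta(days=older_than_days)).strftime("%Y-%m-%d")
    return (
        f'<fetch><entity name="{entity}">'
        f'<filter><condition attribute="{date_attribute}" operator="on-or-before" value="{cutoff}" />'
        f"</filter></entity></fetch>"
    )

# e.g. target plug-in trace logs older than 30 days
fetch = build_cleanup_fetchxml("plugintracelog", "createdon", 30)
```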


Files

Client 1 was a call centre with server-side sync configured on several support email accounts.


Emails were stored in three places: all emails were stored in Dynamics and Outlook, and the important ones were added to the client's record management system, as per the client's business processes. This meant the same "record" was stored in up to three systems!


To reduce Power Platform storage consumption, we decided to delete emails over six weeks old and over a specific size, given the emails could be found in the other systems if required. This dramatically reduced file storage consumption while still allowing the client to, for the most part, get a 360-degree view of the customer.
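The retention rule itself is simple enough to express in a few lines. This sketch uses an illustrative 1 MB size threshold; the exact cut-offs will vary by client:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Policy thresholds from the scenario above: older than six weeks AND over a
# size limit. The 1 MB figure is illustrative, not from the original policy.
MAX_AGE = timedelta(weeks=6)
MAX_SIZE_BYTES = 1 * 1024 * 1024

def should_delete_email(created_on: datetime, attachment_bytes: int,
                        now: Optional[datetime] = None) -> bool:
    """Return True if an email qualifies for deletion under the retention policy."""
    now = now or datetime.now(timezone.utc)
    return (now - created_on) > MAX_AGE and attachment_bytes > MAX_SIZE_BYTES
```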


Alternatively, if deleting attachments had not been feasible for regulatory or compliance reasons, we could have implemented an archive process to move attachments from the Power Platform to Azure Blob Storage or SharePoint, using Power Automate, a C# batch process in Azure, Azure Functions, or Azure Logic Apps.
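As a sketch of the archive step: Dataverse exposes attachment content as a base64-encoded `body` attribute, so the core of the job is decoding that and choosing a destination name. The record-id-prefixed blob layout here is my own illustrative convention, not a Dataverse or Azure requirement:

```python
import base64

def attachment_to_blob(record_id: str, filename: str, body_b64: str):
    """Convert a Dataverse attachment (base64 'body' attribute) into a
    blob name and raw bytes ready for upload to Azure Blob Storage.

    Prefixing the blob name with the parent record id keeps all of a
    record's attachments grouped together (an illustrative convention).
    """
    blob_name = f"{record_id}/{filename}"
    return blob_name, base64.b64decode(body_b64)
```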



System Logs and Environment Management

Client 2 was going through a digital transformation, and its new Power Platform applications were heavily integrated with other systems. The integrations caused a large number of workflows to run, resulting in a large number of audit logs, process sessions, plugin trace logs and system job logs. Additionally, the client was copying production to many environments: dev1, dev2, dev3, build, test, UAT and training. Copying an already large Power Platform environment to that many environments compounded the storage capacity consumption.


To reduce Power Platform storage consumption, we set up a daily job in production to delete process sessions over a month old, together with completed and failed system jobs over a month old.
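For the system-job clean-up, the selection criteria can be expressed as a Web API `$filter` over the standard `asyncoperation` entity. A minimal sketch (the helper function is mine; state code 3 is the entity's standard "Completed" state):

```python
from datetime import datetime, timedelta, timezone

def async_cleanup_filter(older_than_days: int) -> str:
    """Build an OData $filter selecting completed system jobs (asyncoperation
    rows, statecode 3) that finished before the cutoff, suitable for a Web API
    query that feeds a delete loop.
    """
    cutoff = (datetime.now(timezone.utc) - timedelta(days=older_than_days)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"statecode eq 3 and completedon le {cutoff}"
```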


To keep storage to a minimum in the nonproduction environments, we implemented an aggressive policy to delete all audit logs, process sessions, system jobs, plugin trace logs and business data not required for testing.


Virtual Entities (the most exciting storage resolution, though we should never have got to this stage)

Client 3 was importing millions of records: millions of customers, each with a high number of associated case records. This resulted in an astronomical amount of database storage being consumed.


The simplest solution would have been to delete all cases two years old or older. However, the client was certain they required the full history of each customer.


To reduce Power Platform storage consumption, we moved all cases over a year old to Azure table storage (which is very cheap) via a virtual entity. This was amazing for reducing the database storage capacity, but it did have other challenges. Most notably, the business had to accept that the "archived cases" virtual entity would be visible to all users as virtual entities can only be secured at the organisation level. The second challenge was that reporting had to be done in Power BI as it was difficult to union the cases and "archived cases" into one dataset.
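The mapping from a case row to an Azure Table storage entity is where the key design decision sits. A sketch, with illustrative field choices: partitioning by customer keeps all of a customer's archived cases in one partition, which suits the virtual entity's "archived cases for this customer" queries:

```python
def case_to_table_entity(case: dict) -> dict:
    """Map a Dataverse case row to an Azure Table storage entity.

    PartitionKey = customer id (one partition per customer, so all of a
    customer's archived cases can be read with a single partition-scoped
    query); RowKey = case id. The extra properties carried over are an
    illustrative subset, not the full case schema.
    """
    return {
        "PartitionKey": case["customerid"],
        "RowKey": case["incidentid"],
        "Title": case.get("title", ""),
        "CreatedOn": case["createdon"],
    }
```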


In summary, I have outlined how I solved Dataverse storage capacity issues for three clients. These are the most common ways I have solved storage capacity issues across many clients, and I hope this gives you, the reader, some ideas for reducing storage capacity in your own Dataverse tenants.
