What is the API limit for Microsoft Dataverse (Common Data Service)?


Microsoft writes that the Microsoft Dataverse (Common Data Service) API limits help ensure service levels, availability, and quality. To me, this is about best-practice architecture for consuming external APIs and good programming design. Let me explain.

The first thing to acknowledge is that there are two different limits that serve two different purposes. Microsoft refers to them as “Entitlement limits” and “Service protection limits”.

Entitlement limits are not enforced yet; read Microsoft’s statement on how and when the limits will “go live”. Service protection limits, on the other hand, have been enforced since roughly 2018.

Entitlement limits

The entitlement limits are there to help me focus on good programming design and skills.

The documentation says that entitlement limits are counted against the number of requests each user makes per day. So, in plain English, the limit relates to the users of the application. (See Assumptions at the bottom of this blog post.)


What counts toward the “API limit” is defined by Microsoft as all “requests include all data operations that interact with table rows where rows are created, retrieved, updated, or deleted (CRUD)”. No matter how the user initiates the CRUD operation (plug-ins, async workflows, custom controls, custom apps, $batch (ExecuteMultiple) operations, etc.), it counts toward this limit.

Different licenses give different entitlements:

User license                           CRUD entitlement / 24 hours
Dynamics 365 Enterprise applications   20,000
Dynamics 365 Professional              10,000
Dynamics 365 Team Member               5,000
Power Apps per user plan               5,000
Power Apps per app plan                1,000 per user pass
Power Automate per user plan (+ RPA)   5,000
Portals                                200 per user
Portals anonymous page views           3 per page view

API limits for Power Apps portals are a bit special, as they are pooled at the portal level depending on the number of logins/page views assigned to the portal.
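Based on the per-login and per-page-view figures in the table above, the pooled portal capacity can be sketched roughly as follows. This is my own back-of-the-envelope estimate, not an official Microsoft formula:

```python
def portal_daily_request_pool(logins: int, anonymous_page_views: int) -> int:
    """Rough estimate of a portal's pooled daily request entitlement,
    using the 200-per-login and 3-per-anonymous-page-view figures
    from the entitlement table. Not an official Microsoft formula."""
    return logins * 200 + anonymous_page_views * 3

# e.g. 500 logins and 10,000 anonymous page views in a day:
pool = portal_daily_request_pool(500, 10_000)  # 500*200 + 10_000*3 = 130,000
```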


Microsoft writes:

“If user exceeds their request entitlement the administrator would be notified and would be able to assign Power Apps and Power Automate request capacity to that user.

While occasional and reasonable overages will be tolerated, customers exceeding licensed capacity should adjust their purchase quantity per standard Microsoft terms to remain in compliance.”

Microsoft plays nice and communicates with the admin without taking the end users hostage.

Focus on good programming design and skills

Be proactive and code review custom-built plug-ins to ensure they perform as few CRUD operations as possible.

Code reviews that limit read and write operations will help you stay below the limit. (The limit equates to opening roughly two records a minute, assuming the user opens records constantly for 8 hours straight; see http://develop1.net/public/post/2019/09/04/how-do-the-powerplatform-api-limits-affect-model-driven-apps)
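The arithmetic behind that estimate can be sketched like this. The requests-per-record-open figure is an assumption on my part (opening a model-driven form triggers many individual requests, as the linked post explains), not an official number:

```python
ENTITLEMENT = 20_000            # Dynamics 365 Enterprise requests / 24 hours
WORKDAY_MINUTES = 8 * 60        # a user working 8 hours straight
REQUESTS_PER_RECORD_OPEN = 20   # assumption: one form open costs ~20 requests

# How many requests per minute the entitlement allows over a workday:
requests_per_minute = ENTITLEMENT / WORKDAY_MINUTES                       # ~41.7

# How many record opens per minute that translates to:
record_opens_per_minute = requests_per_minute / REQUESTS_PER_RECORD_OPEN  # ~2
```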

Do not use normal user accounts for automated data integration; use non-interactive users instead.

If the above fails, monitor the administrator notifications in the Office 365 message center for these events, and react by either limiting CRUD operations or procuring a larger entitlement per user to stay compliant.

Do you want insight into how close you are to these limits? Look at the Power Platform Analytics report, which lists the most active users (unique users) over time, i.e. those who performed an operation that caused one of these SDK calls: Retrieve, RetrieveMultiple, Delete, Create, or Update. Filter the downloaded Excel file to remove “non-interactive users” so that only “normal users” remain.
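As a sketch of that filtering step, assuming the report is exported as CSV and that the column name below matches your export (it may well differ):

```python
import csv
import io

def interactive_user_rows(report_csv: str, service_accounts: set) -> list:
    """Drop known non-interactive/service accounts from the exported
    'most active users' report so that only normal users remain.
    The 'User' column name is an assumption about the export format."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [row for row in reader if row["User"] not in service_accounts]

# Illustrative export with one normal user and one integration account:
export = (
    "User,Operations\n"
    "alice@contoso.com,1200\n"
    "svc-integration@contoso.com,90000\n"
)
rows = interactive_user_rows(export, {"svc-integration@contoso.com"})
# rows now holds only alice@contoso.com's row
```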


Service protection API limits

Microsoft writes that this limit is in place “To ensure consistent availability and performance for everyone” and that “The limits will not affect normal users of the platform”. As I understand it, the service protection limits are there to help me focus on best-practice architecture for using the Common Data Service APIs.

This limit relates mostly to “service accounts”, of which there are four types in Common Data Service:

Application users
Non-interactive users
Administrative users
SYSTEM user

Microsoft writes:

“Request limit are evaluated within a five-minute sliding window”
“Service protection limits apply to all external web service requests”
“Service protection API limits are not applied against API calls made within workflows, custom workflow activities, or plug-in code. These operations are invoked internally.”

Subscription                             Requests / 24 hours
Dynamics 365 Enterprise                  100,000
Dynamics 365 Professional                50,000
Microsoft Power Apps or Power Automate   25,000


Microsoft writes: “When this limit is exceeded, an exception will be thrown by the platform.” Code running under the application user must handle this exception in a retry setup. Microsoft also writes: “if a user or flow exceeds the limits consistently for an extended period (more than 14 days), that non-interactive user may be disabled, or flow turned off.”

For most large enterprise businesses this will be hard to estimate up front, as Microsoft writes: “Each web server available to your environment will enforce these limits independently. Most environments will have more than one web server. The actual number of web servers that are available to your environment depends on multiple factors that are part of the managed service Microsoft provide. One of the factors is how many user licenses you have purchased.”

Read how to handle this in your integration code, including details of the behavior, here: https://docs.microsoft.com/en-us/powerapps/developer/common-data-service/api-limits
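A minimal sketch of such a retry setup, assuming the platform exception carries the server-suggested Retry-After interval (the documentation linked above describes this behavior for HTTP 429 responses). The names below are illustrative, not an official SDK:

```python
import time

class ServiceProtectionError(Exception):
    """Illustrative stand-in for the platform's 'limit exceeded' exception
    (surfaced as HTTP 429 with a Retry-After header on the Web API)."""
    def __init__(self, retry_after_seconds: float):
        super().__init__(f"limit hit, retry after {retry_after_seconds}s")
        self.retry_after = retry_after_seconds

def call_with_retry(operation, max_retries: int = 3, sleep=time.sleep):
    """Run `operation`; on a service protection error, wait for the
    server-suggested interval and try again, up to `max_retries` times."""
    for attempt in range(max_retries + 1):
        try:
            return operation()
        except ServiceProtectionError as err:
            if attempt == max_retries:
                raise  # give up; let monitoring/alerting pick it up
            sleep(err.retry_after)

# Usage: wrap each Web API call made by the integration user, e.g.
# result = call_with_retry(lambda: webapi.create("accounts", payload))
# (webapi.create is a hypothetical client call, shown for illustration.)
```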

Best practice architecture on how to use external APIs

Use non-interactive users for integration.

Read more about Data Integration architecture for Microsoft Common Data Service, Dynamics 365, and PowerApps

Do not expect the APIs to always be up and running; build fault tolerance into your design.

Move away from batch imports and closer to event-driven, trigger-based data imports.

Set up monitoring on this “exception” so you know when the limit is reached and is affecting integrations, etc.

Source of information

There are more details to this topic than I write here, but they only add detail; they do not change the overall picture given in this blog post. As always, customers are responsible for following the licensing guide and the latest documentation, so please read those and use this blog post only as a pointer to what to read in the official documentation or licensing guides.




Power Apps and Power Automate Licensing Guide.

Community Paranoia

Historically, the documentation and communication from Microsoft about these API limits has caused, and still causes, some “paranoia”. Over the last year or so, people have learned the real impact of these limits (or rather, the lack of negative impact), so blogging on this topic has quieted down.

On top of this, the documentation has become better, though it is not perfect. You will find community blog posts with different angles and perspectives, most of them from 2018-19, because that is when the most ambiguity was found in Microsoft’s communication and documentation. Find most of them with this search: https://www.google.com/search?q=dynamics+365+api+limits

ISV Partners on the topic:




Assumptions taken in this blog post

It is unclear in the documentation, as of 14 Nov. 2020, what the term “users” covers in the text regarding entitlement limits. I assume the entitlement limits apply to “end users/interactive users” of the product and not to application users etc. (non-interactive users), as non-interactive users are covered by the service protection limits.

Please write a comment below if you still have API limit paranoia after reading this blog post; I might have missed something important.

6 thoughts on “What is the API limit for Microsoft Dataverse (Common Data Service)?”

    1. The service protection limit relates mostly to “service accounts/integration users”, of which there are four types in Common Data Service:
      Application users

      Non-interactive users

      Administrative users

      SYSTEM user

      1. Maybe this is a stupid question, but does that mean that an integration user is limited to 100,000 API calls per 24 hours, no matter how many users you have? How will you then do an initial load of, for example, 2 million records?
