Data Lake Analytics takes the complexity out of managing distributed infrastructure and writing distributed code. It dynamically provisions resources and lets you run analytics on exabytes of data. When a job completes, it winds down resources automatically, and you pay only for the processing power used. As the size of the data stored or the amount of compute used grows or shrinks, you don't have to rewrite code.

Many of the default limits listed below can be raised for your subscription by contacting support.
| Resource | Default limit | Comments |
| --- | --- | --- |
| Maximum number of concurrent jobs | 3 | |
| Maximum units of parallelism per account | 60 | Use any combination of up to 60 units of parallelism across up to three concurrent jobs (see the sketch below). |
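
To make the interaction between these two defaults concrete, here is a minimal Python sketch. It is not an Azure SDK or CLI call; the limit constants and the `can_run_concurrently` helper are hypothetical names used only to illustrate how the concurrent-job limit and the per-account parallelism limit combine.

```python
# Illustrative only: checks whether a batch of jobs fits within the default
# Data Lake Analytics limits described in the table above.

DEFAULT_MAX_CONCURRENT_JOBS = 3   # default concurrent-job limit
DEFAULT_MAX_PARALLELISM = 60      # default units of parallelism per account


def can_run_concurrently(parallelism_per_job,
                         max_jobs=DEFAULT_MAX_CONCURRENT_JOBS,
                         max_parallelism=DEFAULT_MAX_PARALLELISM):
    """Return True if the jobs could run at the same time under both limits."""
    if len(parallelism_per_job) > max_jobs:
        return False                                     # too many concurrent jobs
    return sum(parallelism_per_job) <= max_parallelism   # combined parallelism budget


# Examples with the default limits:
print(can_run_concurrently([20, 20, 20]))      # True:  3 jobs, 60 units in total
print(can_run_concurrently([30, 40]))          # False: only 2 jobs, but 70 units in total
print(can_run_concurrently([10, 10, 10, 10]))  # False: 4 concurrent jobs
```

As the examples show, either limit alone can block a submission: a fourth concurrent job is rejected even if plenty of parallelism remains, and two jobs are rejected if together they ask for more than 60 units.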