Since the data (source code, pipeline definitions, work item data) is stored somewhere in the cloud (Azure DevOps manages that), do we need to pay extra for that storage?
With TFS you have to store the data in a local SQL Server instance, but with Azure DevOps Services the cloud storage is free; you don't pay extra for it. However, there are some limits you should be aware of:
Artifact (package feed) limits:
5000 versions per package ID; use retention policies to automatically clean up old versions
Unlimited package IDs per feed
NuGet packages are limited to 500 MB
npm packages are limited to 500 MB
Maven packages are limited to 500 MB per file
Python packages are limited to 500 MB per file
Universal Packages have been tested up to 1 TB and are recommended for managing large binary content
Git limits:
We impose a few resource limits on Git repositories in Azure Repos. Our goal is to ensure reliability and availability for all customers. Also, by keeping the amount of data and number of pushes reasonable, you can expect to have a better overall experience with Git.
Repository size
Repositories should generally be no larger than 10 GB. However, in uncommon circumstances repositories may be larger; the Windows repository, for instance, is at least 300 GB. For that reason, there is no hard block in place.
Push size
Very large pushes use up a lot of resources, blocking or slowing other parts of the service. Such pushes often don't map to normal software development activities. Someone may have inadvertently checked in build outputs or a VM image, for example. For these reasons and more, pushes are limited to 5 GB at a time.
Git LFS content doesn't count toward the 5 GB push limit; that limit applies only to files stored in the repository itself, not to blobs stored via LFS. If your pushes fail against the 5 GB limit, verify that your .gitattributes file includes the extensions of the files you mean to track with LFS, and that this file was saved and staged before you staged the large files.
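To make the ordering point concrete, here is a minimal sketch of what tracking rules look like. The file patterns are hypothetical examples; the rule lines shown are what `git lfs track` would normally write for you.

```shell
# Illustrative only: the tracking rules that `git lfs track "*.psd"`
# and `git lfs track "*.iso"` would write into .gitattributes.
cat > .gitattributes <<'EOF'
*.psd filter=lfs diff=lfs merge=lfs -text
*.iso filter=lfs diff=lfs merge=lfs -text
EOF

# Stage the rules BEFORE staging the large files, or the push will
# still count them against the 5 GB limit:
#   git add .gitattributes
#   git add design/mockup.psd   # hypothetical large file
#   git commit -m "Track large binaries via LFS"
cat .gitattributes
```

The key detail is the order: the .gitattributes rules must be in place when the large files are staged, otherwise Git stores the full blobs in the repository instead of LFS pointers.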
There are also limits on work items, project counts, and request rates.
When we build using Azure Pipelines, the builds happen on a virtual machine managed by Azure DevOps somewhere in the cloud. Does this mean we need to pay for this usage of build machines?
No, that's not necessary.
There are two kinds of build agents: Microsoft-hosted agents and self-hosted agents.
If you want to run the pipeline on your own local machine, use a self-hosted agent.
If you want to run the pipeline on a VM in the cloud, use a Microsoft-hosted agent (a VM managed by Azure DevOps). Both kinds of agent are free; what you need to pay attention to is parallel jobs.
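The choice between the two is just the pool setting in the pipeline definition. A minimal sketch, assuming a hypothetical azure-pipelines.yml (the pool name is a placeholder):

```yaml
# Microsoft-hosted agent: Azure DevOps provisions a fresh VM per run.
pool:
  vmImage: 'ubuntu-latest'

# Self-hosted agent: a machine you register into your own agent pool.
# pool:
#   name: 'MyOnPremPool'   # hypothetical pool name

steps:
- script: echo "build steps run here"
```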
In Azure Pipelines, a parallel job is the capacity to run one job at a time. Running several jobs one after another is free, but if you have a specific reason to run several jobs simultaneously, you need to pay for the extra parallel jobs. See Run parallel jobs.
For Microsoft-hosted parallel jobs, every organization gets a free tier of service by default:
Public project: 10 free Microsoft-hosted parallel jobs that can run for up to 360 minutes (6 hours) each time, with no overall time limit per month.
Private project: One free job that can run for up to 60 minutes each time, until you've used 1,800 minutes (30 hours) per month.
Self-hosted parallel jobs:
Public project: Unlimited parallel jobs.
Private project: One self-hosted parallel job. Additionally, for each active Visual Studio Enterprise subscriber who is a member of your organization, you get one additional self-hosted parallel job.
When the free tier is no longer sufficient:
Private project: You can pay for additional capacity per parallel job. Buy self-hosted parallel jobs.
There are no time limits on self-hosted jobs.
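To illustrate how parallel-job capacity affects a pipeline, here is a hypothetical YAML fragment with two independent jobs. With two parallel jobs available they run simultaneously; with only the free tier's single parallel job, the second is queued and runs after the first:

```yaml
jobs:
- job: Build
  steps:
  - script: echo "building"
- job: Test
  steps:
  - script: echo "testing"

# To force sequential execution (which fits in one parallel job),
# add an explicit dependency:
# - job: Test
#   dependsOn: Build
```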
In addition, see Azure DevOps benefits for Visual Studio subscribers.