In my opinion there is no single best practice for doing it one way or the other; instead, there are other considerations to weigh, such as security and deployment duration.
From a security point of view, consider that the agent's machine must have access to your TFS/VSTS account. If it is an on-premises TFS server, this is not an issue; if it is VSTS, you will have to discuss it with your infrastructure team.
If you use one dedicated server for deployment, all the work has to be done remotely on the target servers. For that, the VSTS tasks use WinRM, which must be enabled on each target machine. These tasks require administrative access to your servers, so the task configuration interface (in the web browser) will ask you for an administrator account and password. This is another point better discussed with IT, because you will be storing sensitive information in your TFS/VSTS system.
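As a rough sketch of the WinRM prerequisite, you would run something like the following in an elevated PowerShell prompt on each target machine. The exact listener and firewall configuration you need (HTTP vs. HTTPS, trusted hosts for non-domain machines) depends on how the deployment task is configured, so treat this as a starting point rather than a complete setup:

```shell
# Enable the WinRM service, create a default listener, and open the
# required firewall rules (run elevated on each TARGET machine):
Enable-PSRemoting -Force

# Equivalent classic one-liner:
winrm quickconfig -q

# Verify which listeners WinRM has configured:
winrm enumerate winrm/config/listener
```

For VSTS tasks that connect over HTTPS, or to machines outside the domain, you would additionally need an HTTPS listener with a certificate, which the commands above do not create.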
When you have the agent installed on all servers, deployments run locally after the artifacts have been downloaded. This is faster than deploying remotely over WinRM (I have experienced this myself). But you will have to grant every agent machine access to your TFS/VSTS account.
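For the agent-per-server approach, configuring the agent looks roughly like this. The `<account>` URL, PAT, pool, and agent name are placeholders for your own environment; the flags shown are the agent's unattended configuration options, run from the folder where the agent package was extracted:

```shell
# Register this machine as an agent against the VSTS account and
# install it as a Windows service so it survives reboots.
# The PAT is the credential that gives this machine access to the
# account -- this is the access you have to grant per server.
.\config.cmd --unattended ^
  --url https://<account>.visualstudio.com ^
  --auth pat --token <personal-access-token> ^
  --pool Default ^
  --agent %COMPUTERNAME% ^
  --runAsService
```

Because the PAT is stored on each server during configuration, scoping the token narrowly (and rotating it) is worth discussing with the same IT people who own the WinRM question.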
Personally, I prefer to go with the agent installed on each server, but you have to consider the characteristics of your own environment.