In the first link you've shared, there's an image that shows the estimated transfer time depending on the network bandwidth.

So let's say you have a bandwidth of 1Gbps; then the data will be available in your GCP project in roughly 30 hours, since you are transferring 12TB, which is close to the 10TB row in that chart. That makes it about a day and a half to transfer.
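
For reference, here's the arithmetic behind that estimate as a minimal Python sketch. It assumes the full nominal bandwidth is usable for the transfer, so real numbers land a bit higher, which is roughly what the chart reflects:

```python
# Back-of-the-envelope transfer-time estimate (decimal units; assumes the
# whole link is available to the transfer, with no protocol overhead or
# other traffic -- real transfers will take somewhat longer).
def transfer_hours(data_tb: float, bandwidth_gbps: float) -> float:
    data_gigabits = data_tb * 1000 * 8            # TB -> Gb
    return data_gigabits / bandwidth_gbps / 3600  # Gb / Gbps = seconds -> hours

print(f"12TB at 1Gbps: ~{transfer_hours(12, 1):.1f}h")  # ~26.7h, over a day
print(f" 1TB at 1Gbps: ~{transfer_hours(1, 1):.1f}h")   # ~2.2h (the chart rounds to ~3h)
```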
If you really need 12TB to be available each day, and increasing the bandwidth is not an option, I would recommend batching the data and creating a separate transfer job for each batch. As an example:
- Split 12TB into 12 batches of 1TB -> 12 transfer jobs of 1TB each
- Each batch takes about 3 hours to complete, so 8 batches fit in 24 hours and roughly 8 of the 12TB will be available each day.
The same approach works with smaller batches if you want a more fine-grained solution.
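
To illustrate the batching idea, here is a rough sketch assuming the google-cloud-storage-transfer Python client and an S3-to-GCS transfer split by object prefix. The project ID, bucket names, prefixes, and credentials below are placeholders you'd replace with your own, and you'd adapt the `transfer_spec` to whatever your actual source is:

```python
# Sketch: one Storage Transfer Service job per ~1TB batch, split by prefix.
# Placeholders throughout -- adjust project, buckets, prefixes, credentials.
from google.cloud import storage_transfer

client = storage_transfer.StorageTransferServiceClient()

project_id = "my-gcp-project"                             # placeholder
batch_prefixes = [f"batch-{i:02d}/" for i in range(12)]   # ~1TB of objects per prefix

for prefix in batch_prefixes:
    job = {
        "project_id": project_id,
        "description": f"Daily transfer for {prefix}",
        "status": storage_transfer.TransferJob.Status.ENABLED,
        "schedule": {
            # Placeholder start date; with no end date the job repeats daily.
            "schedule_start_date": {"year": 2023, "month": 1, "day": 1},
        },
        "transfer_spec": {
            "aws_s3_data_source": {
                "bucket_name": "my-source-bucket",                     # placeholder
                "aws_access_key": {
                    "access_key_id": "AWS_ACCESS_KEY_ID",              # placeholder
                    "secret_access_key": "AWS_SECRET_ACCESS_KEY",      # placeholder
                },
            },
            "gcs_data_sink": {"bucket_name": "my-destination-bucket"},  # placeholder
            # Each job only copies the objects under its own prefix, so the
            # 12TB is split into ~1TB batches that complete independently.
            "object_conditions": {"include_prefixes": [prefix]},
        },
    }
    created = client.create_transfer_job({"transfer_job": job})
    print(f"Created {created.name} for prefix {prefix}")
```

Since each job runs and can be monitored independently, a slow or failed batch doesn't hold back the data from the batches that already completed.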