I have an application that reads Opus packets from a file. The file contains Opus packets in the Ogg container format. My application sends one Opus packet every 20 milliseconds (the interval is configurable).
Each packet it sends per 20 ms ranges from 200 to 400 bytes in size, with an average of roughly 300 bytes.
Is sending 300 bytes every 20 ms reasonable, or is that too much data? Can somebody help me understand how to calculate how many bytes I should be sending to the remote party per 20 ms interval?
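For reference, this is the back-of-the-envelope arithmetic I have been using so far (a rough Python sketch based only on the numbers above, i.e. a ~300-byte average packet and a 20 ms send interval; the variable names are just my own):

```python
# Estimate the audio bitrate implied by my packet size and send interval.
avg_packet_bytes = 300   # measured average Opus packet size (assumption from my file)
interval_ms = 20         # configured send interval

packets_per_second = 1000 / interval_ms                # 50 packets per second
bytes_per_second = avg_packet_bytes * packets_per_second
bitrate_kbps = bytes_per_second * 8 / 1000             # bytes/s -> kbps

print(f"{packets_per_second:.0f} packets/s, "
      f"{bytes_per_second:.0f} bytes/s, "
      f"~{bitrate_kbps:.0f} kbps")
# -> 50 packets/s, 15000 bytes/s, ~120 kbps
```

In other words, I am assuming bytes per interval = bitrate × interval / 8, which for my packets works out to about 120 kbps. Is that the right way to reason about it, and is that a sensible amount of data to send every 20 ms?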