I am writing an Android application that communicates with a custom-made BLE device via GATT services. The device provides a service with 2 characteristics, one for reading and one for writing data. When data is written to the WRITE characteristic, the BLE device sends it over a wired UART interface to some other device. That other device then responds to the BLE device over the same UART interface. Upon reception of the response, the BLE device sends a notification that new data is available on the READ characteristic of its service, so my Android application can retrieve it.
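For reference, this is roughly how I register for the notifications on the READ characteristic (the method name here is only a placeholder, not my actual code):

import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCharacteristic;
import android.bluetooth.BluetoothGattDescriptor;
import java.util.UUID;

// Standard Client Characteristic Configuration descriptor UUID
private static final UUID CCCD_UUID =
        UUID.fromString("00002902-0000-1000-8000-00805f9b34fb");

private void enableReadNotifications(BluetoothGatt gatt, BluetoothGattCharacteristic readChar) {
    // Ask the local stack to deliver notifications to onCharacteristicChanged()
    gatt.setCharacteristicNotification(readChar, true);
    // Tell the remote device to actually send notifications (CCCD descriptor)
    BluetoothGattDescriptor cccd = readChar.getDescriptor(CCCD_UUID);
    cccd.setValue(BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE);
    gatt.writeDescriptor(cccd);
}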
What I would like to do is measure the time elapsed between when I send a request from my Android application to when I receive the notification that new data is available.
I have implemented a "stopwatch" as a long: I set it to System.currentTimeMillis() when I write data, and compare its value to another call to System.currentTimeMillis() upon notification reception, giving something like:
long stopwatch = System.currentTimeMillis();
// ...
// ...
long elapsed = System.currentTimeMillis() - stopwatch;
I have set 2 stopwatches to compare 2 measured times. The first stopwatch is reset when I call gatt.writeCharacteristic(myCharacteristic), and the second one is reset when BluetoothGattCallback.onCharacteristicWrite() is called. I have registered my application for the notifications from the READ characteristic, so I stop both stopwatches when BluetoothGattCallback.onCharacteristicChanged() is called.
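To make that concrete, here is a minimal sketch of where I take the timestamps (writeToDevice(), mWriteCharacteristic and the log tag are illustrative names, not my real code):

import android.bluetooth.BluetoothGatt;
import android.bluetooth.BluetoothGattCallback;
import android.bluetooth.BluetoothGattCharacteristic;
import android.util.Log;

private BluetoothGattCharacteristic mWriteCharacteristic;

private long stopwatchWrite;     // reset when I call writeCharacteristic()
private long stopwatchCallback;  // reset when onCharacteristicWrite() fires

private void writeToDevice(BluetoothGatt gatt, byte[] payload) {
    mWriteCharacteristic.setValue(payload);
    stopwatchWrite = System.currentTimeMillis();           // first stopwatch
    gatt.writeCharacteristic(mWriteCharacteristic);
}

private final BluetoothGattCallback mGattCallback = new BluetoothGattCallback() {
    @Override
    public void onCharacteristicWrite(BluetoothGatt gatt,
                                      BluetoothGattCharacteristic characteristic, int status) {
        stopwatchCallback = System.currentTimeMillis();    // second stopwatch
    }

    @Override
    public void onCharacteristicChanged(BluetoothGatt gatt,
                                        BluetoothGattCharacteristic characteristic) {
        // Notification received: stop both stopwatches and log the elapsed times
        long now = System.currentTimeMillis();
        Log.d("BleTiming", "since writeCharacteristic(): " + (now - stopwatchWrite)
                + " ms, since onCharacteristicWrite(): " + (now - stopwatchCallback) + " ms");
    }
};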
The thing is, there is on average a 100 ms difference between those two measurements, which seems like a lot to me. The average time starting from when I call gatt.writeCharacteristic(myCharacteristic) is 140 ms, while the average time starting from when BluetoothGattCallback.onCharacteristicWrite() is called is 40 ms.
So I was wondering what the proper way to time such an exchange is, and when I should reset my stopwatch in order to get the most accurate measurement.
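In case it is relevant: I am also unsure whether System.currentTimeMillis() is the right clock for this. Using a monotonic clock such as SystemClock.elapsedRealtime(), which as far as I understand is not affected by wall-clock adjustments, would look something like this (just a sketch, I have not settled on it):

import android.os.SystemClock;

// Same idea as above, but with a monotonic clock instead of the wall clock
long stopwatch = SystemClock.elapsedRealtime();            // reset when the request is sent
// ... write the characteristic, wait for the notification ...
long elapsed = SystemClock.elapsedRealtime() - stopwatch;  // read in onCharacteristicChanged()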