5
votes

I'm using .NET Framework 4 and have created a serial-port GUI application in C#. In some cases, the SerialPort object (port) doesn't receive all the data (I compared it against a Mixed Signal Oscilloscope).

For example, if the connected device sends:

0x00 0x01 0x02 0x03 0x04 0x05 0x06 0x07 0x08 0x09

I can receive:

0x00 0x01 0x02 0x03 0x04 0x08 0x09

I tried different pieces of code, all with the same issue:

  • Use the data received event to fill a buffer:

        private void dataReceived(object sender, SerialDataReceivedEventArgs e) {
            while (port.BytesToRead > 0){
                byte[] newBytes = new byte[port.BytesToRead];
                int LengthRead = port.Read(newBytes, 0, newBytes.Length);
                Array.Resize(ref newBytes, LengthRead);
                System.Buffer.BlockCopy(newBytes, 0, buffer, positionInBuffer, newBytes.Length);
                positionInBuffer += newBytes.Length;
            }
        }
    
  • Looping for an expected number of bytes, which in this case causes a TimeoutException:

        while (port.BytesToRead < expectedSize)
        {
            System.Threading.Thread.Sleep(10);
            waitingLoop++;
            if (waitingLoop > TimeOut)         // wait for a 1s timeout
                throw new TimeoutException();
        }
    
        newBytes = new byte[expectedSize];
        LengthRead = port.Read(newBytes, 0, newBytes.Length);
        if (LengthRead != newBytes.Length)
            throw new TimeoutException(); // or any exception, doesn't matter...
    

I tried changing the ReadBufferSize, the timeout settings, and so on, but nothing works. Any ideas?


3 Answers

4
votes

There are subtle mistakes in this code. The first snippet's Array.Resize() accomplishes nothing; maybe buffer needs to be resized instead? The second snippet bombs with a TimeoutException when more bytes were received than expectedSize, which is certainly possible and not a timeout problem.

But I think the real problem is what you do with the array outside of this code. In general, calling SerialPort.Read() only gets you a couple of bytes. Serial ports are slow. You need some kind of protocol to know when a 'full' response is received. A very common (non-binary) one is simply terminating the response with a line-feed. Then you'd just use ReadLine() in the DataReceived event handler. It isn't obvious what the proper approach would be in your case.
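If the device does terminate its responses with a line feed, the handler could be as simple as this sketch (assuming port is the SerialPort field from the question; HandleResponse is a hypothetical method standing in for whatever processes a complete response):

```csharp
// Sketch only: assumes the device terminates each response with NewLine ("\n" by default).
private void dataReceived(object sender, SerialDataReceivedEventArgs e)
{
    try
    {
        // ReadLine blocks until the NewLine terminator arrives (or ReadTimeout
        // elapses), so a complete response comes back in one call.
        string response = port.ReadLine();
        HandleResponse(response);   // hypothetical handler
    }
    catch (TimeoutException)
    {
        // Partial line: the remainder should arrive with the next DataReceived event.
    }
}
```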

Also implement the ErrorReceived event. It will tell you when something bad happened, like SerialError.Overrun or RXOver, errors that also make bytes disappear.
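Hooking that event is a one-liner; a sketch (port is the question's SerialPort field, and logging to the console is just a placeholder for real error handling):

```csharp
// Sketch: log hardware-level errors that silently drop bytes.
port.ErrorReceived += (sender, e) =>
{
    // SerialError.Overrun / SerialError.RXOver mean a UART or driver buffer
    // overflowed, which would explain the missing 0x05..0x07 bytes.
    Console.WriteLine("Serial error: " + e.EventType);
};
```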

2
votes

Try to configure software (XON/XOFF) or hardware (e.g. RTS/CTS) handshaking on the serial port, to match the configuration of the device that's transmitting.
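On a SerialPort this is a single property; which value to use depends entirely on what the transmitting device expects (a sketch, not a recommendation for one mode over the other):

```csharp
// Sketch: must be set before (or while) the port is open, and must match the device.
port.Handshake = Handshake.RequestToSend;   // hardware RTS/CTS flow control
// or:
// port.Handshake = Handshake.XOnXOff;      // software XON/XOFF flow control
```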

0
votes

Thanks for the answers.

Actually I have some kind of protocol in the serial data. It's something like:

<ID><SIZE><DATA....><CHECKSUM>

I usually receive the ID and the SIZE correctly, and then listen for the expected 'size' data bytes (size is less than 1000 and the serial port runs at 115200, so it should only take a short time). But when I wait for them, for example with the 2nd code (waiting loop), they never show up in my C# code even though they really are passing through the serial line (checked with the scope). Still running the 2nd code, I get the 1st TimeoutException (in the waiting loop), which means I don't receive all the data within 1 s (I also tried increasing it to 5 s). So the error is caught before ever reaching the SerialPort.Read() call.
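One way to make that framing robust against SerialPort.Read() returning only a few bytes at a time is to accumulate everything in the DataReceived handler and only extract a frame once it is complete. A minimal sketch; the one-byte ID and CHECKSUM, the two-byte little-endian SIZE (since size can approach 1000), and ProcessFrame are all assumptions to adjust to the real layout:

```csharp
// Sketch: accumulate all incoming bytes, then parse <ID><SIZE><DATA...><CHECKSUM>.
// Assumed layout: 1-byte ID, 2-byte little-endian SIZE, SIZE data bytes, 1-byte CHECKSUM.
private readonly List<byte> rxBuffer = new List<byte>();

private void dataReceived(object sender, SerialDataReceivedEventArgs e)
{
    byte[] chunk = new byte[port.BytesToRead];
    int read = port.Read(chunk, 0, chunk.Length);
    for (int i = 0; i < read; i++)
        rxBuffer.Add(chunk[i]);

    // Extract complete frames; keep partial ones for the next event.
    while (rxBuffer.Count >= 4)                   // at least ID + SIZE + CHECKSUM
    {
        int size = rxBuffer[1] | (rxBuffer[2] << 8);
        int frameLength = 3 + size + 1;           // ID + SIZE + DATA + CHECKSUM
        if (rxBuffer.Count < frameLength)
            break;                                // wait for more bytes

        byte[] frame = rxBuffer.GetRange(0, frameLength).ToArray();
        rxBuffer.RemoveRange(0, frameLength);
        ProcessFrame(frame);                      // hypothetical handler
    }
}
```

This avoids blocking waits entirely: the timeout question becomes "has a complete frame arrived yet?", which is checked cheaply on every DataReceived event.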

Anyway, I will hook the ErrorReceived event to see if I catch something...

Regarding Array.Resize(): I wrote that for the case where SerialPort.Read() returns fewer bytes than expected. buffer is pre-sized to the maximum size of my SerialPort buffer (4096). Regarding my 2nd mistake in the code, you are completely right; I should use a more appropriate exception :)