
I want to use OMNeT++ and INET for a network simulation. The focus of my simulation is the correct representation of the timing behavior. Therefore, the simulation should consider not only the transmission time, but also how long a packet is delayed within the stack. Such delays can occur because of necessary checksum calculations, e.g. for TCP, UDP, or IPv4. As far as I've seen, the checksum calculation is not considered in INET; it is only possible to represent an incorrect checksum via a bit error.

But I wanted to ask here to make sure that I haven't missed an option that allows such effects on the timing behavior to be considered.

I'm thankful for your feedback.


1 Answer


You are correct: the time consumed inside the stack, i.e. the time spent processing packets, is not considered in INET out of the box. This is a complicated topic, because such "delays" strongly depend on the actual real-life system, its current load, the software that is actually used, and so on. Even if all kinds of processing delays were modeled and included, big questions would remain (among others): How should the delays be set? To which values? How can the correctness of those values be verified?

That discussion aside, if you want to include processing delays, you could start by modeling them with self-messages. Whenever an operation with a relevant processing delay starts, a self-message is scheduled to the module itself with a delay equal to the processing time. When that self-message is handled, the actual code is executed, and simulation time has advanced accordingly. A minimal sketch of this pattern is shown below.
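The following sketch uses plain OMNeT++ (it is not actual INET code): a simple module holds an incoming packet and forwards it on its out gate only after a configurable processing delay has elapsed. The module name, the gate name, and the processingDelay parameter are made up for illustration; integrating this into the real INET protocol modules would of course look different.

    // Minimal sketch: delay each packet by a fixed "processing time" via a self-message.
    // Module name, gate name and the processingDelay parameter are hypothetical.
    #include <omnetpp.h>

    using namespace omnetpp;

    class ProcessingDelayModule : public cSimpleModule
    {
      private:
        simtime_t processingDelay;           // e.g. time spent on a checksum calculation
        cMessage *doneTimer = nullptr;       // self-message signalling "processing finished"
        cMessage *packetUnderWork = nullptr; // packet currently being "processed"

      protected:
        virtual void initialize() override {
            processingDelay = par("processingDelay");   // hypothetical NED parameter
            doneTimer = new cMessage("processingDone");
        }

        virtual void handleMessage(cMessage *msg) override {
            if (msg == doneTimer) {
                // The processing time has elapsed: now run the "actual" code and forward the packet.
                send(packetUnderWork, "out");
                packetUnderWork = nullptr;
            }
            else {
                // A new packet arrives: this sketch handles only one packet at a time,
                // which also represents the "blocking" mentioned below.
                ASSERT(packetUnderWork == nullptr);
                packetUnderWork = msg;
                scheduleAt(simTime() + processingDelay, doneTimer);
            }
        }

      public:
        virtual ~ProcessingDelayModule() {
            cancelAndDelete(doneTimer);
            delete packetUnderWork;
        }
    };

    Define_Module(ProcessingDelayModule);

In the corresponding NED file, such a module would declare the processingDelay parameter (a double with a time unit) and an out gate; the concrete delay value could then be set per module instance in omnetpp.ini. How to obtain realistic values for that parameter is exactly the open question mentioned above.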

This, of course, requires that dependent operations are blocked for the duration of the processing... introducing such changes into the INET stack might be a complex piece of work.