Here is the link to the model: https://mbe.modelica.university/behavior/discrete/decay/#chattering. The simulation results for this model in Dymola 2021 are:
[Result plots: model WithChatter with stopTime = 1.001 s and with stopTime = 1.5 s]
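For reference, here is a minimal sketch of the chattering model as I read it from the linked page (the model name matches the page, but the exact body is my assumption and may differ from the original):

```modelica
model WithChatter "Decay model that chatters once x reaches zero"
  Real x(start=1, fixed=true);
equation
  // The derivative flips sign at x = 0, so after x decays to zero the
  // solver detects a state event on every tiny crossing: chattering.
  der(x) = if x >= 0 then -1 else 1;
end WithChatter;
```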
As we can see, the noEvent operator does decrease the CPU time, but it also makes the system stiff. It would be easier to understand with more explanation of why noEvent makes the system stiff.
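For comparison, here is a sketch of how I understand the noEvent variant (again an assumption based on the linked page, not a verbatim copy):

```modelica
model WithNoEvents "Same decay model with event generation suppressed"
  Real x(start=1, fixed=true);
equation
  // noEvent suppresses the state event at x = 0, so the variable-step
  // integrator sees a discontinuous right-hand side and must shrink its
  // step size near the crossing to keep the error estimate in check.
  der(x) = noEvent(if x >= 0 then -1 else 1);
end WithNoEvents;
```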
Based on the event logging of model WithChatter, the simulation actually uses the minimum time step because der(x) is not a continuous function. But why doesn't this approach suit the model WithNoEvents? (https://mbe.modelica.university/behavior/discrete/decay/#speed-vs-accuracy)
If the noEvent operator means that the integrator is used directly, does that require all functions in the equation system to be continuous? And does this mean that the model used in the Chattering example (https://mbe.modelica.university/behavior/discrete/decay/#chattering) isn't appropriate, since the function in this model is not continuous?